Fei Fei Li (on a human-centered approach to AI) AI transcript and summary - episode of podcast Armchair Expert with Dax Shepard
Go to PodExtra AI's episode page (Fei Fei Li (on a human-centered approach to AI)) to play and view complete AI-processed content: summary, mindmap, topics, takeaways, transcript, keywords and highlights.
Go to PodExtra AI's podcast page (Armchair Expert with Dax Shepard) to view the AI-processed content of all episodes of this podcast.
Armchair Expert with Dax Shepard episodes list: view full AI transcripts and summaries of this podcast on the blog
Episode: Fei Fei Li (on a human-centered approach to AI)
Author: Armchair Umbrella
Duration: 01:54:15
Episode Shownotes
Fei Fei Li (The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI) is a computer scientist, co-director of the Stanford Institute for Human-Centered Artificial Intelligence, and is considered by many to be the godmother of AI. Fei Fei joins the Armchair Expert to discuss her initial
reluctance to tell her personal story as a part of her book on AI, starting a laundromat with her parents to support themselves, and the high school teacher that changed the course of her life. Fei Fei and Dax talk about how math is the closest thing there is to magic, why being fearlessly stupid is sometimes the best asset you can have, and the reason her north star is asking the audacious question. Fei Fei explains her perspective on the tech-lash, why there is so much humanness in everything we do in technology, and how essential it is to put dignity into how we both create and govern AI. Follow Armchair Expert on the Wondery App or wherever you get your podcasts. Watch new content on YouTube or listen to Armchair Expert early and ad-free by joining Wondery+ in the Wondery App, Apple Podcasts, or Spotify. Start your free trial by visiting wondery.com/links/armchair-expert-with-dax-shepard/ now. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Summary
In this episode of Armchair Expert, Fei Fei Li, a leading figure in artificial intelligence and co-director of Stanford's HAI Institute, discusses her personal journey as an immigrant and her experiences in STEM. She emphasizes the significance of human-centered AI, where technology should incorporate human dignity and ethical considerations. Fei Fei reflects on the influential roles her parents, teachers, and early experiences have played in shaping her perspective on AI and technology's societal impacts. The conversation invites listeners to appreciate personal narratives while providing insights on the importance of collaboration and ethical frameworks in AI development.
Full Transcript
00:00:03 Speaker_06
Dan Rather, and I'm joined by Modest Mouse. Hello. I've been talking about this book quite a bit over the last six months: The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI, by Dr. Fei-Fei Li.
00:00:18 Speaker_06
She is an expert on computer vision, machine learning, and cognitive and computational neuroscience. She is a computer science professor at Stanford University and founding director of Stanford Institute for Human-Centered Artificial Intelligence.
00:00:31 Speaker_06
So we've had a lot of AI, but I'll say that this, what makes this episode so special is Dr. Fei-Fei Li's personal story is so compelling.
00:00:42 Speaker_06
I mean, it's the fact that people can land in this country, not speaking English, deep into school and fucking pick it all up and then master all these fields.
00:00:51 Speaker_03
Become better than everyone.
00:00:53 Speaker_06
Oh my God, it's so impressive. Yeah. Oh, I loved her. Please enjoy Dr. Fei-Fei Li. We are presented by Amazon Prime. It's more than just fast, free shipping. Whatever you're into, it's on Prime.
00:01:05 Speaker_06
So keep listening, keep watching, keep on keeping on to hear more about why we love Prime during the fact check.
00:01:13 Speaker_04
This is a very very thinking
00:01:35 Speaker_00
I know.
00:01:36 Speaker_06
We're off to a terrible start.
00:01:37 Speaker_00
You want to swap? You want to sit in this chair?
00:01:40 Speaker_06
You can.
00:01:40 Speaker_03
I feel like I'm going through a therapy very soon. Yeah, well, that is the goal. We do want people to feel very relaxed. Too comfortable, really.
00:01:48 Speaker_06
Yeah, if the spirit moves you to lay down, supine. You're invited to do so.
00:01:54 Speaker_00
I might. I woke up so early.
00:01:57 Speaker_06
What time did you wake up?
00:01:58 Speaker_00
Probably 5:40 something. That's early. Yeah.
00:02:01 Speaker_06
What time do you normally rise?
00:02:03 Speaker_00
Not that much later. My alarm is 6:20.
00:02:06 Speaker_06
OK, mine's 6:40.
00:02:07 Speaker_00
Oh, OK.
00:02:08 Speaker_06
I'm aspiring. But you know what's funny? Today it was 6:20.
00:02:11 Speaker_00
You have kids?
00:02:12 Speaker_06
I have kids.
00:02:12 Speaker_00
How old are they? 9 and 11.
00:02:13 Speaker_06
Mine is 8 and 12.
00:02:16 Speaker_00
You have 8 and 12. Yeah, so we're in the same stage of life.
00:02:20 Speaker_06
Boys or girls?
00:02:20 Speaker_00
12 is boy, 8 is girl. How about you?
00:02:24 Speaker_06
Girl, girl. Yes, I'm so lucky. Although 11 about to be 12, I'm starting to get an inkling of what's coming my way. In a house with three ladies. In fact, yesterday was a very emotional day. I'm barely hanging on.
00:02:40 Speaker_03
What happened?
00:02:42 Speaker_06
I don't know what's going on with my three ladies, but all of them are in some kind of hormonal turmoil. And every variety, which is fine. We're always recording.
00:02:54 Speaker_03
We'll call it ABR, always be recording. Always be recording.
00:02:59 Speaker_06
Okay, so you have 12 and 8, and are they close? They're very close. He's a good big brother. He totally is.
00:03:06 Speaker_00
Yeah. He's sweet.
00:03:08 Speaker_06
Silvio's your husband, yeah?
00:03:09 Speaker_00
Yes.
00:03:10 Speaker_06
And does one of them have Silvio's personality and one have yours?
00:03:13 Speaker_00
Well, I can tell you one of them has Silvio's hair. Okay, which is... A lot of curls. Curls. World of curls. Yeah, world of curls.
00:03:23 Speaker_06
Well, you're here because I read your book maybe two months ago. I was having dinner with Ashton Kutcher. Do you know who that actor is?
00:03:30 Speaker_00
Yes, he just texted me last night. He's like, you're seeing my friend, Dax. Oh, good.
00:03:36 Speaker_06
Yeah, so we were at dinner and we were just chatting about people we thought were really interesting. And then he asked me if I had read your book and I hadn't.
00:03:42 Speaker_06
I went into it thinking I would get a history lesson on AI, which I did and a very thorough one. But I would not have invited you for that. Your life story is so interesting and beautiful and the way you write about it. You're an incredible writer.
00:03:58 Speaker_00
Thank you. I do want to give credit to my friend, Alex, who co-wrote the book with me.
00:04:03 Speaker_06
OK, that's very big of you. Who's Alex?
00:04:06 Speaker_00
Alex is a friend of mine. He does not claim to be a writer, but he's a very talented writer. And we've known each other for years. He loves AI and we talk about it.
00:04:17 Speaker_00
So when I was invited to write this book, I do feel like I want Alex to co-create this with me. So we became a creative team. It was a really fun process.
00:04:27 Speaker_06
How do you know Alex?
00:04:28 Speaker_00
I know Alex through TED. 2015, I gave my first TED talk.
00:04:33 Speaker_06
I watched it. Yeah.
00:04:34 Speaker_00
Thank you.
00:04:35 Speaker_06
About image, how hard it is for a computer to see.
00:04:38 Speaker_00
Yes. And that's how I got to know Alex.
00:04:40 Speaker_06
Because he worked with TED in some capacity?
00:04:42 Speaker_00
I think so, yeah. He was in some kind of partnership with TED and he was helping me to put together my slides. Since then we've become friends and we talk about AI and he's also helped me in some of the Stanford HAI, Human-Centered AI Institute stuff.
00:04:58 Speaker_06
You're kind of creative partners.
00:05:00 Speaker_00
Yes, it's very interesting because book writing is a very different creation compared to doing science. We wrote almost three years or two and a half of those years. During the day I do my science and some of the evenings I do the creative writing.
00:05:16 Speaker_00
It's such a different part of the brain.
00:05:18 Speaker_06
Yeah. Which one do you find more exhausting?
00:05:20 Speaker_00
Both in different ways. No, but they are very different. Of course, I've been a scientist for almost three decades, so I'm more familiar with being a scientist. But the creative writing journey, I loved every minute of it.
00:05:33 Speaker_00
When I say I love, it's not necessarily happy love. It's a painful, lovesome part of it. But I really loved it.
00:05:41 Speaker_06
That's what I want to start with, because I'm curious, when you sat down with Alex, I'm sure the historical part, the scientific part, that stuff is probably easy. But had you ever told your life story to anyone in that detail?
00:05:53 Speaker_00
No, and I didn't want to, and I still don't.
00:05:55 Speaker_06
Yeah. Do you think that's a personal disposition or where you come from culturally? I think of the story of your father, which we'll get to, and how little he told you about his own childhood until the time was right.
00:06:06 Speaker_06
And I glean from that, well, this isn't a culture that is just divulging all this emotional trauma and baggage.
00:06:13 Speaker_00
Well, I have to say, I think culture in the case of an individual sometimes is too broad a brushstroke. I think it's more individual. I'm a relatively shy person. And Alex and I wrote the first version. It was purely scientific.
00:06:29 Speaker_00
It was the first year of COVID. We talk on the phone almost every night. And one of my best friends is a philosopher at Stanford called John Etchemendy. He's a very revered higher education leader. He was Stanford's provost for 17 years.
00:06:44 Speaker_00
And he is co-director with me at the Stanford Human-Centered AI Institute. So I was really proud I wrote this first draft. I showed it to him. It was during COVID. After like two weeks, he called me.
00:06:56 Speaker_00
He said, Fei-Fei, you and Alex should come to my backyard. That's how we meet because we were social distancing. And then we went and he said, you have to rewrite. I was like, what? That's the last thing I want to do here.
00:07:11 Speaker_00
He said, you're missing an opportunity to tell your story, tell AI story through your lens. And I was just so rejecting that idea. I was like, who? Who wants to hear my story? I want to write about AI. I call him Edge.
00:07:25 Speaker_00
Edge said... There are many AI scientists who can write a so-called, quote-unquote, objective story of AI, but your angle would be so meaningful, your voice, to the young people out there, the young women, the immigrants, people of all kinds of backgrounds.
00:07:42 Speaker_00
And we were sitting in his backyard with a triangular shape, three chairs. And I looked at Alex. He was almost jumping off his chair. With excitement. He said, I told you. He said, I told you so many times. Of course, it only takes Edge to tell you that.
00:08:00 Speaker_06
Well, let's jump to a really big philosophical question about that. I think when reading your story, you came here, this huge language barrier, such a fish out of water, but your work, if good enough, would speak for itself.
00:08:16 Speaker_06
And it would be a meritocracy. And so it's not surprising to me that someone who got to where they wanted to go with that belief would have a hard time thinking, wait, I was trying to transcend this otherness.
00:08:27 Speaker_06
This otherness is the thing that would be most interesting and worthy of attention and affection. What a gap.
00:08:34 Speaker_00
It's very subtle you caught that because when I go into the world of science, I don't think too much about many other things. I just follow that light, follow the curiosity.
00:08:46 Speaker_00
And to this day, even when I was writing the book, it's AI that fascinates me and I wanted to write about AI. So it was very strange that someone wants to read about. Me.
00:08:57 Speaker_06
Yeah. Well, I think even the notion that you're struggling so hard. I got to set up your story more. This is the last thing I'll say out of context.
00:09:04 Speaker_06
Monica's like, not everyone's read the book, but just of course, math was appealing because math didn't have a language barrier.
00:09:11 Speaker_00
Yes, but I do want to be honest, even when I was a young kid in China, I loved math and physics. I love physics, I would say, even more than math itself. I saw math more as a tool. I saw more beauty and fascination in physics.
00:09:28 Speaker_06
Yeah, there's more philosophy.
00:09:30 Speaker_00
Yes.
00:09:31 Speaker_06
OK, so let's start in China in 1976. You're the only child of your parents.
00:09:36 Speaker_00
Yeah.
00:09:37 Speaker_06
And talk about your mother, because she's very interesting.
00:09:39 Speaker_00
She is very interesting. My mom come from a normal family. But as the book says, our family is in a difficult position because of the history. So she was a very good student.
00:09:51 Speaker_00
I think the intellectual intensity, I have a large part of it come from my mom. She was a curious student. She was very intense. But her dream was pretty shattered when she was not able to go to a normal high school, when she had a dream for college.
00:10:08 Speaker_00
And that carried her through.
00:10:11 Speaker_06
And then you arrive and you show this great aptitude. And now she has, in a sense, a second chance at this dream. But she starts recognizing pretty soon your path is going to be stilted as well if you stay there. So what's happening?
00:10:25 Speaker_06
What is she noticing as you start getting educated and show this aptitude?
00:10:29 Speaker_00
A lot of this is hindsight, because I didn't talk to my mom in this way, right? I think it was a combination that my mom has her own longing to be more free, maybe.
00:10:43 Speaker_00
And in hindsight, I don't know if she knew how to translate that in the world she was living in. And the opportunity to go to a new world was as appetizing to her as it is for her on my behalf. It's also true she saw me as a bit of a quirky kid.
00:11:03 Speaker_00
I think that blend of what she was longing for and what she was longing for on my behalf without me realizing was the motivation of many of the changes, the decision of immigration.
00:11:17 Speaker_06
Well, what would have been your trajectory had you stayed in China in 1988 when you're 12? Am I misremembering that your mom felt like they weren't giving you the attention and encouragement that she was hoping you would get?
00:11:29 Speaker_00
My mom was not looking for attention for me. My mom was looking for freedom for me. And for herself, a lot of that is projection, was looking for a world where I can just be who I am. She wasn't necessarily looking for attention.
00:11:46 Speaker_00
What do you mean by quirky? This goes to my dad. I didn't follow rules in the average way. In hindsight, maybe it was just me being immature. But also there is a part of me, why should girls not play soccer?
00:12:03 Speaker_00
Why should girls be told they are biologically less smart than boys? I was told at least more than once, watch out that girls will in general be less smart by the time you hit your teenage time.
00:12:18 Speaker_06
This is what I'm remembering from the book, that you are explicitly told you're not as smart as boys.
00:12:24 Speaker_00
I wasn't told in the context of one-on-one. Like, let me sit you down, Fei-Fei, and tell you. I was told in the way that teachers will say things to boys or the context. Society had a whole different expectation for boys.
00:12:41 Speaker_00
I was very lucky my own family protected me, but they can only protect me so much as soon as you enter a school system, as soon as you interact with society, all that came through. From that point of view, I was not following the normal path.
00:12:57 Speaker_00
I was reading different books. You know, I was so passionate about flights, UFOs, physics, special relativity. I would grab my classmates to talk about that. But that was just not normal.
00:13:11 Speaker_06
Yeah. Who was exposing you to all that stuff?
00:13:14 Speaker_00
Great question. I was trying to ask myself that question when I was writing the book and I still don't have a strong answer. I think the early curiosity, the exposure came from both my parents. My dad loved nature. My mom loved books and literature.
00:13:31 Speaker_00
But how did I fell in love with physics and UFOs and all that? I'm not totally sure. It could be my dad before he came to New Jersey. He was ordering me some magazines outside of the school reading, and that exposed me to those topics.
00:13:48 Speaker_00
And because my parents protected my curiosity, when I say protected, it really just meant they left it alone. They didn't meddle with it. I kind of followed it through myself.
00:13:59 Speaker_06
So your dad leaves when you're 12. He goes to New Jersey. He's there for three years on his own. And he is setting up a landing for you and your mother, yeah?
00:14:10 Speaker_00
Yeah.
00:14:10 Speaker_06
Do you remember that three years missing him terribly? How was that experience?
00:14:15 Speaker_00
It was tough. I mean, it was early teenagehood. There was no Internet. Phone calls are extremely expensive to the point of being prohibitively expensive. So it was mostly letters every couple of months. But then
00:14:29 Speaker_00
I was a teenager, so I had my own world to explore as well. So it wasn't like I was sitting in the room crying or anything. Yeah.
00:14:37 Speaker_06
So then you come your sophomore year. Yes. You start a public high school in New Jersey.
00:14:42 Speaker_00
Parsippany High School.
00:14:44 Speaker_06
One of the experiences you had, I came in and told Monica immediately about, and you were in a class, some kind of a study hall or something. Library. Library. And you were with a group of other ESL kids, English as a Second Language kids.
00:14:58 Speaker_06
And you saw a very, well, no, I want to say how insignificant this first interaction was, like benign brushing up against a kid's backpack or something. Right. And what happened?
00:15:09 Speaker_00
A group of ESL students were in the library and then the bell rang or something. We have to file out of the library door. And I remember it was crowded. I honestly did not see what happened to that boy, but all I knew was my ESL
00:15:24 Speaker_00
friend was on the floor. By the time I realized there was some commotion, being kicked and punched, I think there was nose bleeding and he was holding his head.
00:15:33 Speaker_06
Yeah, he said he got a concussion and a broken nose. And there's two boys kicking him. Yeah. And that's not even maybe the most traumatic part. It's that after he's gone for a couple of weeks, he comes back and he's just not the same boy.
00:15:48 Speaker_00
Yeah, I mean, nobody would be the same after going through something like that. Definitely, it's a huge impact. It was an experience that was definitely pretty intense for all ESL students. Nobody felt safe for a long while.
00:16:04 Speaker_06
Yeah, I think it changes your worldview on a dime, which is, ooh, this new place I'm in can get pretty violent and a little out of control. And if you're other, this could happen. I have to imagine.
00:16:16 Speaker_06
Yeah, it's an incredibly scary recognition of where you're at.
00:16:20 Speaker_00
Yes, but also I do want to give more colors, right? I love that your show focuses on the messiness of being human. Being messy is being multi-dimensional. But it was also an environment where there was so much support, there was so much friendliness.
00:16:38 Speaker_00
And there was also so much opportunity. So it was very confusing. I'm not trying to say that experience itself is not heavy. I don't feel lucky about that experience. And I mean, there was anger and all that.
00:16:50 Speaker_00
But in the meantime, the fuller context of that community was also quite a supportive community. So it was very confusing. It gave me the multidimensionality of the new country I landed in.
00:17:04 Speaker_06
Everything's happening. A lot of opportunity is happening as promised and then a lot of xenophobia and violence is happening.
00:17:11 Speaker_03
Right. Yeah. Did you feel like you had to after that sort of like keep your head down? Maybe it's just my own personality.
00:17:17 Speaker_00
I always felt I had to keep my head down. Right. Especially as an immigrant, sometimes I feel that way even now, especially given the AI world we live in. I feel I need to keep my head down to do work.
00:17:31 Speaker_00
Of course, that particular event probably added a layer of complication, at least for a while. But they also taught me you have to stand up for yourself. They did open different insights to me.
00:17:45 Speaker_06
I don't know if you would rank these things in your life of like serendipitous things happening, but meeting Mr. Sabella has to be minimally in the top 10 and I would hope in the top three.
00:17:56 Speaker_00
Yeah, it's possibly in the top three for sure. Meeting Mr. Sabella was so lucky for me.
00:18:02 Speaker_06
Yeah, I find this to be one of the sweetest stories I've ever read about and kind of makes me hopeful for people how generous they can be. But in a nutshell, minimally, you're thinking I'm going to do good at math.
00:18:14 Speaker_06
I don't have to go to my dictionary back and forth like I do in every other class. And you're in math and you're getting problems wrong and you yourself cannot identify any pattern in this. You don't know what's going on.
00:18:27 Speaker_06
And you go to see Mr. Sabella in his office.
00:18:30 Speaker_03
That's your teacher?
00:18:31 Speaker_00
Yes, Mr. Sabella was my math teacher. I got into calculus, and Parsippany High School doesn't have Calculus BC. We only had AP Calculus AB, so he had to teach me BC during his lunch hour. But this story you're talking about was earlier.
00:18:47 Speaker_00
It was during some pre-calculus stuff, and it turned out I was using a broken calculator.
00:18:55 Speaker_06
that they had gotten at a garage sale. Her father loves garage sales. It was his favorite thing in the world. Every weekend they'd go to... I love garage sales.
00:19:04 Speaker_00
I don't have time to go to garage sales, but I love it. Mr. Sabella was tough. He is a tough love kind of teacher. Even though I was ESL, I was this mousy girl, he didn't think I needed any extra.
00:19:22 Speaker_05
He didn't pity you.
00:19:23 Speaker_00
No, not at all. For one quarter semester, I got 89.4 or something. I still remember that. I was like, oh god, 0.6. I would get to at least an A and he would not give me that A.
00:19:37 Speaker_06
You asked about extra credit and he was like, get real. How about you get a good grade on the test?
00:19:41 Speaker_00
Yeah, he would say there are many smart kids in the class. You just have to work hard.
00:19:45 Speaker_06
But it sounds in the retelling like the breakthrough. And I think this scene would be in a movie if I were writing the movie. You're there. He discovers the tan symbol on your calculator is malfunctioning.
00:19:55 Speaker_06
He helps her figure this out because he too can't figure out the pattern of all these errors. And then somehow you guys start talking about books, and he asks if you've read a certain science fiction writer.
00:20:05 Speaker_06
You try to tell him, you haven't read that one, but you really love A Million Kilometers Under the Sea. You can't translate it, and you can't pronounce Jules Verne. But he figures out you've read Jules Verne.
00:20:16 Speaker_04
Yes.
00:20:17 Speaker_06
And he is like shook. He's like, you've read Jules Verne?
00:20:20 Speaker_04
Yes.
00:20:20 Speaker_06
And then you go on to say, yes, and you've read Hemingway, and you've read everything.
00:20:25 Speaker_00
Well, I've read everything my mom gave me, which was a lot. Which was extensive. Yeah.
00:20:30 Speaker_06
If I were him and this young girl from China comes in and she has read most of the classics, that's a real like, what am I dealing with here? I gotta imagine for him at least that was a moment where he's like, OK, I'm betting on this horse.
00:20:45 Speaker_00
I think he saw a person he can befriend with just the way I saw in him. Later on, I realized, again, this is hindsight, that he does that to so many students.
00:20:59 Speaker_00
And he used this way of opening up, in different ways, not necessarily science fiction or classic literature, to really connect with students. It was so helpful, for him and for me, beyond math and the calculator.
00:21:14 Speaker_00
When we started talking about science fiction and the English classics, he realized he was seeing me as more than an ESL kid at that point. And he's also a shy person himself.
00:21:25 Speaker_00
Later, his wife Jean and Bob, later I call him Bob, and I became such good friends over many, many years. Jean said he's such a bookworm. Even during his family parties, he'll be by himself reading. Yeah.
00:21:39 Speaker_00
So he's totally an introvert in a way that we just had chemistry. But this is one thing I was not able to fit into the book, is that for years, he would keep a diary. And his diary talks about just his teaching life.
00:21:56 Speaker_00
And I know in this diary, there are so many stories about different students he helped with. Not in the sense of bragging, it's just he's a writer, right? So years later, before he passed away, we didn't know he was going to pass away.
00:22:11 Speaker_00
I told Bob, I said, Bob, you've got to turn this into a book. Of course, we could anonymize, but this is an American teacher's story of so many students, many of them immigrant students, because they lack the support. They lack the family.
00:22:28 Speaker_00
Some of them are in high school by themselves. Family is overseas. Many of them are like me. Parents are so busy that the students don't have that emotional support. And he supported so many students. I can sit here and tell you endless story.
00:22:43 Speaker_00
I think he wanted to translate that into a book, but somehow he just couldn't bring himself to do it. Maybe he's too shy. Maybe he's too humble.
00:22:51 Speaker_06
Yeah, I think he's struggling with the same issue you're struggling with. You don't feel entitled to tell this story.
00:22:57 Speaker_00
Right. I feel so strongly he needed to write this book. I almost felt like one day I would write it for him. But of course, he passed away so suddenly because of a brain tumor.
00:23:08 Speaker_00
So when I was writing my book, I realized, let me tell the Bob Sabella story. Let me tell the story on behalf of so many American public school teachers. They don't have much of a voice.
00:23:22 Speaker_00
Nobody knows their name, but they work above and beyond every day for the students in their community. They don't care which part of the world they come from, which kind of family background they come from.
00:23:36 Speaker_00
But they invested so much in these students and they changed lives.
00:23:41 Speaker_06
Yeah, they're very unsung heroes. They're not tenured professors at elite universities.
00:23:45 Speaker_00
They're totally unsung heroes.
00:23:47 Speaker_06
And they're the ones that get the people to those destinations. Yeah, it's a really beautiful story. How instrumental was he in you finding your way to Princeton?
00:23:55 Speaker_00
He was instrumental in more than Princeton, because he was instrumental as a second dad. He helped me to be grounded. When you're an immigrant kid, ESL kid, you land in a country without speaking the language and going through so many things.
00:24:11 Speaker_00
It feels so unstable.
00:24:13 Speaker_06
I think you're underplaying your story. If you came here in seventh grade and ended up at Princeton, that's one story. You had two years to get yourself ready. To learn English. To start Princeton and you didn't speak any English.
00:24:25 Speaker_06
You're very much underplaying it, which is fine. I think so would your teacher. Yeah, you feel maybe that it's self-indulgent or something, but that's really bonkers.
00:24:35 Speaker_06
Again, AI aside, to land and go, okay, if you dropped me in Russia and told me I have two years to land at their most elite university, it's not gonna happen. It's not gonna happen for 99.999% of people.
00:24:50 Speaker_06
Let's talk for a second about you going to Princeton. This is another fun moment for me in the book because there's something so much more important about Einstein than the theory of special relativity.
00:24:59 Speaker_06
And I can't really articulate what it is, but I know you have a good dose of it. So what was it like going there and seeing the statue of Albert Einstein and imagining that you would in some way be touching that reality?
00:25:11 Speaker_00
So the first time I saw the statue of Albert Einstein before I was applying for college, it was probably early junior year. My dad continued to find things for us to do that's free. It's very important. It's free.
00:25:25 Speaker_00
Princeton's Natural History Museum was free.
00:25:32 Speaker_06
Garage sales, free.
00:25:33 Speaker_00
Exactly. Museums, free. Yes. Seeing Einstein's statue was kind of symbolic for me that I'm getting back to where really my soul wanted to be at.
00:25:46 Speaker_00
Because as a teenager, landed in a new country, trying to learn language, deal with all the messiness, you know, Chinese restaurant, walking dogs.
00:25:54 Speaker_06
You're working a ton of hours.
00:25:56 Speaker_00
Yeah, exactly. I didn't forget about physics. I was taking physics class in school, but I forget about the sense of love.
00:26:03 Speaker_06
Romanticism.
00:26:04 Speaker_00
Yes, it really is that first love. And it kind of got me back to that, rekindled something.
00:26:10 Speaker_06
Well, don't you think it left an imaginary world where this person existed and it put it in your own three-dimensional reality?
00:26:17 Speaker_00
Yes. Suddenly I feel so much closer to that person. And that person symbolizes the entire world of physics. I feel so much closer. I was literally in Princeton. That felt very different.
00:26:31 Speaker_06
And he lived there for, what, 30 some years, maybe more. I think that would be a special moment as well.
00:26:37 Speaker_00
I'm sure you watched the movie Oppenheimer.
00:26:39 Speaker_06
Yeah, yeah.
00:26:40 Speaker_00
Do you remember the opening scene was Einstein in front of that little pond?
00:26:44 Speaker_06
Yep, yep. Talking with Oppenheimer.
00:26:46 Speaker_00
Right. He was first there by himself. Yeah. I call that my pond. That pond literally exists. It was very close to my dorm by the time I got to Princeton.
00:26:56 Speaker_00
And I would go there a lot because I know that was close to the Institute for Advanced Study where Einstein worked. Yeah. So when the scene came out, Silvio was sitting next to me and I was like, Silvio, this is my pond.
00:27:09 Speaker_03
You have such a full circle.
00:27:14 Speaker_06
I'm currently stuck in a rut where I'm learning a lot about physicists, historical physicists. And I'm wondering, have you read When We Cease to Understand the World? Have you read that book?
00:27:23 Speaker_04
No.
00:27:24 Speaker_06
Or have you read The Maniac? Either of those? The Maniac's all about John von Neumann.
00:27:31 Speaker_00
I'm reading a different bio of him, but not the Maniac.
00:27:34 Speaker_06
Which one?
00:27:35 Speaker_00
Oh, it's in my phone. Yeah, same. Don't worry about it.
00:27:38 Speaker_06
This one's fun because it has the perspective of a million different people in his life, like the student he was friends with at school, one of his wives, people who worked with him, and you get this really comprehensive view.
00:27:47 Speaker_00
Another Princeton guy.
00:27:49 Speaker_06
Yeah, I'm obsessed with all these guys.
00:27:50 Speaker_06
And then When We Cease to Understand the World is about many of these physicists who were so brilliant at the time and ultimately became crazy, and how many of their breakthroughs in the math of quantum mechanics came to this guy during a nine-day, 106-degree fever, writing down the matrices and not understanding the math when he comes out of it.
00:28:10 Speaker_06
But it holds. There's a lot of weird magic in this space, I think. where people have these breakthrough thoughts and they touch some understanding and they're in a compromised state mentally is just fascinating to me.
00:28:23 Speaker_00
It's like mystical. Yeah. Physics is absolutely the discipline that pushes you to think so audaciously that you have to transcend the immediate reality.
00:28:34 Speaker_06
Yes.
00:28:35 Speaker_00
That's what I loved. That's what I loved about Einstein, about modern physics, even Newton, classic physics. You have to think so far beyond the immediate reality.
00:28:46 Speaker_06
All those stories of him getting asked a question and then answering it two and a half days later, and he hasn't left the chair and the person left. Like he went away for two and a half days and then came back with the answer.
00:28:58 Speaker_06
Or just the notion, I think one of the most intriguing parts is like, you're going to have thoughts that cannot be expressed in language, but can only exist in math. That already is like, what?
00:29:09 Speaker_00
There is actually even beyond math.
00:29:11 Speaker_06
Right. And then there's a realm beyond math. It's the closest thing I think we have to magic, where it's like completely outside of our grasp, but for a handful of people.
00:29:21 Speaker_00
I love how you call it magic. It is also the furthest thing we have from AI. It's that humanity in us, that magic, that creativity, that intuition, that almost ungraspable way of intelligence. Yes. We should keep that in mind.
00:29:39 Speaker_06
So you're at Princeton. You're also working a ton, right?
00:29:41 Speaker_00
Yes.
00:29:42 Speaker_06
When do your parents start the dry cleaner?
00:29:44 Speaker_00
So we started very quickly, right after my freshman year started, because my mom's health was going so badly. They were working in Newark, New Jersey. I don't know if you guys know that part of New Jersey.
00:29:58 Speaker_00
From Parsippany to Newark, New Jersey is a very difficult drive. My mom's health was bad and it was long working hours. I was really worried about them. Doctor was worried.
00:30:11 Speaker_00
We finally decided if we can do a local thing in Parsippany, it'll be better for the family. And it was very important for me that the business is a weekend business because that way I can do the lion's share of work.
00:30:26 Speaker_00
But there are pretty much three kinds of weekend business for immigrant families like us: open a restaurant, open a grocery store, or open a laundry. And a restaurant requires very late working hours.
00:30:42 Speaker_00
And grocery is very early. You have to go to Chinatown to get supplies. So neither of these work for my mom's health. Whereas dry cleaning was actually perfect because it's a daytime business.
00:30:53 Speaker_00
It's very long hours during the weekend, but it's at least daytime. And a lot of my mom's work, especially when it comes to alteration, she can sit in front of the sewing machine.
00:31:04 Speaker_06
Because your mother had had a recurring fever as a child and it greatly weakened her heart. So she was really struggling with heart issues.
00:31:12 Speaker_00
Yes. She carried that illness with her all her life.
00:31:15 Speaker_06
And there's no money in the dry cleaning. There's only money in the seamstress scene, whatever we call it.
00:31:20 Speaker_00
The tailoring.
00:31:21 Speaker_06
The tailoring.
00:31:22 Speaker_00
I mean, there's no money in any of this. But having that tailoring ability was nice because it helps a little bit. And my mom is incredible. She never learned this. She was a bookworm and she's kind of brainy.
00:31:37 Speaker_06
She should have done what you did.
00:31:38 Speaker_00
Right. Yeah. I don't think she would love physics. But you know what I mean? She should have probably been an academic. Yes. She would have been an academic. But then she just kind of figured out tailoring by herself. I still don't know. Like, I tried.
00:31:51 Speaker_00
I could not. The only thing I can do is sit there and unstitch things for her.
00:31:55 Speaker_06
I think a chimp can do that. Thank you.
00:32:00 Speaker_00
Yeah, exactly.
00:32:01 Speaker_06
I say that because I know how to remove stitches from garments. And I don't have more skills than a chimp.
00:32:05 Speaker_00
Yeah, so we opened a dry cleaner shop during the middle of my freshman year, and that became my entire memory of my undergraduate. Here's a fun fact. Princeton is organized by residential dorms. I lived in one of them called Forbes.
00:32:22 Speaker_00
It turned out Forbes is very famous for its Sunday brunch. I didn't know there was a Sunday brunch. Because I was home doing dry cleaning.
00:32:32 Speaker_06
She said she didn't go to a single party.
00:32:34 Speaker_00
Right. But then when I went back to Princeton as a faculty, Forbes was very kind. They made me a faculty fellow. And I discovered Sunday.
00:32:47 Speaker_06
Instead of the freshman 10, you gained the 30-10. So I felt so good.
00:32:54 Speaker_00
I finally got my Sunday brunch.
00:32:56 Speaker_06
And I think it's worth mentioning when you guys were trying to open that dry cleaners, you were trying to raise $100,000 and you were $20,000 short. And again, Mr. Sabella, Monica, money.
00:33:10 Speaker_00
Yeah, it was a total shock. To this day, actually, as a 19-year-old, as much as I appreciated Jean and Bob, I did not realize the extent. We're talking about the late 1990s. They are two public school teachers with two kids about to go to college.
00:33:30 Speaker_06
It's unimaginable.
00:33:31 Speaker_00
He said Jean and he decided to do that. I mean, at that moment, I was very, very grateful. But now after I became a grown up, this is unimaginable.
00:33:41 Speaker_06
It's impossible that someone would do that.
00:33:43 Speaker_00
Especially he later told me, I think when I was returning the money, he said, I didn't realize you'll be able to return.
00:33:51 Speaker_06
Of course, you have to give it thinking you'll never get it back. I guarantee he and his wife were like, we're giving this money away.
00:33:59 Speaker_00
I did not know that. He did use the word lend. And of course, in my mind, I was like, of course, I'm going to return, like I'll do anything to return. But Jean and Bob did not expect that. They could not have assumed that.
00:34:10 Speaker_00
So the money was being raised to help your mom? No, to help my family. To start the business. Yeah, we as a family, I still consider myself the CEO of the dry cleaner. I live in Silicon Valley. You have to claim yourself to be a CEO of something.
00:34:28 Speaker_00
Yeah, exactly. So Bob and Jean, it's incredible. I don't even think their kids knew about this till they read my book. Wow.
00:34:38 Speaker_06
Oh, my God. How proud I'd be of my dad. OK, so you graduate from Princeton and you have a degree in physics as well, some kind of computational.
00:34:47 Speaker_00
Yeah. So Princeton is a quirky school. It didn't have minors. So it has these certificates, but they're basically minors. I had a computational mathematics as well as an engineering physics minor.
00:34:59 Speaker_06
And when you're there, unless I'm misremembering, you had a very singular focus on being a physicist. But while you're there, you start realizing you're maybe open to something different.
00:35:09 Speaker_00
It's actually really interesting. I never necessarily thought I would be a physicist, but I wanted to be a scientist. That was almost a sacred calling for me.
00:35:19 Speaker_06
It was an identity you wanted.
00:35:20 Speaker_00
It was an identity. For some reason, this girl who works in dry cleaners just wanted to be a scientist. And then I loved physics. But I loved physics for its audacity and curiosity. I didn't necessarily feel I'm married to a big telescope inquiry.
00:35:39 Speaker_00
So I was just reading a lot. And what really caught my attention was that the physicists I admired so much, Einstein, Schrödinger, Roger Penrose, they actually are curious beyond just the atomic world.
00:35:53 Speaker_00
They were curious about other things, especially life, intelligence, minds. And that was immediately a turning point and an eye-opener for me. I realized I love that.
00:36:05 Speaker_06
Yeah, understanding how this brain works. It's crazy the overlap that has now been proven, but at that time, that's not an obvious, we haven't figured out neural pathways and we're not going to map that onto computers yet.
00:36:18 Speaker_06
So these seem on the surface, very different fields. One's biology and one is, you know.
00:36:24 Speaker_00
Right. But for me, it was the science of intelligence. I always believed it's the science of intelligence that will unite our understanding of both the brain and the computers.
00:36:37 Speaker_06
Right. OK, so then you choose Caltech to go to graduate school. Yes. What did you think of California? I mean, my God, what a place, right?
00:36:45 Speaker_00
I know we're 15 minutes away from Caltech here, or 20 minutes. So I was choosing among MIT, Stanford, and Caltech, and honest to God, I almost chose Caltech because of the weather. Yeah, that's fair. It was so balmy. And the vibe.
00:37:01 Speaker_00
Yes, the turtles, the garden-like campus. And of course, I walk into this building. I think it was Moore Building at Caltech. And guess whose photo was there? It was Albert Einstein. And I was like, what?
00:37:15 Speaker_00
It turned out he was visiting, and of course there was Richard Feynman, the Feynman Lectures. So I just followed these physicists, apparently. And New Jersey was cold.
00:37:26 Speaker_00
And also I really have an issue with cold because my mom's illness is exacerbated by cold. So every winter she suffers a lot. So I have this negative affinity to coldness coming from taking care of my mom.
00:37:41 Speaker_00
So coming to Southern California, I was like, oh my God, I love this place. Did your parents come with you? Later they did. In the middle of my grad school, they did. Were you worried about that, leaving?
00:37:50 Speaker_00
I had to switch from being on site to remotely running the dry cleaning. The dry cleaning had stabilized; the customers were all returning customers. So my mom would be able to handle it with one part-time worker.
00:38:07 Speaker_00
And Bob Sabella was doing bills for my mom. Oh my god. Yeah. He was just helping me. And another thing he helped me, as a young graduate student, I would be entering the world of writing scientific articles. That's pretty intense.
00:38:24 Speaker_00
He would still proofread my English for me, all my papers.
00:38:28 Speaker_06
Tell me about North Star and how you discovered yours, because this happens at Caltech.
00:38:33 Speaker_00
Yes, the prelude to the North Star was that my education in physics was always about asking the right questions. If you go to the Nobel Museum in Stockholm, there is an Einstein quote about how much of science is asking the right questions.
00:38:54 Speaker_00
Once you ask the right questions, solutions follow. You'll find a way for solutions. Some people call it hypothesis-driven thinking. I've always been just thinking this way.
00:39:03 Speaker_00
So as I was studying computational neuroscience, as well as artificial intelligence at Caltech, I was always kind of seeking, what is that audacious question I wanted to ask?
00:39:17 Speaker_00
And of course, my co-advisors Pietro Perona and Christof Koch, they were great mentors guiding me. But many things start to converge, not just my own work, but the field.
00:39:28 Speaker_00
People working on visual intelligence, from neuroscience, from AI, start to orbit around this idea that the ability to recognize all kinds of objects is so critical for human visual intelligence.
00:39:47 Speaker_00
When I say all kinds of objects, I really mean all kinds. I'm sitting here in your beautiful room. There's table, bottles, couch, pillow, a globe, books, flower, vase, plants.
00:40:02 Speaker_06
T-Rex skeleton.
00:40:03 Speaker_00
Okay, that's behind you. It's about to eat you. Shirts and skirts and boots and TV. So the ability for humans to be able to learn such a complicated world of objects.
00:40:16 Speaker_06
Millions and millions of objects.
00:40:18 Speaker_00
It's so fascinating, and I started to believe, along with my advisors, this is a critical problem for the foundation of intelligence.
00:40:29 Speaker_00
And that really started to become the North Star of my scientific pursuit, is how do we crack the problem of object recognition?
00:40:41 Speaker_06
OK, so now I think is a great point to just go through a couple of the landmark events that take us to where the technology is at that time. So I guess we could start with Turing. We could start in 1956.
00:40:54 Speaker_06
Give us a couple of things that have happened in computing up to that point.
00:40:58 Speaker_00
Right. So that's the parallel story I was writing in the book.
00:41:01 Speaker_06
Now that I have people hooked into you as an individual, now we can get a little protein and learn some stuff.
00:41:07 Speaker_00
Right. Well, the field of computing, thanks to people like Alan Turing and von Neumann, was starting during World War II time, basically. Of course, for the
00:41:16 Speaker_00
world of AI, a very important moment was 1956, when what we now call the founding fathers of AI, like Marvin Minsky, John McCarthy, Claude Shannon, they get together under, I believe, a U.S. government grant.
00:41:35 Speaker_06
DARPA funded it or something?
00:41:36 Speaker_00
DARPA funded to have a summer long workshop at Dartmouth with a group of computer scientists. At that point, the field of AI was barely kind of born, not born yet. They got together and wrote this memo or this white paper about artificial intelligence.
00:41:55 Speaker_00
In fact, John McCarthy, one of the group leaders, was responsible for coining the term artificial intelligence.
00:42:04 Speaker_06
I think we could get even more rudimentary, right? So up until that point, a computer was something that could solve a problem. It could do computations.
00:42:13 Speaker_00
It could calculate.
00:42:14 Speaker_06
And this notion of artificial intelligence, what it really meant is, could we ever ask a computer questions that it hadn't been pre-programmed to answer?
00:42:23 Speaker_06
What are the hallmark things that separated at that time artificial intelligence from just computing?
00:42:28 Speaker_06
Because I think we've just fast forwarded to everyone saying AI, and I don't think they really even take a second to think of what that step is between computing and computation and thinking.
00:42:37 Speaker_00
Right. Up to that point, you can think no matter how powerful the computer was, it was used for programmed calculation. So what was the inflection concept? I think two intertwined concepts. One is reasoning.
00:42:52 Speaker_00
Like you said, if I ask you a question, can you reason with it? Could you deduce: if a red ball is bigger than a yellow ball, and a yellow ball is bigger than a blue ball, therefore the red ball must be bigger than the blue ball?
00:43:07 Speaker_06
Right. Without having been programmed.
00:43:08 Speaker_00
Right. Without directly saying the red ball is bigger than the blue ball. So that's reasoning. So that's one aspect. A very, very intertwined aspect of that is learning. A calculator doesn't learn, whether you have a good tan button or not.
00:43:23 Speaker_00
It just does what it is.
00:43:25 Speaker_06
You had a bad one.
00:43:25 Speaker_00
Once I had a bad one. So artificial intelligence software should be able to learn.
00:43:31 Speaker_00
That means if I learn to see Tiger 1, Tiger 2, Tiger 3, at some point, when someone gives me Tiger number 5, I should be able to learn, oh, that's a tiger, even though that's not Tiger 1, 2, 3. So that's learning.
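To make that distinction concrete, here is a minimal Python sketch (not from the episode; the facts, features, and numbers below are invented for illustration). The first half deduces a new fact from stated ones, the way the early reasoning-focused systems did; the second half labels an unseen "tiger number 5" purely from labeled examples, which is the learning side.

```python
# Minimal sketch (illustrative only): rule-based reasoning vs. learning from examples.

# Reasoning: deduce a new fact from stated facts via transitivity.
facts = {("red", "yellow"), ("yellow", "blue")}  # (a, b) means "a is bigger than b"

def bigger_than(a, b, known):
    """True if 'a is bigger than b' follows from the known facts by chaining them."""
    if (a, b) in known:
        return True
    return any(x == a and bigger_than(y, b, known) for (x, y) in known)

print(bigger_than("red", "blue", facts))  # True, even though it was never stated directly

# Learning: generalize to an unseen example from labeled ones (nearest neighbor).
# Toy feature vectors (stripes, size) with labels; all values are made up.
examples = [((9, 8), "tiger"), ((8, 9), "tiger"), ((9, 9), "tiger"), ((1, 2), "housecat")]

def classify(x, labeled):
    """Label a new animal by its closest labeled example."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(labeled, key=lambda ex: dist(x, ex[0]))[1]

print(classify((8, 8), examples))  # "tiger", although tiger number 5 was never seen before
```

The point is only the contrast: the first function never changes with experience, while the second gets its behavior entirely from the examples it is given.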
00:43:45 Speaker_00
But even before the Dartmouth workshop, there were early inklings, like Alan Turing's daring question to humanity.
00:43:56 Speaker_00
Can you make a machine that can converse with people, QA with people, question and answer, so that you don't really know if it's a machine or a person. It's this curtain set up that he conjectured.
00:44:10 Speaker_00
So it was already there, but I think the founding fathers kind of formalized the field. Of course, what's interesting is for the first few decades, they went straight to reasoning. So they were less about learning. They were more about reasoning.
00:44:26 Speaker_00
They were more about using logic to deduce the red ball, yellow ball, blue ball question.
00:44:33 Speaker_00
So that was one branch of computer science and AI that went on during the years, predated my birth, but during the years of my formative years, without me knowing. I wasn't in there. But there was a parallel branch. That branch was messier.
00:44:51 Speaker_00
It took longer to prove to be right. But as of last week, we had the Nobel Prize awarded to that, which was the neural network. So that happened again in a very interesting way.
00:45:05 Speaker_00
Even in the 50s, neuroscientists were asking questions, nothing to do with AI, about how neurons work. And again, my own field, vision, was the pioneering study about cat mammalian visual system. And Hubel and Wiesel in the 1950s and 60s
00:45:25 Speaker_00
were sticking electrodes into cats' visual cortex to learn about how cat neurons work. Details aside, what they have learned and confirmed was a conjecture that our brain, our mammalian brain, is filled with neurons that are organized hierarchically.
00:45:46 Speaker_00
layered. They're not like thrown into a salad bowl. That means information travels in a hierarchical way.
00:45:53 Speaker_06
Up these columns.
00:45:54 Speaker_00
Yes. For example, light hits our retina. Our retina sends neural information back to our primary cortex. Our primary cortex processes it, sends it up to, say, another layer, and then it keeps going up.
00:46:09 Speaker_00
And as the information travels, the neurons process this information in somewhat different ways. And that hierarchical processing gets you to complex intelligent capabilities.
00:46:23 Speaker_06
That's a mouse I'm seeing, if I'm a cat.
00:46:25 Speaker_00
Or this tiger sneaking up on me.
00:46:28 Speaker_06
And I think this could be a bad analogy, but you might be misled to think, oh, well, a camera can take a picture and then the computer can show the picture.
00:46:35 Speaker_06
So the computer understands that's a photo, but really the camera has broken what it's seen into thousands of pixels. They are coded with a numerical sequence. The computer reconstructs those colors. It's a grid and virtually that's what our eyes do.
00:46:48 Speaker_06
Our eyes are just grabbing photons and they're sending back the ones and zeros. And then back here in the cortex, it's assembling it all.
00:46:55 Speaker_00
Yes. And how did evolution assemble us so that we can recognize all this beautiful world? Not only we can recognize, we can reason with it, we can learn from it.
00:47:06 Speaker_00
Many scientists have used this example: children don't have to see too many examples of a tiger to recognize a tiger. It's not like you have to show a million tigers to children. So we learn really fast.
00:47:19 Speaker_06
And as you point out in the book, it took us 540 million years of evolution to get this system.
00:47:25 Speaker_00
Exactly. So just to finish, so the neuroscientists were studying the structure of the mammalian brain and how that visual information was processed. Fast forward, that study got the Nobel Prize in the 1980s because it's such a fundamental discovery.
00:47:42 Speaker_00
But that inspired computer scientists. So there is a separate small group of computer scientists who are starting to build algorithms inspired by this hierarchical information processing architecture.
00:47:55 Speaker_06
You build one algorithm at the bottom that's maybe generic.
00:47:58 Speaker_00
No, it's a whole algorithm, but you build mathematical functions that are layered.
00:48:03 Speaker_06
OK.
00:48:04 Speaker_00
So you can have one small function that process brightness, another that process curvature. I'm being schematic. And then you process the information.
00:48:15 Speaker_00
But what was really interesting about this approach is that in the early 80s, this neural network approach found a learning rule. So suddenly it unlocked how to learn this automatically, without hand-coding. It's called backpropagation.
00:48:37 Speaker_00
And also Geoff Hinton, along with others who discovered this, was awarded the Nobel Prize last week for this. But that is the neural network algorithm.
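As a hedged illustration of that learning rule (a toy example, not anyone's original code), here is a tiny two-layer network in Python trained by backpropagation on the XOR problem: the forward pass stacks layered functions, and the backward pass propagates the error back down so every weight is adjusted automatically rather than hand-coded.

```python
# Minimal sketch (illustrative only): a two-layer neural network trained with
# backpropagation on the XOR problem. Gradients follow the chain rule by hand.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # input layer -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden layer -> output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: information flows up through the layered functions.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error back down and nudge every weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 1.0 * h.T @ d_out
    b2 -= 1.0 * d_out.sum(axis=0)
    W1 -= 1.0 * X.T @ d_h
    b1 -= 1.0 * d_h.sum(axis=0)

print(out.round(2).ravel())  # typically close to [0, 1, 1, 0] after training
```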
00:48:48 Speaker_06
Could you think of it as almost a filtration device, which is like this data comes in, we filter out these three key points that then filters up and then we come to our conclusion at the top of this hierarchy.
00:49:00 Speaker_00
You could actually.
00:49:01 Speaker_06
Because it's just like all this raw info at the bottom and then we kind of recombine it into this layer and then another process filters. Well, it's not a school bus. It's not this.
00:49:10 Speaker_00
You just keep filtering it. Of course, you combine it in mathematically very intricate way, but it is like layers of filtration a little bit.
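As a rough picture of that layered filtration idea (a made-up toy example, not anything discussed in the episode), here is a small Python sketch: a tiny image passes through one hand-written filter layer that responds to brightness jumps, and a second layer that pools the responses into a single summary.

```python
# Minimal sketch (illustrative only) of "layers of filtration": raw pixels enter at
# the bottom, and each stage keeps a slightly more abstract summary of the image.
import numpy as np

image = np.array([[0, 0, 9, 9],
                  [0, 0, 9, 9],
                  [0, 0, 9, 9],
                  [0, 0, 9, 9]], dtype=float)   # a bright vertical stripe on the right

def convolve2d(img, kernel):
    """Slide a small kernel over the image (no padding), like one filtering layer."""
    kh, kw = kernel.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

edge_filter = np.array([[-1.0, 1.0]])                    # layer 1: left-to-right brightness jumps
edges = np.maximum(convolve2d(image, edge_filter), 0.0)  # keep positive responses only
summary = edges.max()                                    # layer 2: pool into one number

print(edges)
print("vertical edge detected:", summary > 0)
```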
00:49:19 Speaker_06
OK, great. So now also, when you find your North Star, another thing that's happening at the same time is WordNet, right? This is kind of a big breakthrough for early AI.
00:49:30 Speaker_00
for linguistics. So WordNet had nothing to do with AI. It had nothing to do with vision. But what happened for my own North Star is that I was obsessed with the problem of making computers recognize millions of objects in the world.
00:49:47 Speaker_00
While I was obsessing with it, I was not satisfied because my field was using extremely contrived datasets, like datasets of four objects or 20 objects.
00:50:01 Speaker_00
I was really struggling with this discrepancy because my hypothesis was that we need to learn the much more complex world. We need to solve that deeper problem than focusing on a very handful of objects. But I couldn't really wrap my head around that.
00:50:19 Speaker_00
And then again, Southern California. I remember the Biederman number in my book; I read a paper by a psychologist, Irv Biederman, who was, up till two years ago, a professor at the University of Southern California.
00:50:32 Speaker_00
He conjectured that humans can recognize tens of thousands of object categories. So we can recognize millions of objects, but categories are a little more abstract.
00:50:45 Speaker_06
Animal, food, furniture. German shepherd. Transportation.
00:50:49 Speaker_00
Yeah. Sedan, fighter jet and all that. Yeah. So he conjectured that, but that conjecture didn't go anywhere. It was just buried in one of his papers. And I dug it out and I was very fascinated.
00:51:03 Speaker_00
I called it the Biederman number because I thought that number was meaningful, but I didn't know how to translate that into anything actionable, because as computer scientists, we were all using datasets of 20 objects. That's it.
00:51:20 Speaker_00
And then I stumbled upon WordNet. What WordNet was, was a completely independent study from the world of linguistics. It was George Miller, a linguist in Princeton. He was trying to organize taxonomy of concepts.
00:51:35 Speaker_00
And he felt an alphabetically organized dictionary was unsatisfactory, because in a dictionary, an apple and an appliance would be close to each other. But then apple should be closer to a pear. Oh, I see. So how do you organize that?
00:51:54 Speaker_00
How do you regroup concepts? So he created WordNet, which hierarchically organized concepts according to meaning and similarity, rather than alphabetical ordering.
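Here is a minimal sketch of that idea using a made-up miniature taxonomy rather than WordNet itself: concepts are grouped by meaning, so apple comes out closer to pear than to appliance.

```python
# Minimal sketch (illustrative only, not WordNet itself): organize concepts by meaning
# instead of alphabetically, so "apple" lands closer to "pear" than to "appliance".
child_to_parent = {
    "apple": "fruit", "pear": "fruit", "fruit": "food", "food": "entity",
    "appliance": "artifact", "artifact": "entity",
}

def path_to_root(concept):
    """Walk up the taxonomy from a concept to the root."""
    path = [concept]
    while path[-1] in child_to_parent:
        path.append(child_to_parent[path[-1]])
    return path

def tree_distance(a, b):
    """Steps through the taxonomy between two concepts (smaller means more similar)."""
    pa, pb = path_to_root(a), path_to_root(b)
    shared = next(node for node in pa if node in pb)  # lowest common ancestor
    return pa.index(shared) + pb.index(shared)

print(tree_distance("apple", "pear"))        # 2: they meet at "fruit"
print(tree_distance("apple", "appliance"))   # 5: they only meet at "entity"
```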
00:52:08 Speaker_06
Does WordNet not lead to the machine that can read the zip codes? No. It doesn't. What's that called? That's what I meant to bring up.
00:52:15 Speaker_00
That was ConvNet, Convolutional Neural Network.
00:52:18 Speaker_06
That's happening as you're getting your idea about the images right. We've trained a machine to read zip codes, basically, handwritten zip codes.
00:52:26 Speaker_00
So that was Yann LeCun's work at Bell Labs. That was an early application of neural networks in the 1980s and 1990s. That neural network at the time was not very powerful, but given enough training examples of digits,
00:52:44 Speaker_00
the scientists at Bell Labs were able to read digits from zero to nine, or the 26 letters. And with that, they created an application to read zip codes to sort mail.
00:52:58 Speaker_06
But its data set was, I forget, it was like a thousand or something. It wasn't that.
00:53:03 Speaker_00
It was a lot of handwritten digits.
00:53:05 Speaker_06
Yeah. And common mistakes, they would cheat it.
00:53:07 Speaker_00
That data set was probably tens of thousands of examples, but we're talking about just letters and digits.
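[Editor's note: a minimal sketch, in Python with PyTorch, of the kind of convolutional network being discussed: a small model that maps a 28x28 handwritten digit to one of ten classes. The layer sizes are illustrative assumptions, not LeCun's actual LeNet design or the Bell Labs system.]

```python
import torch
import torch.nn as nn

class TinyDigitNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # learn local stroke patterns
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),   # combine strokes into digit parts
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, 10)        # ten scores, one per digit 0-9

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyDigitNet()
fake_digit = torch.randn(1, 1, 28, 28)   # stands in for one scanned zip-code digit
print(model(fake_digit).argmax(dim=1))   # index of the most likely digit
```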
00:53:14 Speaker_06
What they had proved in concept, you're going to try to do in images. But the lift for images is so exponentially larger than getting the machine to read.
00:53:25 Speaker_00
Exactly.
00:53:26 Speaker_06
By a factor of what? I mean, when you lay out what it's going to take for you to prove this theory you have and you figure out how long it's going to take, it's going to take like a decade of you feeding them, right?
00:53:37 Speaker_06
There's some moment where the amount of images you're going to have to feed this computer to train it almost can't be done by the group of you.
00:53:45 Speaker_00
So I think what you were referring to was the process of making ImageNet.
00:53:49 Speaker_06
Yes.
00:53:50 Speaker_00
And that process began once we realized, thanks to the inspiration of WordNet, Biederman's number, and many other earlier inspirations, that what computers really need is big data.
00:54:06 Speaker_00
And that is so common today, because everybody talks about big data, you know, OpenAI talks about big data. But back in the 2000s, 2006, 2007, that was not a concept. But we decided that was the missing piece. So we needed to create a big dataset.
00:54:25 Speaker_00
How big is big? Nobody knows. My conjecture went with Biederman's number. Why don't we just map out the entire world's visual concepts? Oh my god.
00:54:36 Speaker_06
Yeah.
00:54:36 Speaker_00
Why don't we?
00:54:37 Speaker_06
And you wrangled someone in that this wasn't even really their North Star.
00:54:40 Speaker_00
OK, so Professor Kai Li at Princeton, he was very supportive of me. He was a senior faculty member. But what was really critical was he recommended his student to join my lab, Jia Deng.
00:54:55 Speaker_00
And Jia was just a deer in the headlights as a young first-year graduate student. He didn't know what was going on. He got this crazy assistant professor in me.
00:55:05 Speaker_00
and told him that we're going to create a dataset that maps out the whole world's visual concepts. He's like, sure. You know, I don't know what you're talking about, but let's get started. Yeah. So he and I went through the journey together.
00:55:18 Speaker_00
I mean, he's a phenomenal computer scientist, and we jumped through many hoops together. It was finding solutions that got us through.
00:55:26 Speaker_06
This level of plodding that you were able to take on is unique to you. And I think it's moving here in 10th grade and looking at that fucking dictionary back and forth and back and forth and back and forth.
00:55:39 Speaker_06
That kind of really unique dedication and unwavering plodding. A million other scientists could have had your idea, but I think it's that thing right there that makes you capable of creating ImageNet.
00:55:51 Speaker_00
That's an interesting observation.
00:55:53 Speaker_06
Yeah, it's not. I think we like to think of these things very simplistically, like, oh, you had a great idea. Who gives a shit? A lot of people had great ideas in graduate school.
00:56:01 Speaker_00
I do tell my kids ideas are cheap. Exactly. Hollywood.
00:56:05 Speaker_06
Someone's like, that was my idea. Oh, really? Did you write the script? Did you execute it? Did you cast it correctly? Did you motivate everyone? Your idea is 1% of the equation of a great movie.
00:56:14 Speaker_04
Yeah. Thank you for putting it that way.
00:56:15 Speaker_06
Because when I'm reading your thing and the data's coming in, it feels like, and tell me if I'm mischaracterizing it, the deeper you got into this experience, you were just learning every day it was going to be harder than you originally anticipated.
00:56:26 Speaker_06
It just kept getting worse and worse and worse and worse for years, right?
00:56:29 Speaker_00
It was pretty bad.
00:56:31 Speaker_06
When I'm reading it, I'm like, I would have quit a katrillion times. I'd be like, maybe computing will get to a point where this job will be made easy, but right now it's too hard.
00:56:39 Speaker_03
How do you even start something like that? Do you literally just look around the room and you're like, OK, here we go.
00:56:45 Speaker_00
Yeah, I'll start with this room and write everything. Well, OK, so first of all, I've had years of training as a scientist. So after you formulate a hypothesis, you do have to come up with a plan.
00:56:56 Speaker_00
My PhD thesis had a mini version of ImageNet, so I got a little bit of practice. But yeah, our idea was to create a dataset and a benchmark to encompass all the visual concepts in the world. So we had to start with WordNet.
00:57:11 Speaker_00
We had to figure out what is visual. We had to figure out what concepts we needed, where to get the source images, and how to curate them. Every step of the way, like, Dax, you were saying, we were just way too optimistic at the beginning.
00:57:27 Speaker_06
Naivete is the best asset you can have.
00:57:29 Speaker_00
Yeah, I was just fearlessly stupid.
00:57:32 Speaker_06
Yeah, it's a great gift.
00:57:34 Speaker_00
And then we started to hit all these walls, Jia and I and other students, but Jia was the main student. We had to just deal with every obstacle that came. Now, science is a funny thing, right? Sometimes serendipity makes a world of difference.
00:57:51 Speaker_00
What was really critical was Amazon Mechanical Turk, the crowdsourcing platform. Amazon, nothing to do with us, was like, oh, we have all these servers sitting in our data centers and we have nothing better to do.
00:58:08 Speaker_00
Let's make an online worker platform so people can just trade little tasks.
00:58:12 Speaker_06
A marketplace for that computer labor.
00:58:15 Speaker_00
Exactly, which I didn't know existed. I was at Princeton in New Jersey, trying to pull my hair out. And then some student who did his master's at Stanford came to Princeton and just mentioned it casually and said, do you know this thing?
00:58:31 Speaker_00
That was really, really quite a moment for me.
00:58:34 Speaker_06
Yeah, that cut this process down by 80 percent or something.
00:58:38 Speaker_00
Yeah, 10x. That was one of the technical breakthroughs that really carried this whole project.
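[Editor's note: a minimal sketch, in Python, of the kind of step crowdsourcing made affordable: show each candidate image to several workers and keep it only if a majority agree it depicts the concept. The worker answers below are made up, and this illustrates the idea only, not the actual ImageNet pipeline.]

```python
from collections import Counter

# Hypothetical worker answers for candidate images of one concept, say "German shepherd".
# Each image was shown to several crowd workers; True means "yes, it shows the concept".
worker_votes = {
    "img_001.jpg": [True, True, True],
    "img_002.jpg": [True, False, True],
    "img_003.jpg": [False, False, True],   # likely a bad search result
}

def majority_accept(votes, threshold=0.5):
    # Keep an image only if more than `threshold` of the workers agreed.
    counts = Counter(votes)
    return counts[True] / len(votes) > threshold

accepted = [img for img, votes in worker_votes.items() if majority_accept(votes)]
print(accepted)   # ['img_001.jpg', 'img_002.jpg']
```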
00:58:46 Speaker_06
They're years down the path and they're calculating how much further it's going to be. And they know they have years and years ahead until this moment.
00:58:52 Speaker_00
Not only years and years, the budget, hiring undergrads or whatever just doesn't cut it. The budget was not going to cut it. My tenure was on the line. It was a dicey few months.
00:59:05 Speaker_06
So to fast forward to the end, you create ImageNet, and you can feed in a picture of a boy petting an elephant, and the computer knows that's a boy and that's an elephant.
00:59:15 Speaker_06
Might be a different size than the other elephant I saw, but I know that's an elephant. And this is huge. This earns you the title of Godmother of AI. I know, you don't have to comment. I know you don't want that. And I want to fast forward now.
00:59:29 Speaker_06
You've accomplished this incredible thing. You teach at Princeton for a while, as you say, and then you take up a teaching position at Stanford, where you still currently are.
00:59:39 Speaker_06
You become one of these people that undergrads would then study about, which is fascinating. And you go to work for Google during a sabbatical for like a year and a half.
00:59:48 Speaker_04
Yes.
00:59:48 Speaker_06
And there's a moment where part of your job is to go meet with the new recruits that are going to start their employment at Google.
00:59:55 Speaker_06
Is it fair to say this is one of your, I don't want to call it a crisis of conscience because that would be too strong, but how would you say it? You have an opportunity to talk to those people and it sounds to me like you went rogue a little bit.
01:00:06 Speaker_00
Yes, I did go rogue a little bit.
01:00:07 Speaker_06
Yeah.
01:00:09 Speaker_00
So it's very important to call out the year. My sabbatical at Google was 2017 and 2018. That was my first sabbatical. I finally had a sabbatical. And it was a conscious decision for me to go to Google because this is right after AlphaGo.
01:00:24 Speaker_00
So AI was having its first hype wave, at least its first public moment. And Silicon Valley, of course, was ahead of the curve and knew AI was coming. So I had multiple choices, but I really wanted to go to a place for two reasons.
01:00:40 Speaker_00
One is to learn the most advanced industry AI, and Google was by far the best. But also to go to a place where I can see how AI will impact the world. And Google Cloud was a perfect place because cloud business is serving all businesses.
01:00:59 Speaker_00
So at cloud, being the chief scientist, I was able to see the technology translating to product and product impacting health care, hospitals, financial services, insurance companies, oil and gas companies, entertainment, agriculture, governments, and all that.
01:01:18 Speaker_00
But in the meantime, it was confirming my hypothesis that this technology had come of age and would impact everyone. It was the first techlash. 2017 was right after Cambridge Analytica.
01:01:33 Speaker_06
Uh-huh. Let's remind people. So Cambridge Analytica figured out how to maximize Facebook politically, and people were very upset by that.
01:01:42 Speaker_00
Yeah. Social media's algorithmic impact can drive societal changes. It was also around the time face recognition bias was being publicized for the good reasons of calling out bias.
01:01:58 Speaker_00
It was also around the time that self-driving car accidents started to happen. Before that, tech was a darling. The media didn't report on tech as a force of badness.
01:02:11 Speaker_06
But I do want to point out, because I heard you point it out, which is in the early advancements, it had all these peaks and valleys, AI.
01:02:18 Speaker_06
And there was a moment in the 70s where it looked promising and immediately people went to robots were going to take over the world. So we also do have this immediate sense. We do jump to that. They jumped to it in the 70s. It's worth pointing out.
01:02:30 Speaker_00
That's true. Hollywood is always ahead of the curve on that.
01:02:34 Speaker_06
Well, we sell fear and excitement.
01:02:37 Speaker_00
So it was a techlash that came at us very fast. Google has had its own share. I was actually also witnessing the struggle as Google was coming to terms with defense work.
01:02:50 Speaker_06
Yeah, they had taken a contract to develop some drone face recognition stuff. And the people at Google were told that they were only working on nonprofit stuff. There was a bit of a revolt. And you were there during all that.
01:03:02 Speaker_00
Yes. In hindsight, it was a mixture of many things. It wasn't a single event. I remember it was summer of 2018 and we were just coming off this turmoil. In hindsight, these things were small, but at that point,
01:03:19 Speaker_00
And I was just like, I'm about to speak to, maybe my memory is wrong, but I thought it was 700 interns from around the world who worked at Google that summer. And they're the brightest from the whole world. And they were hand-selected by Google.
01:03:36 Speaker_00
You know, Google is really a machine of talent. And what do they want to hear from me? Of course, I can talk about come work at Google. That's my job as someone who was working at Google. But I felt there was more I should share.
01:03:50 Speaker_00
Really coming from the bottom of my heart at that point, something that you will appreciate is that the math behind technology is clean, but the human impact is messy. Technology is so much more multidimensional than equations.
01:04:09 Speaker_06
Yeah, they're all benign. It's how we implement. They're neutral. Neutral. There we go.
01:04:13 Speaker_00
But once they start to interface with the world, the impact is not necessarily neutral at all. And there is so much humanness in everything we do in technology. And how do we connect that? I decided to talk about that with the interns.
01:04:29 Speaker_06
And is this the first time you articulate that you want a human-centered development of AI?
01:04:36 Speaker_00
Yeah, it was around that time, March 2018, that I published a New York Times op-ed. I laid out my vision for human-centered AI.
01:04:46 Speaker_06
So let's parallel your speech to the interns and then also getting to go in front of Congress. So what is your overarching sense of how we keep this technology going in a direction that does serve humans?
01:05:00 Speaker_00
My overarching thesis is that we must center the values of technology's development, deployment, and governance around people. Any technology, AI or any other technology, should be human-centered.
01:05:17 Speaker_00
As I always say, that there's no independent machine values. Machine values are human values. Or there's nothing artificial about artificial intelligence. So it's deeply human.
01:05:30 Speaker_06
So what are the practical things we do? What are the legislative things? What does that mean? How do we do that?
01:05:36 Speaker_00
So human-centered AI should be a framework, and that framework could be applied in fundamental research and education. That's what Stanford does.
01:05:45 Speaker_00
or in creating businesses and products, which is what Google and many other companies do, or in the legislation and governance of AI, which is what governments do. So that framework can be translated in multiple ways.
01:06:01 Speaker_00
Fundamentally, it is to put human dignity, human well-being, and the values that a society cares about into how you create AI, how you create AI products and services, and how you govern AI. So, concrete examples.
01:06:23 Speaker_00
Let me start from the very basic side, upstream. At Stanford, we created this Human-Centered AI Institute. We try to encourage cross-pollinating, interdisciplinary
01:06:36 Speaker_00
study and research and teaching about different aspects of AI, like AI for drug discovery, AI for developmental studies, or AI for economics and all that.
01:06:48 Speaker_00
But we also need to keep in mind that we need to do this with norms that reflect our values. So we actually have a review process for our grants.
01:06:59 Speaker_00
We call it the ethics and society review process, where even when researchers are proposing a research idea to receive funding from HAI, they have to go through a study or a review of what the social implications are and what the ethical framework is.
01:07:17 Speaker_06
And are you bringing in philosophers and anthropologists and psychologists? This is the interdisciplinary aspect.
01:07:23 Speaker_00
That's the fundamental research example. Now translate that to a company. When we think about an AI product, let's say I would love for AI to detect skin conditions for diseases. That's a great idea. But starting from your data: where do you curate the data?
01:07:42 Speaker_00
How do you ensure data fairness?
01:07:45 Speaker_06
So if I play out that experiment, I guess I would love to take my phone, scan my face, and know if I have a melanoma. That all sounds great. But where do the results of that get stored? Does my insurance provider have access to that? What all happens?
01:07:57 Speaker_06
It's not just me that's going to find out I have this melanoma.
01:08:00 Speaker_00
Exactly. What about the scan of the face, and also the algorithm that detects melanoma? Is it trained on just white folks? Exactly. A narrow type of skin, or all skins? What's the impact of that algorithm?
01:08:15 Speaker_06
Will it disproportionately help some group and alienate another?
01:08:18 Speaker_03
And do you have to pay? Because if you pay, you'll probably get a certain group more than you'll get another group. Right.
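[Editor's note: one concrete way to ask "will it disproportionately help some group?" is to break a model's accuracy down by group before shipping it. A minimal sketch in Python; the records, group labels, and numbers below are entirely made up for illustration.]

```python
from collections import defaultdict

records = [
    # (skin_tone_group, true_label, predicted_label), all hypothetical
    ("lighter", "melanoma", "melanoma"),
    ("lighter", "benign",   "benign"),
    ("lighter", "melanoma", "melanoma"),
    ("darker",  "melanoma", "benign"),     # a miss that an overall average would hide
    ("darker",  "benign",   "benign"),
    ("darker",  "melanoma", "benign"),
]

hits = defaultdict(int)
totals = defaultdict(int)
for group, truth, prediction in records:
    totals[group] += 1
    hits[group] += int(truth == prediction)

for group in totals:
    # Per-group accuracy, rather than one overall number.
    print(group, hits[group] / totals[group])   # lighter: 1.00, darker: 0.33
```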
01:08:22 Speaker_00
So all of those are messy human elements. And then you ask about legislation. Then we come to government. Of course, there is always a tension: how much regulation, how do you regulate? Is good policy only about regulating?
01:08:37 Speaker_00
For example, I firmly believe we actually should have good policy to rejuvenate our AI ecosystem, to make our AI ecosystem really healthy. For example, right now, the gravitational pull is that all the resources, the
01:08:53 Speaker_00
data, the computation and the talents are all concentrated in a small number of large companies. It's all for commerce.
01:09:02 Speaker_06
Yeah. Universities can't really compete.
01:09:04 Speaker_00
No, not at all.
01:09:06 Speaker_06
Meta, Google.
01:09:07 Speaker_00
My own lab at Stanford has zero NVIDIA H100 chips. There you go.
01:09:14 Speaker_06
Yeah. Like that's always been the good corrective mechanism we've had societally, is the world of academia. And it competed pretty robustly with any private sector.
01:09:23 Speaker_00
And it's not just competition, it's that the problems we work on are curiosity-driven and sometimes they are really for the public good. For example, in my own lab, we're collaborating with hospitals to prevent seniors from falling.
01:09:38 Speaker_00
That is not necessarily a commercially lucrative technology, but it's humanistically important. Universities do all kinds of work like that. Now our universities, in the age of AI, are so under-resourced that we cannot do this kind of work.
01:09:56 Speaker_00
I have been working really hard in the past five years with HAI, with Washington, D.C., with congresspeople, senators, and White House agencies, to try to encourage the resourcing of AI through a National AI Research Cloud and data.
01:10:13 Speaker_00
And then we have legislation and regulation. How do you thoughtfully put guardrails so that individual lives and dignity and well-being are protected, but the ecosystem is not harmed?
01:10:29 Speaker_06
So all of this, I'm always on board with. I love it. So grateful there's people like you pushing us in that direction. But we just had Yuval Harari on to talk about his take on it.
01:10:40 Speaker_06
And what I ultimately get so discouraged and defeated by is we're not doing this on an island. We're doing this while many other countries do this simultaneously. So how do you see us dealing with the competitive nature
01:10:55 Speaker_06
of these AI technologies emerging. And us maybe proposing we're going to do it in this way, but being realistic and saying, well, Russia might not have those guidelines and China might not have those guidelines.
01:11:05 Speaker_06
And if they have a product that people like, we can't compete now with it. So do you believe there could be cooperation? We could outlaw faking humans. OK, so the US has outlawed faking humans. No one else does.
01:11:16 Speaker_06
And those fake humans are really convincing and entertaining and all these things. And then that industry takes off somewhere else. Like, how do we do this in a world where there are no barriers to this technology?
01:11:24 Speaker_00
I was also chatting with Yuval. Did he give the C- grade to humanity? Did he say that? I didn't get the C- out of him. He said that humanity has gotten a C-. And I was like, Yuval, you know, I'm a teacher and a mom.
01:11:40 Speaker_00
When a kid comes home with C-, you don't throw the kid out. We help the kid to get better. So first of all, you're right. We're not living in a vacuum. And AI also is not living in a vacuum. AI is just one technology that's among many.
01:11:56 Speaker_00
So I absolutely do believe that there can be cooperation. how exactly we cooperate, who we cooperate with, and what are the parameters of cooperation is much, much more complicated. Look at humanity. We have gone through this so many times.
01:12:15 Speaker_00
I mean, Yuval is right. We have many messy chapters, even nuclear technology. But we have gotten to a state that there is a fine balance at this point of nuclear powers. I'm not saying that's necessarily culpable. I think it is.
01:12:32 Speaker_06
And then I think what's really important, and I only know this because I'm on my second Von Neumann book, but Von Neumann was employed in the wake of the Manhattan Project to deal with how this proliferation was going to work.
01:12:43 Speaker_06
And he was so analytical and so realistic that he said, Mutually assured annihilation is the solution. He knew that was the only outcome. It felt sociopathic to say it and to commit to it. But he's like, look, I'm modeling this out.
01:12:58 Speaker_06
This is the only way it works, is mutually assured annihilation. That's what we ended up with. And so I'm having a little von Neumann feeling about, like, no, I think it's a race to who can win until everything gets neutralized.
01:13:09 Speaker_06
I don't know another comp other than the nuclear arms race.
01:13:13 Speaker_00
The difference between AI and nuclear technology is that AI is so much more an enabler of so many good things. True. So that's very different from nuclear. Of course, nuclear can be an energy source.
01:13:27 Speaker_06
We're coming back around to it.
01:13:28 Speaker_00
Right. But AI can help discover drugs, AI can help with breakthroughs in fusion, AI can personalize education, AI can help farmers, AI can map out biodiversity. So AI is much more like electricity than it is like nuclear physics. So that's the difference.
01:13:49 Speaker_00
So from that point of view, the viewing angle on AI, at least for me, does not have to be only through the competitive lens. It should also be through the enabling lens, the enabling of our world, of so many good things that can happen.
01:14:06 Speaker_00
And that's the challenge: how do we keep the dark uses of AI at bay? How do we create that kind of balance somehow, but in the meantime encourage the development of AI so that we can do so many things that are good for people?
01:14:24 Speaker_06
So I accept that the nuclear analogy falls short in that there's so many benefits to this. Totally agree. But I will say again, to parallel nuclear arms race in this moment in time.
01:14:35 Speaker_06
I think it would be only the second time where international cooperation is at its peak, where it's most needed. We have got to recognize this as a moment where we have to be getting closer to all these places and not further away.
01:14:50 Speaker_06
Our competitors, our geopolitical adversaries that
01:14:53 Speaker_06
If ever there were a time where everyone stands to gain, other than the nuclear arms race, this is the time where it's like we got to really figure out how to cooperate a bit because everyone will experience the downside if we don't.
01:15:06 Speaker_03
Climate, too, would be the other, more recent thing. There's the Paris accord, and there are things that globally people have come together on.
01:15:13 Speaker_06
I agree with you, but I will just say that climate to me is a little dicier simply because you have all of these burgeoning industrial economies that we would be slapping rules on.
01:15:24 Speaker_06
It's easy for us to adopt a lot of things that it's not easy for Sri Lanka to adopt. It's not totally fair. There actually should be areas of the world where they are allowed to pollute more as they pull themselves up, you know.
01:15:35 Speaker_03
I mean, I think that's part of it. It's just an acknowledgement globally that we're all going to have to do something, and especially the superpowers do need to take more on than others.
01:15:44 Speaker_03
But it's just getting on the same page that I think we've done okay at, and at least there's some consensus there. So there could be some consensus here, potentially.
01:15:53 Speaker_06
Yeah, I just hope that we recognize this is a moment to be making friendships a lot better and not doubling down.
01:15:59 Speaker_00
I do think we must always recognize cooperation is one of the solutions.
01:16:05 Speaker_06
Do you get to the guardrail point in the conversation with the legislators? Do you have certain guardrails that you believe should be in place? I like that Yuval said we shouldn't ever be able to fake humans.
01:16:15 Speaker_06
And I also think there should be a disclaimer on all AI generated things that you at least know it came from that source.
01:16:21 Speaker_00
I do think we should pay a lot of attention to where the rubber meets the road, because AI can sound very fancy, but at the end of the day, it impacts people. So if you use AI in medicine, then there is a regulatory framework.
01:16:39 Speaker_00
For example, my mom, again, does imaging all the time because the doctors have to use MRI, ultrasound, you name it, to monitor her. Honestly, do I care if the MRI is fully automatic or is it operated by humans or it's a mixture?
01:16:59 Speaker_00
As a patient family, I probably care more about the outcome. If the result of the MRI can be so accurate,
01:17:08 Speaker_06
78% with an AI, or a human does it at 40? It's a no-brainer.
01:17:12 Speaker_00
Exactly. But all I care about are two things. One is that it is the best result my mom can get. Second is that it's safe, right? I don't care whether it's that kind of mixture. So that regulatory framework is there.
01:17:27 Speaker_00
I'm not saying the FDA is perfect, but it is a functioning regulatory framework. So if there is an AI product that goes into the MRI, I would like it to be subject to that regulatory framework. There we go. Yeah, yeah. Right. So that's where the rubber meets the road.
01:17:41 Speaker_00
The same with finance, environment, transportation, you name it. That's a very pragmatic approach. It's also urgent, because we have AI products that are entering our consumer markets.
01:17:57 Speaker_00
And it takes away from, in my opinion, the science fiction rhetoric about existential crisis, machine overlord. That can stay with Hollywood.
01:18:08 Speaker_06
Yeah, yeah, yeah, yeah.
01:18:10 Speaker_00
I believe the downstream application is where we should put our guardrail attention at.
01:18:16 Speaker_06
Right. I really want to encourage people, even if people have only a cursory or no interest in A.I., I really think your book is one of my favorites I've read.
01:18:27 Speaker_06
It's just your personal story, as reluctant as you are to embrace it or talk about it, is a really special story.
01:18:33 Speaker_00
Thank you.
01:18:34 Speaker_06
I mean, what ground you've covered? Do you give yourself any moments where you go, God damn, girl, we got here?
01:18:42 Speaker_00
That's very sweet. That's the problem of always chasing after a North Star. I try to, like, look forward. One thing I do reflect back on is how grateful I am. I'm not here by myself.
01:18:54 Speaker_00
I'm here because of Bob Sabella, Jean Sabella, the advisors, the students, the colleagues. For that I feel very, very lucky.
01:19:03 Speaker_06
Yeah, there's a lot of sweet people in the world still.
01:19:06 Speaker_00
Yeah, it's good.
01:19:07 Speaker_04
Yeah.
01:19:09 Speaker_06
Oh, well, Fei-Fei, this has been a delight. I hope everyone gets your book, The World's I See, Curiosity, Exploration, and Discovery at the Dawn of AI. And boy, those lucky people that get to have you as a teacher.
01:19:21 Speaker_03
Oh, man, so jealous.
01:19:22 Speaker_06
I also love the narrator of your book. Have you listened to it on tape?
01:19:25 Speaker_00
A little bit. I didn't finish the audio.
01:19:28 Speaker_06
You didn't finish.
01:19:29 Speaker_03
Right. Yeah, it's hard to listen to your own stuff. Well, it's not her. I know, but your own stuff. Yeah. You spent so much time writing it. Right.
01:19:35 Speaker_00
I'm like, do I have time? I should finish my von Neumann book.
01:19:39 Speaker_06
Yeah, yeah. And you should read Maniac.
01:19:41 Speaker_00
Yeah, you got a couple new books to read. I'm so grateful. I'm grateful you like the book.
01:19:45 Speaker_06
Oh, I love it. It's just really beautiful. I love the narrator. But I was having the moment where I was like, I was only introduced to you through this book. I was completely ignorant about you. And then there's a narrator.
01:19:54 Speaker_06
When I was doing research on you, I'm like, oh, we're going to find out what the real voice is.
01:19:57 Speaker_00
I felt a little self-conscious because of my accent.
01:20:02 Speaker_06
Oh, really?
01:20:03 Speaker_00
Because I considered whether I should narrate my own book, but I feel like my accent is probably too strong for that.
01:20:09 Speaker_06
That wouldn't be the reason I'd advise you not to do it. I think it's way, way harder than people think. And there's a lot more acting involved. I've heard some writers narrate their own book. You got to be a performer.
01:20:20 Speaker_00
Right.
01:20:20 Speaker_06
Forget your accent. There's like a performance to be done.
01:20:23 Speaker_00
Right. And that's how many hundred pages. Yeah. You also probably need to put your time there.
01:20:29 Speaker_03
You have a lot of other stuff. Don't waste your time.
01:20:32 Speaker_06
Well, I hope you come back and see us again sometime and I'll be following everything you do. And thank you for trying with all your might to make this a human centered development.
01:20:42 Speaker_00
Thank you. It's so important. And I do think creators and creators' voices are so important because we started this conversation with what's different from human intelligence, AI, and that creativity, the insight is a huge part of it.
01:20:58 Speaker_00
And now that we have the generative AI trying to create things, I think the collaboration with humans is so important.
01:21:06 Speaker_06
Yeah. All right. Well, be well and thanks for coming.
01:21:08 Speaker_00
Thank you.
01:21:12 Speaker_05
Hi there, this is Hermium Hermium. If you like that, you're gonna love the fact check with Ms. Monica.
01:21:20 Speaker_06
Hi, Moni.
01:21:21 Speaker_03
Hi.
01:21:21 Speaker_06
We had so much fun yesterday, didn't we?
01:21:24 Speaker_03
We did.
01:21:25 Speaker_06
I did.
01:21:25 Speaker_03
I did. We shot a commercial.
01:21:28 Speaker_06
So much fun. So much fun.
01:21:30 Speaker_03
Yeah, I had a really fun full circle moment.
01:21:35 Speaker_06
Okay, yes, please tell.
01:21:37 Speaker_03
Because I got out of the car, you know, I haven't acted in a while.
01:21:41 Speaker_05
Sure, we're ball rusty.
01:21:43 Speaker_03
Yeah, I got out of the car and I started recognizing some of the people on set and I realized I had worked with a lot of that crew on some previous commercials in my day.
01:21:55 Speaker_06
Sure, one of the many, many thousands of commercials you had done.
01:21:58 Speaker_03
It felt so nice and cool. I had done these commercials as just this actor auditioning and doing this thing, and now we're doing a commercial.
01:22:09 Speaker_06
Where they asked you to be in it before.
01:22:11 Speaker_03
Yeah, and we're doing it together. Yeah. Not for this podcast.
01:22:16 Speaker_06
But because of this podcast.
01:22:17 Speaker_03
Yeah. And it was something really cool about it.
01:22:20 Speaker_06
I agree.
01:22:21 Speaker_03
I liked it. And I think it's because my ring is fixed.
01:22:25 Speaker_06
I have some housekeeping.
01:22:26 Speaker_03
Okay, great.
01:22:27 Speaker_06
You know, I read the comments. And this is so embarrassing. And I read it a couple of times. I'm like, these people are crazy, I didn't say that. So people were like, you said the wrong voice of Darth Vader in the Morgan Freeman intro.
01:22:43 Speaker_06
And I thought they were saying I had said Morgan Freeman was the voice of Darth Vader, and I'm like, I know I didn't say that, because I know he's not. And James Earl Jones was the voice of Darth Vader, and I said Edward James Olmos.
01:22:55 Speaker_06
So I did say it wrong, it was another three-name actor with an Edward in it.
01:23:01 Speaker_03
I see. Yeah, that's hard.
01:23:03 Speaker_06
So I fucked that up and my apologies.
01:23:05 Speaker_06
Oh, and then the other thing was they had coitus interruptus, because we were chatting and I was going to give a Danny Ricciardo update, because I had ridden motorcycles with him, but I guess then we got sidetracked and I never did.
01:23:17 Speaker_06
So all these people who are rightly concerned about our sweetheart, Danny Ricciardo, how's he doing? We were left hanging, and I'm here to report that he's so happy.
01:23:27 Speaker_03
Yeah, he's doing great.
01:23:29 Speaker_06
He's so, so happy. We were riding motorcycles all day long and we chatted a bunch and he's just very happy.
01:23:37 Speaker_04
I'm glad.
01:23:38 Speaker_06
Yeah, he's just doing really, really good. So people should rest assured that Danny Ric is thriving.
01:23:44 Speaker_03
Yay.
01:23:44 Speaker_06
Yay.
01:23:45 Speaker_03
Love to hear that.
01:23:46 Speaker_06
Yeah.
01:23:47 Speaker_03
Do you want to tell people what Toto texted you? Was so funny.
01:23:54 Speaker_06
I had texted him to say, hey, people really loved the episode, and me in particular, I really loved it. Thanks for doing it. And he said, how were the numbers? You know, I am a lap time guy.
01:24:07 Speaker_03
Oh, so playful.
01:24:10 Speaker_06
Oh God.
01:24:11 Speaker_03
God.
01:24:12 Speaker_06
I got to say, I want to say out loud, that really put a lot of wind in my sails. That made me so happy to have that episode come out. It really right-sized my perspective. As I vocalize on here, it's been a challenging transition.
01:24:27 Speaker_06
I've been really stressed. There's been bad news and challenges. And this came out and I was like, all right, dumbass, you get to meet people that you are obsessed and in love with. Holy lottery. Yeah.
01:24:41 Speaker_06
I just was, I was, I was beaming all day Wednesday from it.
01:24:44 Speaker_03
Yeah. It was a great episode. And so, yeah, just so cool. We get to talk to anyone we want to talk to. Not anyone. I still have a list. We still got Tay.
01:24:55 Speaker_06
Liquid death. I'm just pointing to objects. Monkey with huge balls.
01:24:59 Speaker_03
No, we already had machine gun Kelly.
01:25:01 Speaker_06
We did. We did. Okay. There's another fun update, but this I'm starting, I'm getting worried that people are going to be afraid to text me. I guess these people should know. I run it through my analysis.
01:25:10 Speaker_03
Okay.
01:25:11 Speaker_06
And I would never say anything that was in a text that I didn't think was just lovely.
01:25:16 Speaker_02
You know what I'm saying?
01:25:17 Speaker_06
I get worried about it, don't you? Like, you know, someone's got a private exchange with me and then I'm reporting on it. There's an ethical dilemma here. But sometimes they're so funny and I think the person would like it anyways.
01:25:30 Speaker_06
So I sent Pitt the clip of Toto talking about me telling him that Pitt said he was a good dancer and then Toto talking about him coming to dinner. And then he's, He said, I made up the thing about him being a good dancer. And I said, Oh no.
01:25:51 Speaker_06
I said, I can't believe you made that up. In fact, I don't believe you made that up. I still believe he's a great dancer.
01:25:58 Speaker_03
Yeah, me too. But he did say, cause Toto was like, when did he see me dance?
01:26:03 Speaker_06
I know. But then he just had to, he had to go, well, I don't understand how that happened, but I'm going to take that.
01:26:09 Speaker_03
He was just, He's being funny. He was doing a yes and.
01:26:12 Speaker_06
He was doing a bit. He was like, you're not gonna believe this. He's also a phenomenal dancer, but he's just with him. I believed it.
01:26:17 Speaker_03
Yeah. I think the- Who wouldn't believe it?
01:26:20 Speaker_06
The crux of that story is I'm gullible.
01:26:21 Speaker_03
I think he is a great dancer.
01:26:23 Speaker_06
Can we talk about Chris-mo a little bit?
01:26:25 Speaker_03
Sure.
01:26:26 Speaker_06
I got the fever as much as I've ever had it, as hard as I've ever had it. Let me tell you what's happening. So, so far from our homework, we watched Christmas Vacation already, Home Alone 1 and 2.
01:26:37 Speaker_06
Side note, I've never heard Delta laugh harder in my life than the 27 minute set piece in Home Alone 2 where he's hitting the guys with bricks.
01:26:48 Speaker_03
Oh, sure.
01:26:49 Speaker_06
She was laughing uncontrollably for like 27 minutes. She said at one point, it doesn't get old, like they threw a fifth brick or whatever. And she's like, it doesn't get old.
01:26:59 Speaker_06
And I got so much joy out of watching her have that big of a laugh at something.
01:27:05 Speaker_03
So cute.
01:27:06 Speaker_06
Okay, so Home Alone 2, we did Gremlins, another Christmas favorite for us. Last night we did The Grinch Who Stole Christmas, original cartoon. And I wanna go out and say, For the record, it's the number one Christmas cartoon to ever be made.
01:27:20 Speaker_06
It is the most creative, we all watched it and at the end of it- How many more Christmas cartoons are there? There's a lot, you've got Rudolph, you've got the Chuck Brown.
01:27:30 Speaker_03
Oh yeah.
01:27:31 Speaker_06
You've got the, there's a bunch.
01:27:35 Speaker_02
Okay.
01:27:38 Speaker_06
But I'm saying, maybe even Christmas anything, it ends and I said, you know, Dr. Seuss should really be regarded as like Salvador Dali. He had such a unique...
01:27:52 Speaker_06
imaginative world he created in the words, in the set pieces, and I mean, that's one of the most creative people to ever live.
01:27:59 Speaker_03
Of course. I think he is given his due props.
01:28:04 Speaker_06
Yeah.
01:28:05 Speaker_03
You know, there's a Seuss land.
01:28:07 Speaker_06
There is?
01:28:07 Speaker_03
Yeah, at one of the parks. I think.
01:28:12 Speaker_01
Yeah, Seuss landing in Orlando, Florida. In Orlando, Florida, I should go.
01:28:16 Speaker_03
You should go. You should pay your respects.
01:28:19 Speaker_06
I like when people use the term Seussian. Did you ever hear anyone use that?
01:28:22 Speaker_03
No, but I like it.
01:28:24 Speaker_06
Yeah, it's cool, right?
01:28:25 Speaker_03
Yeah.
01:28:25 Speaker_06
Yeah, like Newtonian or like it's a paradigm.
01:28:28 Speaker_03
But it kind of sounds like Sisyphusian, which is my favorite word.
01:28:32 Speaker_06
It's not my favorite word anymore. You taught me that word. And I thank you for that. To remind people Sisyphus pushed the rock up the hill every day.
01:28:40 Speaker_06
There's a Buddhist take that like, that's what people interpret that as a story of not wasted effort, but like, you know what I'm saying?
01:28:47 Speaker_02
Yeah. Yeah.
01:28:49 Speaker_06
A fool's errand.
01:28:50 Speaker_02
Yeah.
01:28:51 Speaker_06
But there's a Buddhist way of looking at it, which is like, this person had purpose every single day, all day long, and was not suffering, probably.
01:28:59 Speaker_04
Wow. It's a story of suffering.
01:29:01 Speaker_06
It was a huge rock. Well, first of all, he's probably jacked being a believer. So strong, so strong. But that's an interesting way to reframe it, that like, no, this person, every day of their life had purpose.
01:29:12 Speaker_03
Yeah.
01:29:13 Speaker_06
Probably very happy.
01:29:14 Speaker_03
That's a lovely way to look at it.
01:29:15 Speaker_06
Yeah.
01:29:16 Speaker_03
It's actually Sisyphean.
01:29:18 Speaker_06
Sisyphean?
01:29:19 Speaker_03
Yeah.
01:29:19 Speaker_06
I like Sisyphean.
01:29:20 Speaker_03
Me too. And I maintain it. Yeah. Okay. So you're in the Christmas spirit.
01:29:26 Speaker_06
Yes. And I wake the girls up every morning. I wake up about 20 minutes before the girls to meditate. And so now they wake up to me playing from my phone. over the Sonos, Christmas music.
01:29:41 Speaker_06
And I want to make a great recommendation to people who are using Spotify and you can make a station, go to the Charlie Brown Christmas album, and then go specifically to the song, Christmastime is here, make a station out of Christmastime is here, and it's the best Christmas mix I've ever had.
01:30:00 Speaker_03
Ooh, that sounds lovely.
01:30:02 Speaker_06
And it's on all the time. And so, you know, the tree is over-decorated.
01:30:08 Speaker_02
Yeah.
01:30:09 Speaker_06
You know, we get one tree, and Kristen gets a tree in the kitchen. And hers is artistic, and this year it's wicked.
01:30:15 Speaker_02
Oh, cute.
01:30:16 Speaker_06
Yeah, and our tree is a throw-up of color. And I have those old-fashioned bulbs that the water bubbles up in them. They're almost impossible to get to sit vertical on your tree.
01:30:27 Speaker_06
I've spent most of my free time positioning all of them, and then I pull the cord and they all fall down. It's a Sisyphean task.
01:30:35 Speaker_03
Wow, ding, ding, ding.
01:30:36 Speaker_06
I didn't expect it to come around that quick. I had all this anxiety about presents, but I knocked a bunch of presents out the other day.
01:30:40 Speaker_03
Nice, you used a little bit of my,
01:30:43 Speaker_06
I used your gift guide almost exclusively.
01:30:48 Speaker_03
There were good gifts on there.
01:30:49 Speaker_06
Complain about your gift guide, though. You make things sell out. Your gift guide is moving markets.
01:30:58 Speaker_03
Yeah, well, I pick great items.
01:31:00 Speaker_06
Yeah, you do.
01:31:01 Speaker_03
I have to say.
01:31:01 Speaker_06
You have exquisite taste.
01:31:03 Speaker_03
Thank you.
01:31:04 Speaker_06
Some of your recommendations were so good that I found myself dancing around on the websites.
01:31:07 Speaker_03
Yes, that's the goal.
01:31:10 Speaker_06
Yeah, yep, and yeah.
01:31:12 Speaker_03
There's fun stuff abound.
01:31:14 Speaker_06
There's fun stuff, so.
01:31:15 Speaker_03
So, and let's just, so your tree has colored lights, right?
01:31:20 Speaker_06
So many. Yeah, and Kristen's- I have four strands, really long strands, and four of those bubbly light strands. Sure. and the tree's touching the ceiling. It's a Clark Griswold, it's too big. And I'd cut a foot off it.
01:31:35 Speaker_03
But I just wanna talk about lights.
01:31:36 Speaker_06
Oh, okay. My apologies, Miss Monica. Miss Monica, I'm sorry, I get so carried away sometimes when the spirit moves me. I don't leave my apartment much, so I really enjoy decorating it, get all those colors, makes me optimistic.
01:31:50 Speaker_03
I wonder how Hermium, does he have a delivery service? How does he get his tree?
01:31:57 Speaker_06
I have a cousin who's not working at the moment and he loves going to department stores and plazas and shopping malls and strip malls. Wow. And I'll call him on the landline. That's what I have, Ms. Monica.
01:32:11 Speaker_06
I pick up the phone and I call his, his name is Bert.
01:32:14 Speaker_04
Oh.
01:32:14 Speaker_06
Yeah, he's my, did I say my brother-in-law or my cousin?
01:32:17 Speaker_03
You said cousin.
01:32:17 Speaker_06
Yeah, he's my cousin. I just remembered. Weirdly enough, he's also my brother-in-law, but it's my stepsister.
01:32:23 Speaker_03
Oh, okay, so it's all on the up and up.
01:32:24 Speaker_06
Everything's on the above board, as they say. And I call up Bert, and I say, here's what I need, Bert. Six water weenies, 10 spatulas. And Bert, it takes him a while, sometimes four or five days.
01:32:36 Speaker_06
And then he comes over, and he does charge me a little more, but that's okay.
01:32:40 Speaker_04
Sure. Well, he's doing a lot of work.
01:32:41 Speaker_06
And then I have to call him back up and ask him to deliver the presents. Wow. That's okay, though. He charges me for that too.
01:32:47 Speaker_03
Okay. can all take an advantage of, but that's, you seem fine with it. That's okay. Okay, great.
01:32:53 Speaker_05
Mom.
01:32:53 Speaker_03
Now remember, I'm not your mom.
01:32:57 Speaker_05
Okay, Miss Monica. Mom Monica.
01:33:02 Speaker_06
Mrs. Mom. Color lights.
01:33:06 Speaker_03
Yes, the lights, because Chris and I assume on her nice tree has white lights. Yep.
01:33:12 Speaker_06
Yeah.
01:33:12 Speaker_03
Yeah. And this is, you know, this is a big thing. I don't know if it's Rob. Yeah. What color lights do you have?
01:33:20 Speaker_06
Well, first of all, do you have the light you want?
01:33:23 Speaker_01
Yes. OK, I do. I like the like yellowy. White light, warm gold. Yeah. White lights. That's why there's shades of white, too. Like we like about that.
01:33:37 Speaker_03
He's trying to he's trying to walk in the middle and be nice. But really, he has white lights, white lights, and he likes them.
01:33:44 Speaker_01
I have white lights, but they're kind of yellowy.
01:33:46 Speaker_03
Yeah, I know what you mean. There's like a warm and a cool.
01:33:49 Speaker_06
Now, listen, sometimes you complain about there being two boys, one girl in this situation, but you have to admit Rob is a perfect middle ground. Like if Aaron was here, it would, it would suck.
01:34:00 Speaker_03
Well, yeah.
01:34:00 Speaker_06
He disproves my gender stereotypes quite a bit.
01:34:04 Speaker_03
Yeah. Yes. Because Aaron grew up exactly like you. So it's not fit. You just assume it's men because you and Aaron believe it.
01:34:12 Speaker_06
That's right, Monica. That's right, Miss Monica.
01:34:15 Speaker_03
So yes, Rob did not grow up with you and like you.
01:34:19 Speaker_05
Oh, he's from the big windy.
01:34:21 Speaker_03
So I don't think it's gender, but I do think some people love. the nostalgic colored lights, and then other people who care about aesthetics love the white light.
01:34:33 Speaker_06
I could really get on my high horse about it. I used to have a really strong stance on it, and it's all my class warfare stuff.
01:34:41 Speaker_03
Yeah, which is... It's so tired. Is that what you're gonna say? No, I wasn't gonna say that. Your life does not match that mentality anymore.
01:34:50 Speaker_06
Doesn't at all, but did you see Chris Rock's latest standup? He said, I am rich, but I identify as poor.
01:34:56 Speaker_03
Yeah, that's fine. Okay, for him. Yeah. Okay, but for me? Yeah, you aren't, you're of the highest class in this country.
01:35:06 Speaker_06
Yeah, well, there's people with a lot more money than me, but I do have- You're of the highest class.
01:35:12 Speaker_03
Okay, okay. And you also hobnob with the highest class.
01:35:17 Speaker_06
Yeah, but you know what? I act like myself and I have color. Here's what I'll say. The white, all white Christmas tree,
01:35:25 Speaker_06
is like occasionally I'd see that at people's houses who had an extra living room that no one went into and you weren't allowed to go in there, you know, take off all your shoes, you know, you get in a fucking Intel outfit to go in the room.
01:35:41 Speaker_06
And all of it seems stuffy and not playful and fun and colorful. It's felt very presentational.
01:35:50 Speaker_03
And where's your tree?
01:35:50 Speaker_06
But I used to be judgmental of that. I still don't like it, but I'm not as judgmental.
01:35:57 Speaker_03
Because your second tree is in your second living room. Okay?
01:36:04 Speaker_06
Okay.
01:36:04 Speaker_03
You know, I gotta keep you, I gotta just remind you.
01:36:08 Speaker_06
I know I'm spoiled. I know I'm spoiled.
01:36:11 Speaker_03
Yeah, okay.
01:36:12 Speaker_06
I'm really spoiled.
01:36:13 Speaker_03
It's just, to me, the class warfare thing, I would hope you now see.
01:36:19 Speaker_06
That it wouldn't be fair for a stranger to hate me just because I have money?
01:36:21 Speaker_03
Yeah. Yeah.
01:36:23 Speaker_06
Yeah, I would feel that way on the other side of it, but I wouldn't expect anyone to feel that way, not be on the other side of it, because I get it.
01:36:31 Speaker_03
Okay, so I have white lights, obviously.
01:36:33 Speaker_06
Yeah, I know that. I would know that.
01:36:35 Speaker_03
Yeah, everyone would know that.
01:36:36 Speaker_06
You don't need to tell me that, I know that. And I'm not judgmental of you. I'm so glad you're having the Christmas you've always wanted.
01:36:42 Speaker_03
Thank you. Yeah. Yeah. Jess and I had pig day and we went to Home Depot. We just missed you, I guess, because it seems like the timing.
01:36:50 Speaker_06
Yeah, because we were there at like 11 a.m. on a Saturday and you were there at 11 a.m. But I got to say, this is my record of all time. I was so fast and there was no fighting. This is like first year in a few. That day is very triggering for our family.
01:37:03 Speaker_03
I think it's hard for families that have to decide.
01:37:06 Speaker_06
You've got to compromise.
01:37:07 Speaker_03
And everyone has their things they care about. And luckily Jess and I have the same thing. We don't like bald... bald puss.
01:37:16 Speaker_06
You call the tree a bald puss?
01:37:18 Speaker_03
Bald pussy. Bald pussy. If there's bald spots. Okay, great. We don't like that. Okay.
01:37:23 Speaker_06
And... You like more of a Brazilian tree? No, Brazilian is... That's shaped and full. Isn't a Brazilian like you have a landing strip?
01:37:32 Speaker_03
I thought Brazilian is- Clean? Clean. Rob?
01:37:36 Speaker_01
Do you want me to Google? Yes, I do.
01:37:38 Speaker_03
Just definition of Brazilian wax.
01:37:41 Speaker_01
And pictures? And pictures.
01:37:44 Speaker_03
You can do that on your own time.
01:37:48 Speaker_01
It removes most or all of the hair from the pubic region, including the front, sides, back, and often the area around the anus. Yeah.
01:37:56 Speaker_06
Okay. I'm glad I- What's the landing? The landing strip's just a landing strip?
01:37:59 Speaker_03
Yeah, there's like, you can just get different kinds, but Brazilian generally means all hair.
01:38:06 Speaker_06
Do you think any dudes get a landing strip? I was just thinking I wanna go do that, just as a bit. I've done that as a joke.
01:38:12 Speaker_01
You have? For Natalie.
01:38:14 Speaker_03
Oh my God. That's so funny. Did she laugh?
01:38:20 Speaker_01
Did I make her horny? No, it was not meant to be.
01:38:24 Speaker_03
That's really funny.
01:38:28 Speaker_05
Stay tuned for more Armchair Expert, if you dare.
01:38:33 Speaker_06
Okay, let's take a break from the fact check to thank our presenting sponsor, Amazon Prime. Prime has you covered with movies, music, and everything you could possibly need to make the holidays perfect. Whatever you're into, it's on Prime.
01:38:47 Speaker_03
This is very exciting. It's holiday party season.
01:38:50 Speaker_06
Yes it is, that time of year.
01:38:52 Speaker_03
Work parties, family parties, parties with friends.
01:38:57 Speaker_06
Party parties.
01:38:57 Speaker_03
Parties with your animals.
01:39:00 Speaker_06
If you're as popular as Monica, you're hitting the party circuit.
01:39:04 Speaker_03
It's a great reason to shop for new clothes or accessories and really like spice up your wardrobe, make it fancy.
01:39:11 Speaker_06
Prime can help with that, especially if you decide last minute you want to buy something new. You're set with Prime's fast, free shipping. And hey, what you're buying for holiday parties depends on whether you're a guest or a host.
01:39:25 Speaker_06
If you're hosting, then you're going deep on Prime to find everything you need to make your home feel fun and festive and perfectly like you.
01:39:33 Speaker_03
Oh, tell me about it. I really like to make my house feel very me during the holidays. You could be decorating the outside of the house, getting some lights, something for the windows, grab some new holiday towels, some festive hand soap.
01:39:47 Speaker_03
Oh, I love a good festive hand soap. Candles. You really, you can do it all.
01:39:52 Speaker_06
Absolutely. And you can get all those things on Prime.
01:39:55 Speaker_03
Oh, and one other thing, Amazon Music is here to help with the playlist. Curating the party playlist, it's an art.
01:40:02 Speaker_06
Amazon Music will get the vibe right. Listen, what we're saying is anything you need for a holiday party is on Prime.
01:40:10 Speaker_06
Nice sweaters, goofy sweaters for the ugly sweater party, holiday decor, gifts for the host, or fun small stuff for a gift exchange at work. The sky is the limit when Prime's fast, free shipping is at your fingertips.
01:40:24 Speaker_06
From streaming to shopping, it's on Prime. Visit amazon.com slash prime to get more out of whatever you're into.
01:40:37 Speaker_03
Now, we were also so quick, so quick. In fact, it was almost eerie. We walked in and we were doing just like a quick look and Jess just beelined. He knew his daughter.
01:40:50 Speaker_06
Like Christmas vacation, there was a beam of light shining down on it.
01:40:54 Speaker_03
Yes, and he knew his daughter and it was the one.
01:40:58 Speaker_06
Are you his daughter? Because I view you more as his mom.
01:41:00 Speaker_03
No. The tree was our daughter. We co-parent, but she lives at my house. So he's a little bit of a Debbie Deb, but whatever. And she's really pretty. She's so nice. She's... We said, because last year our tree was a boy and he was a model.
01:41:22 Speaker_03
Like he was striking and very similar, like perfect.
01:41:29 Speaker_02
Angular.
01:41:29 Speaker_03
Exactly. Very angular. Not around features. This girl is, she's not a model, but she's a star.
01:41:40 Speaker_06
Oh, yeah. That's my that's the kind I like.
01:41:43 Speaker_03
Exactly. And I've been trying some different hats on her toppers.
01:41:48 Speaker_06
Oh, OK.
01:41:50 Speaker_03
I haven't decided yet.
01:41:52 Speaker_06
Is there no part of you that feels sad? Like what I really, the softest spot in my heart I have is for Charlie Brown's Christmas when they get that really bad tree. Charlie Brown did a bad job and they hated it.
01:42:06 Speaker_06
They're yelling at Charlie because of the tree, but then they decided to love it. And it's a good little tree.
01:42:10 Speaker_03
It's a sweet story.
01:42:11 Speaker_06
And I always am drawn to the shitty tree there. Cause I think no one wants this tree and we'd have a great Christmas with this tree. I have a real, I get emotional about it.
01:42:20 Speaker_03
Wow.
01:42:21 Speaker_06
Yeah, I want to like rescue this shitty tree.
01:42:23 Speaker_03
Oh my God, the way you feel about the trees is like how Kristen feels about the dogs.
01:42:27 Speaker_06
That's right, that's right. And all because of Charlie Brown, I think.
01:42:30 Speaker_03
Wow.
01:42:30 Speaker_06
So yes, so the girls have one agenda, which is to never like the same tree, I think is their agenda. And then mom has an agenda. Mom's very aesthetic. You know, it's very important to her.
01:42:41 Speaker_03
So for her, trees are not dogs. She wants a pretty one.
01:42:45 Speaker_06
Yeah. She's got something in her mind she's looking for. Yeah. My singular goal is, when you pull into Home Depot, you can either park and then go buy the tree and then enter the line to pull up where they'll put the tree in.
01:42:59 Speaker_06
And I wanna just pull into the line and know that they can get that tree fast enough that by the time it inches up to the front, we'll have gotten a tree. So my only objective is to get the trees in time by the time I'm pulling the truck.
01:43:12 Speaker_06
No, in previous years, they go in and I wait in the car. This year, I went to,
01:43:20 Speaker_03
So what'd you do with the truck?
01:43:21 Speaker_06
I just parked it and I'm like, I'm going to run in. I'm going to see if I find a tree. It's not going to move up that fast. They got to load a tree. I didn't hold anyone up. Okay. And then we got the trees by the time I pulled up. So that was my goal.
01:43:32 Speaker_06
Mine's way less aesthetic and way more time management.
01:43:35 Speaker_03
Yeah. I don't feel bad for the, for the... You don't? I don't.
01:43:38 Speaker_06
How could you not? A tree that no one wants, Monica? It's already...
01:43:48 Speaker_05
I always get a tree that has a little bit of personality. And by personality, I mean missing parts.
01:43:55 Speaker_03
Bald puss?
01:43:57 Speaker_05
Miss Monica, I don't know what you just said, but please don't say it again. You want to talk about size and I'm here to buy a big old Christmas tree. Tell me about your tree. Does it have a Brazilian? Ew! What?
01:44:15 Speaker_03
Stop.
01:44:16 Speaker_05
What do you put under your tree?
01:44:17 Speaker_03
Make him go away. Make him go away. I can only tell, I can't, I kind of forgot.
01:44:23 Speaker_06
Sounds a little like you guys are very Frito-esque when you're shopping for this tree. You're just not doing the voice. Talking about bald... I don't even want to say it either. My God, and you're saying it's your daughter? This is twisted.
01:44:36 Speaker_06
Certainly don't want Jess talking about his daughter in that fashion.
01:44:39 Speaker_03
No!
01:44:40 Speaker_06
Last update, it was time for a crop, a harvest. Everyone already knows that. I feel like people are going to have a bunch of judgment about this. I guess, fuck them. Delta's like, I wanna shave my legs. Will you shave my legs?
01:44:53 Speaker_04
Yeah.
01:44:53 Speaker_06
I'm like, okay. People are gonna be like, you shouldn't shave your kid's legs. I can already feel that coming, but I don't give a fuck. She wants me to shave her legs.
01:45:01 Speaker_03
Yeah, why not?
01:45:01 Speaker_06
If she feels left out, I did it. Okay. Monica, her leg hair is also cashmere. It is. So we now have two fields in rotation. And so I want you to see what an enormous- Are you combining? Yes. It's now father-daughter cashmere.
01:45:22 Speaker_06
And I want you to, you remember how much we had just two days ago?
01:45:25 Speaker_03
Yeah, practically none.
01:45:27 Speaker_06
Look at the amount of cashmere we now have.
01:45:31 Speaker_03
Wow, it's like quadrupled in size.
01:45:34 Speaker_06
I was making a joke that we might get a mitten or a scarf in 10 years, but I actually think that's a real possibility now. Look at the amount in there now.
01:45:42 Speaker_03
Do you wanna feel her? I do, I wanna touch it, but also last time we touched it.
01:45:47 Speaker_06
Some of it disappeared. That's okay, now we got two growers.
01:45:50 Speaker_03
Wow, there is so much.
01:45:53 Speaker_06
Yeah. Now you have two growers. We got basically a mink farm.
01:45:58 Speaker_03
Are they separated?
01:46:00 Speaker_06
There's no real, yeah, it's just, I think it's separated on.
01:46:06 Speaker_03
Wow, it's so soft.
01:46:09 Speaker_06
Yeah, I think hers might even be softer than my back hair.
01:46:12 Speaker_03
Oh my God.
01:46:13 Speaker_06
But that's got a time limit. Her leg hair will turn into shitty hair like our leg hair.
01:46:18 Speaker_03
Exactly.
01:46:18 Speaker_06
But currently she is growing cashmere.
01:46:21 Speaker_03
Oh my God.
01:46:22 Speaker_06
You think I need to get a work permit for her?
01:46:24 Speaker_03
Because she is now kind of actively- You're probably illegal. It's like, yeah, it's illegal.
01:46:31 Speaker_06
I don't want to out her because she did such a great job. Lincoln shaved my back, did a great job.
01:46:35 Speaker_04
Yeah.
01:46:35 Speaker_06
But she was, she thought she had some cashmere on the razor and she emptied a little bit into our pouch. And then I discovered, no, some of that was beard hair. So I had to actually go in and pull it. Now I'm getting embarrassed.
01:46:49 Speaker_06
It sounds like a bit, but then you realize, no, it's not a bit, he's really- That happened.
01:46:54 Speaker_03
Did you use tweezers?
01:46:55 Speaker_06
No, I just, I could feel it and I'd pull that out. I probably lost a lot of really good product.
01:46:59 Speaker_03
I know, that's okay. Yeah, we live and learn. This is an R&D situation.
01:47:03 Speaker_06
It's only the second harvest, so. Wow. Still learning a lot. That's exciting.
01:47:06 Speaker_03
All right, oh, one more thing. One cool thing that happened that I wanna put out there in the world, because I think it's good for me to manifest this. When Callie and I were shopping, we went into one store and I bought some cute little boxer shorts
01:47:24 Speaker_05
Okay.
01:47:25 Speaker_03
As we were leaving, Callie was in front of me, and someone had held the door open for her to come out. Some woman walked in, and then Callie walked out, and then this person continued to hold the door for me, and I was like, oh, thank you.
01:47:44 Speaker_03
Then I kept walking. I don't know. He's a mystery man.
01:47:49 Speaker_06
Oh, oh, oh, oh, okay. The look on your face was that it was a famous person.
01:47:55 Speaker_03
It was the most- Gorgeous? Gorgeous person I've ever seen. Really? And not- Male or female?
01:48:03 Speaker_06
Male. Give me age, height, describe.
01:48:05 Speaker_03
Build it for me. But now he's sort of a haze. Oh. Like, I don't, I don't remember. I don't like that part. I know, I know, but part of it was, it happened so fast, he took my breath away. Yeah. And I think it read, you know, it read.
01:48:23 Speaker_06
Okay, your face betrayed you.
01:48:25 Speaker_03
Yeah, and he smiled, and I don't remember if he showed teeth or not. But no, he just like, that's who he is. Okay. And I turned, you know, I turned and I said, oh my God, that guy was so hot. And she said, I know.
01:48:40 Speaker_06
Callie was fucked up too.
01:48:41 Speaker_03
Yes. So we, so this is an undeniable situation.
01:48:45 Speaker_06
You should have gone back inside to talk to the third woman who entered.
01:48:49 Speaker_03
Well, I think they were together.
01:48:50 Speaker_06
Oh.
01:48:51 Speaker_03
Well, I don't know. There's no way to know.
01:48:53 Speaker_06
So this is a lost persons report.
01:48:55 Speaker_03
Exactly.
01:48:55 Speaker_06
If you opened a door for Callie and Monica at the farmer's market.
01:48:59 Speaker_03
Brentwood Country Mart.
01:49:00 Speaker_06
Brentwood Country Mart. Yeah.
01:49:02 Speaker_03
On Black Friday. On Black Friday. Probably around noon.
01:49:05 Speaker_06
Okay. Yeah. Contact, I guess, comment on this. I'll read it.
01:49:11 Speaker_03
I'll read them all. Okay, comment or.
01:49:13 Speaker_06
Don't. No catfishes.
01:49:16 Speaker_03
Exactly.
01:49:17 Speaker_06
I guess you'll be able to see the photo though, and you'll know.
01:49:20 Speaker_03
Oh, I'll know.
01:49:20 Speaker_06
And no one could fake it.
01:49:22 Speaker_03
No, because you know, when I walk through the world, I'm extremely unobservant. I don't notice people. You're blind, basically. I really am. Yeah. And speaking of blind, I got some soap in my eye this morning, and it was- Blinding?
01:49:34 Speaker_03
I thought I did some permanent damage.
01:49:37 Speaker_05
Of course.
01:49:37 Speaker_03
Okay, so anyway, I walk around so unobservant, and yet this person was strong. Penetrated. He pulled me out. It was shocking.
01:49:46 Speaker_06
He's like a lifeline.
01:49:47 Speaker_03
He was, he was so attractive.
01:49:50 Speaker_06
How many more times did you think about him?
01:49:52 Speaker_03
That day? A lot of times. A ton. Yeah.
01:49:55 Speaker_06
Did you like whip up fantasies? I know you're prone to fantasies.
01:49:57 Speaker_03
I am prone to fantasies. I, I didn't actually, I was more just like.
01:50:02 Speaker_06
Thunderstruck.
01:50:03 Speaker_03
I was just taken.
01:50:04 Speaker_06
Love at first sight.
01:50:05 Speaker_03
A little bit. Oh my God. And I don't even believe in that, but, like, maybe. Anyway, that was a big mystery.
01:50:11 Speaker_06
Yeah, wow. And how often have you thought of him since then? Daily or once every few days?
01:50:19 Speaker_03
No, it's starting to dissipate. And I don't remember him at all.
01:50:22 Speaker_06
I sure hope he reaches out in the comments.
01:50:24 Speaker_03
Me too.
01:50:25 Speaker_06
Also, no bullshit, no catfishing.
01:50:26 Speaker_03
Yeah, guys, seriously.
01:50:28 Speaker_06
Stop catfishing, everybody.
01:50:30 Speaker_03
Seriously. Okay, anyway, so that, we'll add that to the mystery pile with the guy I met in New York, the restaurant guy.
01:50:39 Speaker_05
Oh, right.
01:50:40 Speaker_03
That mystery is also.
01:50:41 Speaker_05
Catfish is so delicious. What? Did you ever eat a big catfish sandwich, Monica?
01:50:49 Speaker_03
I don't give him permission to say my name.
01:50:52 Speaker_05
What do you want him to call you?
01:50:53 Speaker_03
I don't want him.
01:50:54 Speaker_06
Don't let him decide.
01:50:58 Speaker_03
Okay, this is for Fei-Fei Li.
01:51:01 Speaker_06
Oh, and a ding, ding, ding, we just interviewed someone who knows her intimately. Yeah. Not intimately, I don't mean sexually. A colleague, we just interviewed a colleague.
01:51:14 Speaker_03
And he was giving her a lot of props and reverence that she really deserves. I loved her so much.
01:51:20 Speaker_06
I loved her so much too.
01:51:22 Speaker_03
She was a delight. Yeah. Now some facts for her. How long did Einstein live at Princeton? He lived in Princeton, New Jersey for 22 years, from 1933 until his death in 1955. He purchased a house at 112 Mercer Street, which became his home until his death.
01:51:39 Speaker_03
The house was for him, his wife Elsa, stepdaughter Margot, and secretary Helen Dukas.
01:51:44 Speaker_06
His secretary lived with him.
01:51:46 Speaker_03
I guess so.
01:51:47 Speaker_06
Interesting.
01:51:47 Speaker_03
I bet it's more like an assistant. Nowadays we'd call it an assistant.
01:51:52 Speaker_06
You're probably right. Yeah, I guess secretaries were just assistants.
01:51:56 Speaker_03
Okay. Oh, we talk about Cambridge Analytica, which was the whole thing that happened with Facebook. I encourage people to listen to Acquired, the podcast Acquired. They do an episode on Meta, fantastic episode.
01:52:13 Speaker_03
And they talk about what happens with the Facebook Cambridge Analytica scandal. And a lot of it's very misunderstood. A lot of what the public thinks, we're all missing a ton of information.
01:52:24 Speaker_06
It's kind of like the Martha Stewart thing.
01:52:26 Speaker_03
Exactly.
01:52:26 Speaker_06
Where we all think she traded her company
01:52:29 Speaker_03
Exactly, and that's not what it was.
01:52:30 Speaker_06
And she didn't, nor did she even do any insider trading. I know. And she still went to prison. I know. Yeah. But yeah, the nefarious activity was on Cambridge Analytica, not Meta.
01:52:40 Speaker_06
Right, but also they- And they were just using existing tools that anyone could have been using.
01:52:44 Speaker_03
But they were using old information from an old quiz. Oh. Or a quiz or something that Facebook did a long time ago, and that's what they used. Okay. They weren't using current information. And yeah, they, like, Facebook didn't sign off on...
01:53:02 Speaker_03
They didn't hand over this information.
01:53:04 Speaker_06
Yeah, yeah, yeah. Everyone should listen to Acquired, just period.
01:53:06 Speaker_03
It's such a good podcast.
01:53:08 Speaker_06
It is, it is.
01:53:09 Speaker_03
I'm always shocked.
01:53:10 Speaker_06
Yeah, if you like a deep dive, that's the show for you.
01:53:13 Speaker_03
In the business world, like learning. I mean, you listen, I mean, they're four hours long. Meta was six. Meta was six, yeah.
01:53:19 Speaker_03
They spend a month researching a company and then they just tell you everything about the business and how it came to be and all of it. And you do leave feeling like you went to, like you took a course in business.
01:53:31 Speaker_06
Oh, big time. Yeah, yeah, yeah.
01:53:33 Speaker_03
I recommend Acquired. And that's it.
01:53:37 Speaker_06
That was it?
01:53:37 Speaker_03
Yeah.
01:53:38 Speaker_06
Okay. I just adore her. I wish her the best.
01:53:40 Speaker_03
Me too.
01:53:41 Speaker_06
I'm grateful for her.
01:53:42 Speaker_03
Me too.
01:53:42 Speaker_06
Yeah.
01:53:43 Speaker_03
That's the line we learned from the Lisa Kudrow fact check, the one we say to people: I'm just grateful for you. I'm grateful you exist.
01:53:50 Speaker_06
Oh, yeah, yeah, yeah, yeah. Now we know. And I'm grateful for her existence.
01:53:53 Speaker_03
Yeah, me too.
01:53:54 Speaker_06
Okay. And Toto's a great dancer. Don't listen to anybody else. Love you.
01:54:00 Speaker_03
Love you.