Armchair Expert with Dax Shepard - Fei Fei Li (on a human-centered approach to AI)
Episode Date: December 11, 2024
Fei-Fei Li (The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI) is a computer scientist, co-director of the Stanford Institute for Human-Centered Artificial Intelligence, and is considered by many to be the godmother of AI. Fei-Fei joins the Armchair Expert to discuss her initial reluctance to tell her personal story as part of her book on AI, starting a laundromat with her parents to support themselves, and the high school teacher who changed the course of her life. Fei-Fei and Dax talk about how math is the closest thing there is to magic, why being fearlessly stupid is sometimes the best asset you can have, and the reason her north star is asking the audacious question. Fei-Fei explains her perspective on the tech-lash, why there is so much humanness in everything we do in technology, and how essential it is to put dignity into how we both create and govern AI.
Transcript
Welcome, welcome, welcome to Armchair Expert, experts on experts.
I'm Dan Rather and I'm joined by Modest Mouse.
Hi.
Hello.
I've been talking about this book quite a bit over the last six months,
The Worlds I See: Curiosity, Exploration, and Discovery at the Dawn of AI
by Dr. Fei-Fei Li.
She is an expert on computer vision, machine learning, and cognitive and computational neuroscience.
She is a computer science professor at Stanford University
and founding director of Stanford Institute
for Human Centered Artificial Intelligence.
So, we've had a lot of AI, but I'll say that this,
what makes this episode so special is Dr. Fei-Fei Li's
personal story is so...
Compelling.
I mean, the fact that people can land in this country
not speaking English deep into school
and pick it all up and then master all these fields.
Become better than everyone.
Oh my God, it's so impressive.
Yeah.
Oh, I loved her.
Please enjoy Dr. Fei-Fei Li.
We are presented by Amazon Prime.
It's more than just fast free shipping.
Whatever you're into, it's on Prime.
So keep listening, keep watching, keep on keeping on to hear more about why we love
Prime during the fact check. This is a very, very sinking couch.
I know.
We're off to a terrible start.
You want to swap?
You want to sit in this chair?
You can.
I feel like I'm going through a therapy very soon.
Yeah, well that is the goal.
We do want people to feel very relaxed.
Too comfortable, really.
Yeah, if the spirit moves you to lay down supine,
you're invited to do that.
I might, I woke up so early.
What time did you wake up?
Probably 5.40 something.
That's early.
Yeah.
What time do you normally rise?
Not that much later.
My alarm is 6.20.
Okay, mine's 6.40.
Oh, okay.
I'm aspiring, but you know what's funny?
Today it was 6.20.
You have kids.
I have kids.
How old are they?
Nine and 11.
Mine is eight and 12.
You have eight and 12?
Yeah, so we're in the same stage of life.
Boys or girls?
12 is boy, eight is girl.
How about you?
Girl, girl. Aww.
Yes, I'm so lucky.
Although 11 about to be 12,
I'm starting to get an inkling of what's coming my way.
Yeah, yeah.
Yeah, yeah.
In a house with three ladies.
Yeah.
In fact, yesterday was a very emotional day.
I'm barely hanging on.
What happened?
I don't know what's going on with my three ladies,
but all of them are in some kind of a...
Hormonal sling.
Hormonal turmoil.
Interesting.
And every variety, which is fun.
Have we started?
Yeah, yeah, yeah, yeah, yeah, yeah.
We're always recording.
We call it AVR, always be recording.
Always be recording.
Okay, so you have 12 and eight, and are they close?
They're very close.
He's a good big brother.
He totally is.
Yeah.
He's sweet.
Silvio's your husband, yeah?
Yes.
And does one of them have Silvio's personality
and one have yours?
Well, I can tell you one of them has Silvio's hair.
Okay, which is?
A lot of curls.
Curls, world of curls.
Yeah, world of curls.
Well, you're here because I read your book,
maybe two months ago.
I was having dinner with Ashton Kutcher.
Do you know who that actor is?
Yes, he just texted me last night.
He's like, you're seeing my friend, Dax.
Oh, good.
Yeah, so we were at dinner
and we were just chatting about people
we thought were really interesting.
And then he asked me if I had read your book and I hadn't.
I went into it thinking I would get a history lesson on AI,
which I did, and a very thorough one,
but I would not have invited you for that.
Your life story is so interesting and beautiful
and the way you write about it, you're an incredible writer.
Thank you.
I do want to give credit to my friend, Alex,
who co-wrote the book with me.
Okay, that's very big of you. Who's Alex?
Alex is a friend of mine.
He does not claim to be a writer,
but he's a very talented writer.
We've known each other for years and he loves AI and we talk about it.
So when I was invited to write this book,
I do feel like I want Alex to co-create this with me.
So we became a creative team.
It was a really fun process.
How do you know Alex?
I know Alex through TED.
2015, I gave my first TED talk.
I watched it.
Yeah, thank you.
Yeah, yeah, about image,
how hard it is for a computer to see.
Yes, and that's how I got to know Alex.
Because he worked with TED in some capacity?
Yeah, he was in some kind of partnership with TED,
and he was helping me to put together my slides.
Since then, we've become friends, and we talk about AI,
and he's also helped me in some of the Stanford,
HAI, Human-Centered AI Institute stuff.
You're kind of creative partners.
Yes.
It's very interesting, because book writing
is very different creation compared to doing science.
We wrote for almost three years. For two and a half of those years, during the day I do my science
and some of the evenings I do the creative writing. It's such a different part of the brain.
Yeah, which one do you find more exhausting?
Both.
Both in different ways.
No, but they are very different.
Of course, I've been a scientist for almost three decades, so I'm more familiar with being
a scientist, but the creative writing journey, I loved every minute of it.
When I say I love, it's not necessarily happy love.
It's like painful, love-some part of it, but I really loved it.
That's what I want to start with, because I'm curious, when you sat down with Alex, I'm
sure the historical part, the scientific part,
that stuff is probably easy,
but had you ever told your life story to anyone
in that detail?
No, and I didn't want to, and I still don't.
Yeah.
Do you think that's a personal disposition
or where you come from culturally?
I think of the story of your father, which we'll get to,
and how little he told you about his own childhood
until the time was right.
And I glean from that,
well, this isn't a culture that is just divulging
all this emotional trauma and baggage.
Well, I have to say, I think culture,
in the case of an individual,
sometimes is too broad a brush stroke.
I think it's more individual.
I'm a relatively shy person,
and Alex and I wrote the first version.
It was purely scientific.
It was the first year of COVID.
We talked on the phone almost every night.
And one of my best friends is a philosopher at Stanford called John Etchemendy.
He's a very revered higher education leader.
He was Stanford's provost for 17 years.
And he is co-director with me
at the Stanford Human-Centered AI Institute.
So I was really proud I wrote this first draft.
I showed it to him.
It was during COVID.
After like two weeks, he called me.
He said, Fei-Fei, you and Alex should come to my backyard.
That's how we meet because we were social distancing.
And then we went and he said, you have to rewrite.
I was like, what?
That's the last thing I want to do here.
He said, you're missing an opportunity to tell your story,
tell a story through your lens.
And I was just so rejecting that idea.
I was like, who wants to hear my story?
I want to write about AI.
I call him Etch.
Etch said, there are many AI scientists
who can write a so-called, quote, unquote,
objective story of AI,
but your angle would be so meaningful,
your voice to the young people out there,
the young women, immigrants,
people of all kinds of backgrounds.
And we were sitting in his backyard
with a triangular shape, three chairs,
and I looked at Alex, he was almost jumping off his chair.
With excitement.
He said, I told you.
He said, I told you so many times.
Of course it only takes Etch to tell you that.
Well, let's jump to a really big philosophical question
about that.
I think when reading your story, you came here, this huge language barrier, such a fish
out of water, but your work, if good enough, would speak for itself and it would be a meritocracy.
And so it's not surprising to me that someone who got to where they wanted to go with that
belief would have a hard time thinking,
wait, I was trying to transcend this otherness.
This otherness is the thing that would be most interesting
and worthy of attention and affection.
What a gap.
It's very subtle you caught that
because when I go into the world of science,
I don't think too much about many other things.
I just follow that light, follow the curiosity.
And to this day, even when I was writing the book,
it's AI that fascinates me and I wanted to write about AI.
So it was very strange that someone wants to read about me.
Yeah. Well, I think even the notion
that you're struggling so hard,
I got to set up your story more.
This is the last thing I'll say out of context.
Monica's like, not everyone's read the book.
But just, of course math was appealing
because math didn't have a language barrier.
Yes, but I do wanna be honest,
even when I was a young kid in China,
I loved math and physics.
I love physics, I would say even more than math itself.
I saw math more as a tool.
I saw more beauty and fascination in physics.
Yeah, there's more philosophy.
Yes.
Okay, so let's start in China in 1976.
You're the only child of your parents.
Yeah.
And talk about your mother, because she's very interesting.
She is very interesting.
My mom came from a normal family,
but as the book says,
our family is in a difficult position because of the history.
So she was a very good student.
I think the intellectual intensity I have, a large part of it comes from my mom.
She was a curious student. She was very intense.
But her dream was pretty shattered when she was not able to go to a normal high school,
when she had a dream for college. And that carried her through.
And then you arrive and you show this great aptitude. And now she has, in a sense,
a second chance at this dream. But she starts recognizing pretty soon,
your path is going to be stilted as well if you stay there. So what's happening?
What is she noticing as you start getting educated and show this aptitude?
A lot of this is hindsight because I didn't talk to my mom in this way, right?
I think it was a combination that my mom has her own longing to be more free, maybe.
And in hindsight, I don't know if she knew
how to translate that in the world she was living in.
And the opportunity to go to a new world
was as appetizing to her as it was for her on my behalf.
It's also true she saw me as a bit of a quirky kid.
I think that blend of what she was longing for
and what she was longing for on my behalf
without me realizing was the motivation
of many of the changes, the decision of immigration.
Well, what would have been your trajectory
had you stayed in China in 1988 when you're 12?
Am I misremembering that your mom felt like
they weren't giving you the attention and encouragement
that she was hoping you would get?
My mom was not looking for attention for me.
My mom was looking for freedom for me.
And for herself, a lot of that is projection,
was looking for a world where I can just be who I am.
She wasn't necessarily looking for attention.
What do you mean by quirky?
This goes to my dad.
I didn't follow rules in the average way.
In hindsight, maybe it was just me being immature.
Sure.
But also there is a part of me,
why should girls not play soccer?
Why should girls be told they are
biologically less smart than boys? I was told at least more than once, watch out
that girls will in general be less smart by the time you hit your teenage time.
This is what I'm remembering from the book that you are explicitly told you're
not as smart as boys.
I wasn't told in the context of one-on-one,
like let me sit you down Fei Fei and tell you,
I was told in the way that teachers will say things
to boys or the context.
Society had a whole different expectation for boys.
I was very lucky my own family protected me,
but they can only protect
me so much as soon as you enter a school system, as soon as you interact with
society. All that came through. From that point of view, I was not following the
normal path. I was reading different books. You know, I was so passionate about
flights, UFOs, physics, special relativity, I would grab my classmates
to talk about that, but that was just not normal.
Yeah.
Who was exposing you to all that stuff?
That's a great question.
I was trying to ask myself that question
when I was writing the book,
and I still don't have a strong answer.
I think the early curiosity, the exposure came from both my parents.
My dad loved nature.
My mom loved books and literature.
But how did I fall in love with physics and UFOs and all that?
I'm not totally sure.
It could be my dad before he came to New Jersey.
He was ordering me some magazines outside of the school reading and that exposed
me to those topics.
Because my parents protected my curiosity, when I say protected, it really just meant
they left it alone.
They didn't meddle with it.
I kind of followed it through myself.
So your dad leaves when you're 12, he goes to New Jersey, he's there for three years on his own,
and he is setting up a landing for you and your mother, yeah?
Yeah.
Do you remember that three years missing him terribly?
How was that experience?
It was tough, I mean, it was early teenagehood,
there was no internet, phone calls are extremely expensive,
to the point of being prohibitively expensive.
So it was mostly letters every couple of months.
But then I was a teenager, so I had my own world to explore as well.
So it wasn't like I was sitting in the room crying or anything.
So then you come your sophomore year.
Yes.
You start a public high school in New Jersey.
Parsippany High School.
One of the experiences you had, I came in and told Monica immediately about,
and you were in a class, some kind of a study hall or something.
Library.
Library.
And you were with a group of other ESL kids, English as a second language kids.
And you saw a very, well, no, I want to say how insignificant this first interaction was,
like benign brushing up against a kid's backpack or something.
And what happened?
A group of ESL students were in the library and then the bell rang or something.
We have to file out of the library door.
And I remembered it was crowded.
I honestly did not see what happened to that boy,
but all I knew was my ESL friend was on the floor.
By the time I realized there was some commotion,
he was being kicked and punched.
I think his nose was bleeding and he was holding his head.
Yeah, you said he had a concussion and a broken nose.
And there's two boys kicking him.
Yeah.
And that's not even maybe the most traumatic part.
It's that after he's gone for a couple weeks,
he comes back and he's just not the same boy.
Yeah, I mean, nobody would be the same after going through something like that.
Definitely, it's a huge impact.
It was an experience that was definitely pretty intense for all ESL students.
Nobody felt safe for a long while.
Yeah, I think it changes your worldview on a dime, which is,
ooh, this new place I'm in can get pretty violent and a little out of control.
And if you're other, this could happen. I have to imagine. Yeah, it's
an incredibly scary recognition of where you're at.
Yes, but also I do want to get more colors, right? I love that your show focuses on the messiness
of being human.
Being messy is being multi-dimensional.
But it was also an environment where
there was so much support,
there was so much friendliness,
and there was also so much opportunity.
So it was very confusing.
I'm not trying to say that experience itself is not heavy.
I don't feel lucky about that experience. And I mean, there was anger and all that. But in the
meantime, the fuller context of that community was also quite a supportive community. So it was very
confusing. It gave me the multi-dimensionality of the new country I landed in. Everything's happening.
A lot of opportunity is happening as promised, and then a lot of xenophobia and violence
is happening.
Right.
Yeah, did you feel like you had to, after that, sort of like keep your head down?
Maybe it's just my own personality.
I always felt I had to keep my head down.
Right.
Especially as an immigrant.
Sometimes I feel that way even now, especially given the AI world we live in.
I feel I need to keep my head down to do work.
Of course, that particular event probably added a layer
of complication, at least for a while.
But they also taught me you have to stand up for yourself.
They did open different insights to me.
I don't know if you would rank these things in your life
of like serendipitous things happening,
but meeting Mr. Sabella has to be minimally in the top 10
and I would hope in the top three.
Yeah, it's possibly in the top three for sure.
Meeting Mr. Sabella was so lucky for me.
Yeah, I find this to be one of the sweetest stories
I've ever read about and kind of makes me hopeful
for people, how generous they can be.
But in a nutshell, minimally you're thinking
I'm gonna do good at math,
I don't have to go to my dictionary back and forth
like I do in every other class,
and you're in math and you're getting problems wrong
and you yourself cannot identify any pattern in this, you don't
know what's going on, and you go to see Mr. Sabella in his office.
That's your teacher?
Yes, Mr. Sabella was my math teacher.
I got into calculus, and Parsippany High School doesn't have Calculus BC.
We only had AP Calculus AB, so he had to teach me BC during his lunch hour.
But this story you're talking about was earlier,
it was during some pre-calculus stuff,
and it turned out I was using a broken calculator.
They had gotten in a garage sale.
Her father loves garage sales,
it was his favorite thing in the world.
I know.
Every weekend they'd go. He still does.
I love garage sales. I don't have time to go to garage sales, but I love it.
Mr. Sabella was tough.
He is a tough love kind of teacher.
So even though I was ESL, I was this mousy girl,
he didn't think I needed any extra.
He didn't pity you.
No, not at all.
For one quarter of a semester, I got an 89.4-something. I still remember that.
I was like, oh god, 0.6 more and I would get at least an A, and he would not give me that A.
I asked about extra credit and he was like, get real. How about a good grade on the test?
Yeah, he would say there are many smart kids in the class.
You just have to work hard.
But it sounds in the retelling like the breakthrough,
and I think this scene would be in a movie,
if I were writing the movie.
You're there, he discovers the tan symbol
on your calculator is malfunctioning.
He helps her figure this out,
because he too can't figure out the pattern
of all these errors.
And then somehow you guys start talking about books,
and he asks if you've read a certain science fiction writer.
You try to tell him, you haven't read that one
but you really love A Million Kilometers Under the Sea.
You can't translate it and you can't pronounce Jules Verne.
But he figures out you've read Jules Verne.
Yes.
And he is like shook.
He's like, you've read Jules Verne?
Yes.
And then you go on to say, yes,
and you've read Hemingway, and you've read everything.
Well, I've read everything my mom gave me,
which is a lot.
Which was extensive.
Yeah.
If I were him, and this young girl from China comes in,
and she has read most of the classics,
that's a real like, what am I dealing with here?
I gotta imagine for him at least,
that was a moment where he's like,
okay, I'm betting on this horse
I think he saw a person he can befriend, just the way I saw in him.
Later on I realized, again, this is hindsight,
that he does that to so many students, and he used this way of
opening up, in different ways, not necessarily science fiction or
classic literature, to really get to be so helpful. For him and me, beyond math
and the calculator, when we started talking about science fiction and the English
classics, he realized that he was seeing me as more than an ESL kid at that point.
And he's also a shy person himself.
Later, his wife, Jing, and Bob, later I call him Bob,
we became such good friends.
For many, many years.
Jing said he's such a bookworm,
even during his family parties,
he'll be by himself reading.
So he's totally an introvert
in a way that we just had chemistry.
But this is one thing I was not able to fit into the book,
is that for years he would keep a diary.
And his diary talks about just his teaching life.
And I know in this diary there are so many stories
about different students he helped with,
not in the sense of bragging.
It's just he's a writer, right?
Yeah.
So years later, before he passed away, we didn't know he was going to pass away.
I told Bob, I said, Bob, you've got to turn this into a book.
Of course we could anonymize, but this is an American teacher story of so many students,
many of them are immigrant students because they lack the support,
they lack the family.
Some of them are in high school by themselves,
family is overseas.
Many of them are like me.
Parents are so busy that the students
don't have that emotional support.
And he supported so many students.
I can sit here and tell you
an endless story.
And then he wanted to translate that into a book,
but somehow he just couldn't bring himself to do it.
Maybe he's too shy, maybe he's too humble.
Yeah, I think he's struggling with the same issue
you're struggling with. Probably.
You don't feel entitled to tell this story.
Right, I feel so strongly he needed to write this book.
I almost felt like one day I would write it for him,
but of course he passed away so suddenly because of the brain tumor.
So when I was writing my book, I realized, let me tell the Bob Sabella story.
Let me tell the story on behalf of so many American public school teachers.
They don't have much of a voice.
Nobody knows their name, but they work above and beyond every day
for the students in their community.
They don't care which part of the world they come from,
which kind of family background they come from,
but they invested so much in these students
and they changed lives.
Yeah, they're very unsung heroes.
They're not tenured professors at elite universities.
They're totally unsung heroes.
And they're the ones that get the people
to those destinations.
Yeah, it's a really beautiful story.
How instrumental was he in you finding your way to Princeton?
He was instrumental more than Princeton
because he was instrumental as the second dad.
He helped me to be grounded.
When you're an immigrant kid, an ESL kid,
you land in the country without speaking the language
and going through so many things.
It feels so unstable.
I think you're underplaying your story.
If you came here in seventh grade
and ended up at Princeton, that's one story.
You had two years to get yourself ready.
To learn English.
To start Princeton, and you didn't speak any English.
You're very much underplaying it, which is fine, I think.
So would your teacher.
It's admirable.
Yeah, you feel maybe that self-indulgent or something,
but that's really bonkers.
Again, AI aside, to land and go,
okay, if you drop me in Russia and told me
I have two years to land at their most elite university.
Moscow State University.
It's not gonna happen.
It's not gonna happen for 99.999% of people.
Let's talk for a second about you going to Princeton.
This is another fun moment for me in the book
because there's something so much more important
about Einstein than the theory of special relativity.
And I can't really articulate what it is,
but I know you have a good dose of it.
So what was it like going there
and seeing the statue of Albert Einstein
and imagining that you would in some way
be touching that reality?
So the first time I saw the statue of Albert Einstein,
before I was applying for college,
it was probably early junior year.
My dad continued to find things for us to do that's free.
It's very important it's free.
Princeton's Natural History Museum was free.
That's why we went there.
Garage sales, free.
Exactly.
Museums free.
Yes.
Seeing Einstein's statue was kind of symbolic for me
that I'm getting back to where really my soul wanted to be at.
Because as a teenager, landed in a new country, trying to learn language, deal with all the
messiness, you know, Chinese restaurant, walking dogs.
You're working a ton of hours.
Yeah, exactly.
I didn't forget about physics.
I was taking physics class in school, but I forgot about the sense of love.
Romanticism.
Yes, it really is that first love.
And it kind of got me back to that, rekindled something.
Well, don't you think it left an imaginary word
where this person existed,
and it put it in your own three-dimensional reality?
Yes.
Suddenly I feel so much closer to that person,
and that person symbolizes the entire world of physics.
I feel so much closer.
I was literally in Princeton, right?
So that felt very different.
And he lived there for what, 30 some years?
Yeah.
Maybe more.
I think that would be a special moment as well.
I'm sure you watched the movie Oppenheimer.
Yeah, yeah.
Do you remember the opening scene
was Einstein in front of that little pond?
Yep, yep.
Talking with Oppenheimer.
Right, he was first there by himself.
I call that my pond, that pond literally exists.
It was very close to my dorm.
By the time I got to Princeton,
and I would go there a lot,
because I knew that was close to the Institute for Advanced Study
where Einstein worked.
Yeah.
So like when the scene came out,
Silvio was sitting next to me.
I'm like, Silvio, this is my pond.
Yeah.
Yeah, such a full circle.
Yeah, yeah.
Yeah.
I'm currently stuck in a rut
where I'm learning a lot about physicists,
historical physicists, and I'm wondering,
have you read, When We Cease to Understand the World?
Have you read that book?
No.
Or have you read The Maniac?
Either of those?
No.
Oh!
The Maniac's all about John von Neumann.
I'm reading a different bio of him,
but not The Maniac. You are?
Which one?
Oh, it's in my phone.
Yeah, same, don't worry about it.
This one's fun, because it has the perspective
of a million different people in his life.
Like, a student, friends from his school,
one of his wives, people who worked with him,
and you get this really comprehensive view.
Another Princeton guy.
Yeah, I'm obsessed with all these guys.
And then when we cease to understand the world
is many of these physicists who were so brilliant at a time
who ultimately became crazy.
And how many of their breakthroughs
in the math of quantum mechanics,
coming to this guy in a nine-day, 106-degree fever,
writing down the matrices and not understanding the math
when he comes out of it, but it holds.
There's a lot of weird magic in this space, I think,
where people have these breakthrough thoughts
and they touch some understanding
and they're in a compromised state mentally.
It's just fascinating to me.
It's like mystical.
Yeah.
Physics is absolutely the discipline that pushes you
to think so audaciously that you have to transcend
the immediate reality.
Yes.
That's what I loved.
I loved about Einstein.
I loved about modern physics.
Even Newton, classic physics,
you have to think so beyond the immediate reality.
Although stories of him getting asked a question
and then answering it two and a half days later
and he hasn't left the chair and the person left.
Like he went away for two and a half days
and then came back with the answer.
Or just the notion, I think one of the most intriguing parts
is like, you're going to have thoughts
that cannot be expressed in language
but can only exist in math. That already is like what?
There is actually even beyond math.
Right. And then there's a realm beyond math.
Yes.
It's the closest thing I think we have to magic, where it's like completely outside of our grasp,
but for a handful of people.
I love it. You call it magic. It is also the furthest thing we have from AI, that humanity in us, that magic,
that creativity, that intuition, that almost ungraspable way of intelligence.
Yes.
We should keep that in mind.
So you're at Princeton.
You're also working a ton, right?
When do your parents start the dry cleaners?
So we started very quickly
right after my freshman year started
because my mom's health was going so badly.
They were working in Newark, New Jersey.
I don't know if you guys know that part of New Jersey.
From Parsippany to Newark, New Jersey
is a very difficult drive.
My mom's health was bad and it was long working
hours. I was really worried about them. The doctor was worried. We finally decided if we can do a
local thing in Parsippany, it'll be better for the family. And it was very important for me that the
business is a weekend business, because that way I can do the lion's share
of work.
But there are pretty much three kinds of weekend business for immigrant families like us.
Open a restaurant, open a grocery store, or open a laundry.
And restaurants require very late working hours, and grocery is
very early.
You have to go to Chinatown to get supplies.
So neither of these work for my mom's health.
Whereas dry cleaning was actually perfect
because it's a daytime business.
It's very long hours during the weekend,
but it's at least daytime.
And a lot of my mom's work,
especially when it comes to alteration,
she can sit in front of the sewing machine.
Because your mother had had rheumatic fever as a child
and it greatly damaged some of her heart valves.
So she was really struggling with heart issues.
Yes, she carried that illness with her all her life.
And there's no money in the dry cleaning.
There's only money in the seamstress scene,
whatever we call it.
The tailoring.
The tailoring.
I mean, there's no money in any of these.
Yeah, yeah, yeah. It's more in the spirit.
But having that tailoring ability was nice
because it helps a little bit.
And my mom is incredible.
She never learned this.
She was a bookworm and she's kind of a brainy.
She should have done what you did.
Right.
I don't think she would love physics.
But you know what I mean.
She should have probably been an academic.
Yes, she would have been an academic. But then she just kind of figured out tailoring by herself.
I still don't know. Like I tried, I could not. The only thing I can do is sit there and un-stitch
things for her.
Sure, I think a chimp can do that.
Thank you.
Yeah, exactly. I say that because I know how to remove stitches from garments.
And I don't have more skills than a chimp.
Yeah.
So we opened a dry cleaner shop during the middle of my freshman year.
And that became my entire memory of my undergraduate.
Here's a fun fact.
Princeton is organized by residential dorms.
I lived in one of them called Forbes.
It turned out Forbes is very famous for its Sunday brunch.
I didn't know there was a Sunday brunch
because I was home doing dry cleaning.
You said you didn't go to a single party.
Right, but then when I went back to Princeton as a faculty,
Forbes was very kind.
They made me a faculty fellow and I discovered Sunday brunch.
That was like 15 years later.
Instead of the freshman 10, you gained the 30s 10.
Yeah, exactly.
The faculty 10.
So I felt so good.
I finally got my Sunday brunch.
And I think it's worth mentioning
when you guys were trying to open that dry cleaners,
you were trying to raise $100,000
and you were $20,000 short.
And again, Mr.
Sabella.
Monica.
He gave the money.
Yeah, it was a total shock.
To this day, actually, as a 19-year-old,
as much as I appreciated Jing and Bob,
I did not realize the extent.
We're talking about late 1990s.
They are two public school teachers
with two kids about to go to college.
It's unimaginable.
He asked Jing, and they decided to do that.
I mean, at that moment, I was very, very grateful.
But now after I became a grownup, this is unimaginable.
It's impossible that someone would do that.
Especially, he later told me,
I think when I was returning the money,
he said, I didn't realize you'd be able to return it.
I was like, what?
Of course, you have to give it,
thinking you'll never get it back.
I guarantee he and his wife were like,
we're giving this money away.
I did not know that.
He did use the word lend.
And of course, in my mind,
I was like, of course I'm gonna return it. Like, I'll do anything to return it.
But Jing and Bob could not have assumed that.
So the money was being raised to help your mom?
No, to help my family to start the business.
We as a family. I still consider myself the CEO of the dry cleaner.
You have to claim yourself
to be a CEO of something.
A C-suite member.
Yeah, exactly.
So Bob and Jing, it's incredible.
I don't even think their kids knew about this
till they read my book.
Wow.
Oh my God, how proud I'd be of my dad.
Okay, so you graduate from Princeton
and you have a degree in physics, as well as some kind of computational...
Yeah, so Princeton is a quirky school. It didn't have minors. It has these certificates, but they're just minors.
I had computational mathematics as well as engineering physics minors.
And when you were there, unless I'm misremembering, you had a very singular focus on being a physicist,
but while you're there, you start realizing you're maybe open to something different.
It's actually really interesting.
I never necessarily thought I would be a physicist, but I wanted to be a scientist.
That was almost a sacred calling for me.
It was an identity.
Yeah, it was an identity.
For some reason, this girl who works in dry cleaners just wanted
to be a scientist. And then I loved physics, but I loved physics for its audacity and curiosity.
I didn't necessarily feel I'm married to a big telescope inquiry. So I was just reading
a lot. And what really caught my attention was the physicists I admired so much,
Einstein, Schrodinger, Roger Penrose,
they actually were curious beyond just the atomic world.
They were curious about other things,
especially life, intelligence, minds,
and that was immediately an eye-opener for me.
I realized I love that.
Yeah, understanding how this brain works.
Intelligence works.
It's crazy the overlap that has now been proven,
but at that time, that's not an obvious,
we haven't figured out neural pathways
and we're not gonna map that onto computers yet.
So these seem on the surface, very different fields.
One's biology and one is, you know.
Right, but for me it was the science of intelligence.
I always believed it's the science of intelligence
that will unite our understanding
of both the brain and the computers.
Right, okay, so then you choose Caltech
to go to graduate school.
What did you think of California?
I mean, my God, what a place, right?
I know we're 15 minutes away from Caltech here. So I was choosing among MIT, Stanford,
and Caltech, and honest to God, I almost chose Caltech because of the weather. Yeah. It was
so funny.
And the vibe.
Yes, the turtles, the garden-like campus.
And of course, I walk into this building.
I think it was Moore Building at Caltech.
And guess whose photo was there?
It was Albert Einstein.
Sure.
And I was like, what?
Calling.
He was visiting.
And of course, there was Richard Feynman, the Feynman lecturer.
So I just followed these physicists, apparently.
And New Jersey was cold.
And also I really have an issue with cold
because my mom's illness is exacerbated by cold.
So every winter she suffers a lot.
So I have this negative affinity to coldness
coming from taking care of my mom.
So coming to Southern California,
I was like, oh my God, I love this place.
Did your parents come with you?
Later they did. In the middle of my grad school they did.
Were you worried about leaving?
I had to switch from being on-site to remotely running the dry cleaning. The dry cleaning had
stabilized; the customers were all returning customers. So my mom would be able to handle it with one part-time worker.
And Bob Sabella was doing bills for my mom.
Oh my God.
Yeah.
He was just helping me.
And another thing he helped me as a young graduate student,
I would be entering the world of writing scientific articles.
That's pretty intense.
He would still proofread my English for me, all my papers.
Tell me about North Star and how you discovered yours,
because this happens at Caltech.
Yes, the prelude to the North Star was my education:
physics is always about asking the right questions.
If you go to the Nobel Museum
in Stockholm, there is an Einstein quote about much of science is asking the
right questions. Once you ask the right questions, solutions follow. You'll find
a way for solutions. Some people call it hypothesis-driven thinking. I've always
been just thinking this way. So as I was studying computational neuroscience,
as well as artificial intelligence at Caltech,
I was always kind of seeking,
what is that audacious question I wanted to ask?
And of course, my co-advisor Pietro Perona and Christophe
Koch, they were great mentors guiding me.
But many things start to converge, not just my own work,
but the field, people working on visual intelligence
from neuroscience, from AI, start to orbit around this idea
that the ability to recognize all kinds of objects
is so critical
for human visual intelligence.
When I say all kinds of objects, I really mean all kinds.
I'm sitting here in your beautiful room.
There's table, bottles, couch, pillow, a globe,
books, flower, vase, plants.
T-Rex skeleton.
Okay, that's behind you.
It's about to eat you.
That's great, yes.
The shirts and skirts and boots and TV.
So the ability for humans to be able to learn
such a complicated world of objects.
Oh, millions and millions of objects.
Yes, it's so fascinating.
And I started to believe along with my advisors,
this is a critical problem for the foundation of intelligence.
That really started to become the North Star of my scientific pursuit,
is how do we crack the problem of object recognition?
Okay, so now I think this is a great point to just go through a couple of the landmark events
that take us to where the technology is at that time.
So I guess we could start with Turing,
we could start in 1956.
Give us a couple of things that have happened in computing
up to that point.
Right, so that's the parallel story I was writing in the book.
Now that I have people hooked into you as an individual,
now we can get a little protein in this
and learn some stuff.
Right, well, the field of computing,
thanks to people like Alan Turing,
von Neumann, was starting around World War II, basically.
Of course, for the world of AI,
a very important moment was 1956,
when what we now call the founding fathers of AI,
like Marvin Minsky, John McCarthy, Claude Shannon,
they get together under, I believe,
a US government grant.
DARPA funded or something?
DARPA funded to have a summer long workshop at Dartmouth
with a group of computer scientists.
At that point, the field of AI was barely kind of born,
not born yet.
They got together and wrote this memo
or this white paper about artificial intelligence.
In fact, John McCarthy, one of the group leaders,
was responsible for coining the term artificial intelligence.
I think we could get even more rudimentary, right?
So up until that point, a computer was something that could solve a problem.
It could do computations.
It could calculate.
And this notion of artificial intelligence, what it really meant is,
could we ever ask a computer questions that it hadn't been pre-programmed to answer?
What are the hallmark things that separated at that time
artificial intelligence from just computing?
Because I think we've just fast forwarded
to everyone saying AI,
and I don't think they really even take a second
to think of what that step is
between computing and computation and thinking.
Right, up to that point, you can think,
no matter how powerful the computer was,
it was used for programmed calculation. So what was the inflection
concept? I think two intertwined concepts. One is reasoning. Like you said, if I ask you a question,
can you reason with it? Could you deduce: if a red ball is bigger than a yellow ball, and the yellow ball
is bigger than a blue ball, then the red ball must be bigger than the blue ball?
Right, without having been programmed that.
Right, without directly saying red ball
is bigger than the blue ball.
So that's a reasoning.
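The ball deduction she describes is the symbolic, logic-driven style of early AI: derive new facts from stated ones. A minimal sketch of that idea in Python (the facts and the rule here are invented for illustration, not from the episode):

```python
# Symbolic reasoning sketch: deduce "red is bigger than blue" from
# "red > yellow" and "yellow > blue" without being told it directly.

def transitive_closure(facts):
    """Repeatedly apply the rule 'a > b and b > c implies a > c'
    until no new fact can be derived."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for a, b in list(known):
            for c, d in list(known):
                if b == c and (a, d) not in known:
                    known.add((a, d))
                    changed = True
    return known

facts = {("red", "yellow"), ("yellow", "blue")}  # "x is bigger than y" pairs
print(("red", "blue") in transitive_closure(facts))  # True: never stated, but deduced
```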
So that's one aspect.
A very, very intertwined aspect of that is learning.
A calculator doesn't learn.
Whether you have a good 10 button or not,
it just does what it does.
You had a bad one.
Once I had a bad one.
So artificial intelligence software should be able to learn.
That means if I learn to see tiger one, tiger two, tiger three,
at some point when someone gives me tiger number five,
I should be able to learn, oh, that's a tiger,
even though that's not tiger one, two, three.
So that's learning.
But even before the Dartmouth workshop,
there were early inklings.
Like Alan Turing's daring question to humanity,
can you make a machine that can converse with people,
QA with people, question and answer,
so that you don't really know if it's a machine or
a person.
It's this curtain setup that he conjectured.
So it was already there, but I think the founding fathers kind of formalized the field.
Of course, what's interesting is for the first few decades, they went straight to reasoning.
So they were less about learning,
they were more about reasoning,
more about using logic
to deduce the red ball, yellow ball, blue ball question.
So that was one branch of computer science and AI
that went on through the years; it predated my birth
and continued through my formative years,
without me knowing. I wasn't in there.
Right.
But there was a parallel branch.
That branch was messier.
It took longer to prove to be right,
but as of last week, we had the Nobel Prize awarded to that,
which was the neural network.
So that happened again in a very interesting way.
Even in the 50s, neuroscientists were asking questions
nothing to do with AI about how neurons work.
And again, my own field, vision, was where the pioneering study happened,
on the cat's mammalian visual system.
And Hubel and Wiesel in the 1950s and 60s
were sticking electrodes into cat's visual cortex
to learn about how cat neurons work.
Details aside, what they have learned and confirmed
was a conjecture that our brain or mammalian brain
is filled with neurons that are organized
hierarchically,
layered.
They're not like thrown into a salad bowl.
Right.
That means information travels in a hierarchical way.
Up these columns.
Yes.
For example, light hits our retina.
Our retina sends neural information back to our primary visual cortex.
Our primary visual cortex processes it, sends it up to, say, another layer,
and then it keeps going up.
And as the information travels,
the neurons process this information
in somewhat different ways.
And that hierarchical processing
gets you to complex intelligent capabilities.
That's a mouse I'm seeing if I'm a cat.
Or this tiger sneaking up on me.
And I think this could be a bad analogy,
but you might be misled to think,
oh, well, a camera can take a picture
and then the computer can show the picture.
So the computer understands that's a photo.
But really, the camera has broken
what it's seen into thousands of pixels.
They are coded with a numerical sequence.
The computer reconstructs those colors.
It's a grid.
And virtually that's what our eyes do.
Our eyes are just grabbing photons
and they're sending back the ones and zeros.
And then back here in the cortex, it's assembling it all.
Yes.
And how did evolution assemble us
so that we can recognize all this beautiful world?
Not only we can recognize, we can reason with it.
We can learn from it.
Many scientists have used this example:
children don't have to see too many examples of a tiger
to recognize a tiger.
It's not like you have to show a million tigers.
Right, right, right, right.
To children.
So we learn really fast.
And as you point out in the book,
it took us 540 million years of evolution
to get this system.
Exactly. So just to finish, so the neuroscientists were studying the structure of the mammalian
brain and how that visual information was processed. Fast forward, that study got the
Nobel Prize in the 1980s because it's such a fundamental discovery. But that inspired
computer scientists. So there is a separate small group of
computer scientists who are starting to build
algorithms inspired by
this hierarchical information processing architecture.
You build one algorithm at the bottom that's maybe generic?
No, it's a whole algorithm,
but you build mathematical functions that are layered.
So you can have one small function that processes brightness,
another that processes curvature.
I'm being schematic.
And then you process the information.
But what was really interesting about this approach is that in the early 80s,
this neural network approach
found a learning rule.
So suddenly it unlocked how to learn this automatically
without hand-coding. It's called backpropagation.
And Jeff Hinton, along with others
who discovered this,
was awarded the Nobel Prize last week for it.
But that is the neural network algorithm.
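The "learning rule" she mentions can be illustrated in its simplest possible form: a single weight nudged repeatedly by the gradient of the error. Backpropagation is what extends this same update through many stacked layers; the numbers below are toy values chosen for illustration, not from the episode:

```python
# Gradient descent on a single weight: learn w so that w * x ≈ y.
# Backpropagation generalizes this one-parameter update to deep networks.
x, y = 2.0, 6.0        # one training example: input 2.0 should map to 6.0
w, lr = 0.0, 0.1       # initial weight and learning rate
for _ in range(100):
    pred = w * x
    grad = 2 * (pred - y) * x  # derivative of squared error (pred - y)^2 w.r.t. w
    w -= lr * grad
print(round(w, 3))  # converges to 3.0, since 3.0 * 2.0 = 6.0
```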
Could you think of it as almost a filtration device,
which is like this data comes in,
we filter out these three key points that then filters up,
and then we come to our conclusion at the top of this hierarchy.
You could actually.
Because it's just like all this raw info at the bottom,
and then we kind of recombine it into this layer
and then another process filters.
Well, it's not a school bus, it's not this.
You just keep filtering it.
Of course you combine it in mathematically
very intricate way, but it is like layers
of filtration a little bit.
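The "layers of filtration" picture maps onto a feed-forward pass: each layer recombines the previous layer's outputs into new values, building up from raw input toward a decision. This is a hand-wired toy (the weights are made up, nothing is trained), just to show the shape of the computation:

```python
import math

def neuron(inputs, weights, bias):
    # weighted combination of inputs, squashed to (0, 1)
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, layers):
    # each layer is a list of (weights, bias) neurons; output feeds the next layer
    for layer in layers:
        x = [neuron(x, w, b) for w, b in layer]
    return x

pixels = [0.9, 0.1, 0.4]  # raw "pixel" values going in at the bottom
layers = [
    [([1.0, -1.0, 0.5], 0.0), ([0.2, 0.8, -0.3], 0.1)],  # first filtration layer
    [([1.5, -0.7], -0.2)],                                # combines into one output
]
print(forward(pixels, layers))  # a single value between 0 and 1
```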
Okay, great.
So, and now also when you find your North Star,
another thing that's happening at the same time
is WordNet, right?
This is kind of a big breakthrough for early AI.
For linguistics.
So WordNet had nothing to do with AI.
It had nothing to do with vision.
But what happened for my own North Star
is that I was obsessed with the problem
of making computers recognize millions of objects in the world.
While I was obsessing with it,
I was not satisfied because my field was using
extremely contrived data sets,
like data sets of four objects or 20 objects.
I was really struggling with this discrepancy
because my hypothesis was that
we need to learn the much more complex world. We need to solve that deeper problem rather than
focusing on just a handful of objects. But I couldn't really wrap my head around that.
And then again, Southern California. I wrote about the Biederman number in my book. I read
a psychology paper by Biederman, who was, up till two years ago, a professor at the University
of Southern California. He conjectured that humans can recognize tens of thousands of
object categories. So we can recognize millions of objects, but categories are a little more
abstract.
Animal, food, furniture, German shepherd, transportation.
Yeah, sedan, fighter jet, and all that.
So he conjectured that, but that conjecture didn't go anywhere.
It was just buried in one of his papers, and I dug it out, and I was very fascinated.
I called it the Biederman number because I thought
that number was meaningful, but I didn't know how to translate it into anything actionable because
as computer scientists, we were all using datasets of 20 objects. That's it. And then I stumbled
upon WordNet. What WordNet was was a completely independent study
from the world of linguistics.
It was George Miller, a linguist at Princeton.
He was trying to organize taxonomy of concepts
and he felt an alphabetically organized dictionary
was unsatisfactory because, in a dictionary,
an apple and appliance would be close to each other.
But then apple should be closer to a pear
than appliance.
So how do you organize that?
How do you regroup concepts?
So he created WordNet,
which hierarchically organized concepts
according to meaning and similarity rather than alphabetical ordering.
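Miller's fix, organizing by meaning rather than spelling, can be sketched with a toy taxonomy: measure how far apart two words are through their nearest shared ancestor, so apple ends up closer to pear than to a kitchen appliance. This little tree is invented for illustration; WordNet itself holds over a hundred thousand concepts:

```python
# Toy is-a taxonomy: each word points to its parent category.
parent = {
    "apple": "fruit", "pear": "fruit",
    "fruit": "food", "food": "entity",
    "toaster": "appliance", "appliance": "artifact",
    "artifact": "entity",
}

def ancestors(word):
    # the chain from a word up to the root of the taxonomy
    chain = [word]
    while chain[-1] in parent:
        chain.append(parent[chain[-1]])
    return chain

def distance(a, b):
    # path length through the lowest common ancestor
    pa, pb = ancestors(a), ancestors(b)
    common = next(x for x in pa if x in pb)
    return pa.index(common) + pb.index(common)

print(distance("apple", "pear"))     # 2: siblings under "fruit"
print(distance("apple", "toaster"))  # 6: related only at the root, "entity"
```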
Does WordNet not lead to the machine
that can read the zip codes?
No.
It doesn't, what's that called?
That's what I meant to bring up.
That was ConvNet, convolutional neural network.
That's happening as you're getting your idea
about the images right.
We've trained a machine to read zip codes basically, handwritten zip codes. So that was Yann LeCun's work at Bell Labs. That was an
early application of neural networks in the 1980s and 1990s. That neural
network at the time was not very powerful, but given enough training
examples of digits, scientists at Bell Labs were able to read from 0 to 9, or the 26 letters.
And with that, they created an application to read zip codes to sort mail.
But its data set was, I forget, it was like a thousand or something, or wasn't that?
It was a lot of handwritten digits.
Yeah, and common mistakes, they would feed it.
That data set was probably tens of thousands of examples,
but we're talking about just letters and digits.
What they had proved in concept,
you're gonna try to do in images,
but the lift for images is so exponentially larger
than getting the machine to read.
Exactly.
By a factor of what?
I mean, when you lay out what it's going to take for you to prove this theory you have
and you figure out how long it's going to take, it's going to take like a decade of
you feeding them, right?
There's some moment where the amount of images you're going to have to feed this computer
to train it can't almost be done by the group of you.
So I think what you were referring to was the process of making ImageNet.
Yes.
And that process was, once we realized,
thanks to the inspiration of WordNet and also Biederman's number
and also many other previous inspiration,
we realized what computers really need is big data.
And that was so common today because everybody talks about big data, you know,
OpenAI talks about big data. But back in 2006, 2007, that was not a concept.
But we decided that was the missing piece. So we need to create a big data set.
How big is big?
Nobody knows.
My conjecture went with Biederman's number.
Why don't we just map out the entire world's visual concept?
Oh my God.
Yeah.
Why don't we?
And you wrangled someone in that this wasn't even really
their North Star.
Okay, so Professor Kai Li at Princeton,
he was very supportive of me.
He was a senior faculty, but what was really critical was he recommended his student to
join my lab, Jia Deng. And Jia was just a deer in the headlights. As a young first-year
graduate student, he didn't know what was going on. He got this crazy assistant professor, me,
and told him that we're going to create a dataset
that map out the whole world's visual concept.
He's like, sure.
I don't know what you're talking about,
but let's get started.
So he and I went through the journey together.
I mean, he's a phenomenal computer scientist
and many hoops we jumped through together.
It was just resolve that got us through.
This level of plodding that you were able to take on
is unique to you.
And I think it's moving here in 10th grade
and looking at that fucking dictionary back and forth
and back and forth and back and forth.
That kind of really unique dedication
and unwavering plodding.
A million other scientists could have had your idea,
but I think it's that thing right there
that makes you capable of creating ImageNet.
That's an interesting observation.
Yeah, it's not, I think we like to think of these things
very simplistically, like, oh, you had a great idea.
Who gives a shit?
A lot of people had great ideas in graduate school.
I do tell my kids, ideas are cheap.
Exactly, Hollywood.
Someone's like, that was my idea.
Oh really, did you write the script?
Did you execute it?
Did you cast it correctly?
Did you motivate everyone?
Your idea is 1% of the equation of a great movie.
Yeah, thank you for putting it that way.
Because when I'm reading your thing
and the data's coming in,
it feels like, and tell me if I'm mischaracterizing it,
the deeper you got into this experience,
you were just learning every day
it was gonna be harder than you originally anticipated.
It just kept getting worse and worse and worse
and worse for years, right?
It was pretty bad.
When I'm reading it, I'm like,
I would have quit a trillion times.
I'd be like, maybe computing will get to a point
where this job will be made easy,
but right now it's too hard.
How do you even start something like that?
Do you literally just look around the room
and you're like, okay.
Here we go.
Yeah. I'll start with this room and write down every single one.
Well, okay. So first of all,
I've had years of training as a scientist.
So after you formulate a hypothesis,
you do have to come up with a plan.
My PhD thesis had a mini version of ImageNet,
so I got a little bit of practice.
But yeah, our idea was to create a dataset and a benchmark
to encompass all the visual concepts in the world.
So we had to start with WordNet.
We had to figure out what is visual.
We had to figure out what concepts we need,
where to get the source images,
and how to curate them.
Every step of the way, like actually we were saying,
we were just way too optimistic at the beginning.
Naiveté is the best asset you can have.
Yeah, I was just fearlessly stupid.
Yeah, it's a great gift.
Yeah, and then Jia and I and other students
started to hit all these walls,
but Jia was the main student.
We had to just deal with every obstacle that came.
Now, science is a funny thing,
right? Sometimes serendipity makes a world of difference. What was really critical was
the Amazon Mechanical Turk, the crowdsourcing platform. Amazon, nothing to do with us, was
like, oh, we have all these servers sitting
in our data centers and we have nothing better to do.
Let's make an online worker platform
so people can just trade little tasks.
A marketplace for that computer labor.
Exactly, which I didn't know it exists.
I was in New Jersey, Princeton,
and trying to pull my hair out.
And then some student who did his master at Stanford came to Princeton and just
mentioned it casually and said, do you know this thing?
That was really, really quite a moment for me.
Yeah.
That cut this process down by 80% or something.
Yeah.
10x.
That was one of the technical breakthroughs that really carried this whole project.
They're years down the path and they're calculating how much further it's going to be.
And they know they have years and years ahead until this moment.
Not only years and years, the budget, hiring undergrads or whatever, just doesn't cut it.
The budget was not going to cut it.
My tenure was on the line.
It was a dicey few moments.
So to fast forward to the end, you create ImageNet,
and you can feed in a picture of a boy petting an elephant,
and the computer knows that's a boy and that's an elephant.
Might be a different size than the other elephant I saw,
but I know that's an elephant.
And this is huge.
This earns you the title of godmother of AI.
I know, you don't have to comment.
I know you don't want that.
And I wanna fast forward.
Now you've accomplished this incredible thing.
You teach at Princeton for a while, as you say,
and then you take up a teaching position at Stanford
where you still currently are.
You become one of these people
that undergrads would then study about,
which is fascinating.
And you go to work for Google during a sabbatical for like a year and a half.
And there's a moment where part of your job is to go meet with the new recruits that are
going to start their employment at Google.
Is it fair to say this is one of your, I don't want to call it a crisis of conscience because
that would be too strong, but how would you say it?
You have an opportunity to talk to those people and it sounds to me like you went rogue a little bit.
Yes, I did go rogue a little bit.
Yeah.
So it's very important to call out the year.
My sabbatical at Google was 2017 and 2018.
That was my first sabbatical.
I finally had a sabbatical.
And it was a conscious decision for me to go to Google,
because this is right after AlphaGo.
So AI was
having its first hype wave, at least its first public moment, and Silicon Valley of course was ahead of the
curve and knew AI was coming. So I had multiple choices, but I really wanted to go to a place
for two reasons. One is to learn the most advanced industry AI, and Google was by far the best.
But also to go to a place where I can see
how AI will impact the world.
And Google Cloud was a perfect place
because cloud business is serving all businesses.
So at Cloud, being the chief scientist,
I was able to see the technology translate into
product and product impacting healthcare, hospitals, financial services, insurance companies,
oil and gas companies, entertainment, agriculture, governments and all that.
But in the meantime, it was confirming my hypothesis that this technology has come of
age and will impact everyone.
It was the first techlash.
2017 was right after Cambridge Analytica.
Let's remind people, so Cambridge Analytica figured out how to maximize Facebook politically
and people were very upset by that.
Social media's algorithmic impact can drive societal changes.
It was also around the time face recognition bias was being publicized
for the good reasons of calling out bias.
It was also around the time that self-driving car accidents start to happen.
So before that, tech was a darling.
The media doesn't report tech as a force of badness.
But I do want to point out, because I heard you point it out,
which is in the early advancements,
they had all these peaks and valleys AI.
And there was a moment in the 70s where it looked promising,
and immediately people went to robots
who were going to take over the world.
So we also do have this immediate sense.
We do jump to that.
They jumped to it in the 70s.
It's worth pointing out.
That's true.
Hollywood is always ahead of the curve on that.
We sell fear and excitement.
So it was a techlash that came at us very fast.
Google has had its own share.
I was actually also witnessing the struggle that
Google was coming to terms with defense. Yeah, they had taken a contract to develop some drone
face recognition stuff, and the people at Google were told that they were only working on non-profit
stuff. There was a bit of a revolt. You were there during all that. Yes. In hindsight, it was a mixture of many things.
It wasn't a single event.
I remember it was summer of 2018,
and we were just coming off this turmoil.
In hindsight, they're small, but at that point,
and I was just like, I'm about to speak to,
maybe my memory is wrong, but I thought it was
several hundred interns from worldwide who worked at Google that summer,
and they're the brightest from the whole world
and they were hand selected by Google.
You know, Google is really a machine of talent.
And what do they want to hear from me?
Of course, I can talk about come work at Google. That's my job as someone who is working at Google.
But I felt there was more I should share.
Really coming from the bottom of my heart at that point,
something that you will appreciate
is that the math behind technology is clean,
but the human impact is messy.
Technology is so much more multi-dimensional than equations.
Yeah, they're all benign.
It's how we implement all that.
They're neutral.
Neutral, there we go.
But once they start to interface with the world,
the impact is not necessarily neutral at all.
And there is so much humanness
in everything we do in technology.
And how do we connect that?
I decided to talk about that with the interns.
And is this the first time you articulate
that you want a human-centered development of AI?
Yeah, it was around that time, 2018 March,
I published the New York Times op-ed.
I laid out my vision for human-centered AI.
So let's parallel your speech to the interns and then also getting to go in front of Congress.
So what is your overarching sense of how we keep this technology going in a direction
that does serve humans?
My overarching thesis is that we must center the value of technologies,
development, deployment, and governance around people.
Any technology, AI or any other technology, should be human-centered.
As I always say, that there's no independent machine values.
Machine values are human values.
Or there's nothing artificial
about artificial intelligence.
So it's deeply human.
So what are the practical things we do?
What are the legislative things?
What does that mean?
How do we do that?
So human-centered AI should be a framework.
And that framework could be applied
in fundamental research and education.
That's what Stanford does,
or creating business and products,
that's Google and many other companies do,
or in the legislation and governance of AI,
which is what governments do.
So that framework can be translated into multiple ways.
Fundamentally, it is to put human dignity, human
well-being, and the values that a society cares about into both how
you create AI, or how you create AI products and services, or how
you govern AI. So, concrete examples. Let me start from the
very basic science upstream.
At Stanford, we created this Human-Centered AI Institute. We try to encourage cross-pollinating
interdisciplinary study and research and teaching about different aspects of AI, like AI for
drug discovery, AI for developmental studies, or AI for economics
and all that. But we also need to keep in mind we need to do this with the kind of norm that
reflect our values. So we have actually a review process of our grants. We call it ethics and
society review process, where even when researchers are proposing a research idea
to receive funding from HAI,
they have to go through a study or a review
about what is the social implication,
what is the ethical framework.
And are you bringing in philosophers
and anthropologists and psychologists?
This is the interdisciplinary aspect.
That's the very fundamental research example.
Now, translate to a company, when we think about an AI product,
let's say I would love for AI to detect skin conditions and diseases.
That's a great idea.
But starting from your data, where do you curate data?
How do you ensure data fairness?
So if I play out that experiment,
it's like, yes, I would love to take my phone,
scan my face, and know if I have a melanoma.
That all sounds great.
Where does the results of that get stored?
Does my insurance provider have access to that?
What all happens?
It's not just me that's going to find out
I have this melanoma.
Exactly.
What about the scan of the face?
And also the algorithm that detects melanoma,
what is it trained on?
Just white folks?
Right, exactly.
Narrow type of skin or all skins.
What's the impact of that algorithm?
Will it disproportionately help some group
and alienate another?
And do you have to pay?
Because if you pay, you'll probably get a certain group
more than you'll get another group.
Right, so all of these are messy human elements. And then you ask about legislation. Then we
come to government. Of course, there is always a tension: how much regulation, how
do you regulate? Is good policy only about regulating? For example, I firmly believe
we actually should have good policy to rejuvenate our AI ecosystem, to make our AI ecosystem really healthy.
For example, right now, the gravitational pull
is that all the resources, the data, the computation,
and the talents are all concentrated
in a small number of large companies.
It's all for commerce.
Universities can't really compete. No, not at all.
Meta, Google.
My own lab at Stanford has zero NVIDIA H100 chips.
There you go.
Yeah.
Like that's always been the good corrective mechanism
we've had societally is the world of academia.
And it competed pretty robustly with any private sector.
And it's not just competition. It's that the problems we work on are curiosity driven, and
sometimes they are really for the public good.
For example, my own lab was collaborating with hospitals to prevent seniors from falling.
That is not necessarily a commercially lucrative technology, but it's humanistically important.
The universities do all kinds of work like that.
Now our universities in the age of AI are so under-resourced
that we cannot do this kind of work.
I have been working really hard in the past five years
with HAI, with Washington, D.C., with Congresspeople,
senators, White House agencies, to try to
encourage the resourcing of AI through a national AI research cloud and data.
And then we have legislation and regulation.
How do you thoughtfully put guardrails so that individual lives and dignity and well-being
are protected,
but the ecosystem is not harmed?
So all of this I'm always on board with.
I love it.
I'm so grateful there's people like you
pushing us in that direction,
but we just had Yuval Harari on to talk about his take on it.
And what I ultimately get so discouraged and defeated by
is we're not doing this on an island.
We're doing this while many other countries do this simultaneously.
So how do you see us dealing with the competitive nature of these AI technologies emerging,
and us maybe proposing we're going to do it in this way,
but being realistic in saying,
well, Russia might not have those guidelines,
and China might not have those guidelines.
And if they have a product that people like,
we can't compete now with it.
So do you believe there could be cooperation?
We could outlaw faking humans.
Okay, so the US has outlawed faking humans.
No one else does.
And those fake humans are really convincing
and entertaining and all these things.
And then that industry takes off somewhere else.
Like, how do we do this in a world
that there are no barriers of this technology?
I was also chatting with Yuval.
Did he give the C-grade to humanity?
Did he say that?
No, no.
I didn't get the C- out of him.
He said that humanity has gotten a C-.
And I was like, Yuval, you know,
I'm a teacher and a mom.
When a kid comes home with C-,
you don't throw the kid out.
We help the kid to get better.
So first of all, you're right.
We're not living in a vacuum.
AI also is not living in a vacuum.
AI is just one technology that's among many.
So I absolutely do believe that there can be cooperation.
How exactly we cooperate, who we cooperate with, and what are the parameters
of cooperation is much, much more complicated. Look at humanity. We have gone through this
so many times. I mean, Yuval is right. We have many messy chapters, even nuclear technology,
but we have gotten to a state that there is a fine balance at this point of nuclear powers.
I'm not saying that's necessarily comparable.
I think it is.
And then I think what's really important, and I only know this because I'm on my second
Von Neumann book, but Von Neumann was employed in the wake of the Manhattan Project to deal
with how this proliferation was gonna work. And he was so analytical and so realistic
that he said, mutually assured annihilation is the solution.
He knew that was the only outcome.
It felt sociopathic to say it and to commit to it.
But he's like, look, I'm modeling this out.
This is the only way it works is mutually assured
annihilation, that's what we ended up with.
And so I'm having a little von Neumann-y feelings about like,
no, I think it's a race to who can win
until everything gets neutralized.
I don't know another comp other than the nuclear arms race.
Here's the difference between AI and nuclear technology
is AI is so much more an enabler of so many good things.
True.
So that's very different from nuclear.
Of course, nuclear can be an energy.
We're coming back around to it, yes.
Right, but AI can help discover drugs.
AI can help with breakthroughs in fusion.
AI can personalize education.
AI can help farmers.
AI can map out biodiversity.
So AI is much more like electricity
than it is like nuclear physics.
So that's the difference.
So from that point of view, the viewing angle of AI, at least I do not think it has to be only through the competitive lens,
because it should be also through the enabling lens, the enabling of so many good things
that can happen in our world.
And that's the challenge is how do we keep the dark use
of AI at bay?
How do we create that kind of balance somehow?
But in the meantime, encourage the development of AI
so that we can do so many things that's good for people.
So I accept that the nuclear analogy falls short and that there's so many benefits to this.
Totally agree. But I will say again, to parallel nuclear arms race in this moment in time,
I think it would be only the second time where international cooperation is at its peak,
where it's most needed. We have got to recognize this as a moment
where we have to be getting closer to all these places
and not further away.
Our competitors, our geopolitical adversaries,
that if ever there were a time where everyone stands to gain,
other than the nuclear arms race,
this is the time where it's like,
we gotta really figure out how to cooperate a bit
because everyone will experience the downside if we don't.
Yeah.
Climate too would be the other more recent thing.
There's a Paris Accord, and there are things
that globally people have come together on.
I agree with you, but I will just say
that climate to me is a little dicier
simply because you have all of these
burgeoning industrial economies
that we would be slapping rules on.
It's easy for us to adopt a lot of things
in a way that it's not for Sri Lanka.
It's not totally fair.
There actually should be areas of the world
where they are allowed to pollute more
as they pull themselves out.
I mean, I think that's part of it.
It's just an acknowledgement globally
that we're all gonna have to do something,
and especially the superpowers
do need to take more on than others.
But it's just getting on the same page that I think we've done okay at,
and at least there's some consensus there.
So there could be some consensus here, potentially.
Yeah, I just hope that we recognize this is a moment to be making friendships a lot better,
and not doubling down.
I do think we must always recognize cooperation is one of the solutions.
Do you get to the guardrail point in the conversation with the legislators?
Do you have certain guardrails that you believe should be...
Like I like Yuval's. Yuval said we shouldn't ever be able to fake humans.
And I also think there should be a disclaimer on all AI generated things
that you at least know it came from that source.
I do think we should pay a lot of attention to where the rubber meets the road.
Because AI can sound very fancy,
but at the end of the day, it impacts people.
So if you use AI through medicine,
then there is a regulatory framework.
For example, my mom again does imaging all the time,
because the doctors have to use MRI, ultrasound,
you name it, to monitor her.
Honestly, do I care if that MRI is fully automatic
or is it operated by humans or it's a mixture?
As a patient family, I probably care more about the outcome.
If the result of the MRI can
be so accurate.
78% with an AI versus a human doing it at 40? It's a no-brainer.
Exactly. But all I care are two things. One is it is the best result my mom can get. Second
is it's safe, right? I don't care if it's that kind of mixture. So that regulatory framework is there.
I'm not saying FDA is perfect,
but it is a functioning regulatory framework.
So if there's an AI product that goes into that MRI,
I would like it to be subject to the regulatory framework.
There we go, yeah, yeah.
Right, so that's where rubber meets the road.
The same as finance, environment, transportation,
you name it.
That's a very pragmatic approach.
It's also urgent, because we have AI products that are entering our consumer market. And
it takes away from, in my opinion, the science fiction rhetoric about existential crisis, machine overlords,
that can stay with Hollywood.
Yeah, yeah, yeah, yeah.
I believe the downstream application is where we should put our guardrail attention at.
Right.
I really want to encourage people, even if people have only a cursory or no interest
in AI, I really think your book is one of my favorites I've read.
It's just your personal story,
as reluctant as you are to embrace it or talk about it,
is a really special story.
Thank you.
I mean, what ground you've covered.
Do you give yourself any moments where you go,
God damn girl, we got here.
That's very sweet.
That's the problem of always chasing after a North Star.
I try to like look forward.
One thing I do reflect back is how grateful I am.
I'm not here by myself.
I'm here because of Bob Sabella, Jean Sabella,
the advisors, the students, the colleagues.
That I feel very, very lucky.
Yeah, there's a lot of sweet people in the world still.
Yeah, it's good.
Yeah.
It's helpful.
Oh, well, Fei-Fei, this has been a delight.
I hope everyone gets your book,
The Worlds I See, Curiosity, Exploration,
and Discovery at the Dawn of AI.
And boy, those lucky people that get to have you
as a teacher. Oh man, so jealous.
I also love the narrator of your book.
Have you listened to it on tape?
A little bit, I didn't finish the audio.
You didn't finish?
Right.
Yeah, it's hard to listen to your own stuff.
Well, it's not her.
I know, but your own stuff.
Yeah.
You spent so much time writing it.
Right, I'm like, do I have time?
I should finish my von Neumann book.
Yeah, yeah.
And you should read Maniac.
Yeah, you got a couple new books to read.
I'm so grateful you like the book.
Oh, I love it.
It's just really beautiful.
I love the narrator, but I was having the moment
where I was like, I was only introduced to you
through this book, I was completely ignorant about you.
And then there's a narrator.
When I was doing research on you, I'm like,
oh, we're gonna find out what the real voice is.
Oh, good.
I felt a little self-conscious because of my accent.
Oh, really?
Because I considered if I should narrate my own book,
but I feel like my accent is probably too strong for that.
That wouldn't be the reason I'd advise you not to do it.
I think it's way, way harder than people think,
and there's a lot more acting involved.
I've heard some writers narrate their own book.
You gotta be a performer.
Right.
Forget your accent.
There's like a performance to be done.
Right, and that's how many hundred pages?
You probably need to put your time elsewhere; you have a lot of other stuff.
Well, I hope you come back and see us again sometime, and I'll be following everything you do, and thank you for trying with
all your might to make this a human-centered development.
Thank you, it's so important.
And I do think creators and creators' voices
are so important because we started this conversation
with what's different from human intelligence, AI,
and that creativity, the insight, is a huge part of it.
And now that we have the generative AI
trying to create things, I think the collaboration
with humans is so important.
Yeah.
All right, well be well and thanks for coming.
Thank you.
Hi there, this is Hermium Permium.
If you like that, you're gonna love the fact check.
Miss Monica.
Hi, Moni.
Hi.
We had so much fun yesterday, didn't we?
We did.
I did.
I did.
We shot a commercial.
So much fun.
So much fun.
Yeah.
I had a really fun, full circle moment.
Okay, yes, please tell.
Because I got out of the car,
you know, I haven't acted in a while.
Sure, we're all rusty.
Yeah.
I got out of the car and I started recognizing
some of the people on set and I realized I had worked
with a lot of that crew on some previous commercials
in my day.
Sure, one of the many, many thousands
of commercials you had done.
It felt so nice and cool.
Like I had done these commercials as just this actor auditioning and doing this thing,
and now we're doing a commercial.
Where they asked you to be in it.
Yeah, and we're doing it together.
Yeah.
Not for this podcast, but.
But because of this podcast, yeah.
And it was something really cool about it.
I agree.
I liked it, and it,
I think it's because my ring is fixed.
I have some housekeeping.
Okay, great.
You know, I read the comments and so,
and this is so embarrassing.
And I read it a couple of times.
I'm like, these people are crazy.
That's, I didn't say, so people were like,
you said the wrong voice of Darth Vader
in the Morgan Freeman intro.
And I thought they were saying I had said
Morgan Freeman was the voice of Darth Vader,
and I'm like, I know I didn't say that,
because I know he's not.
And James Earl Jones was the voice of Darth Vader,
and I said Edward James Olmos.
So I did say it wrong, it was another three name actor
with an Edward in it.
I see, yeah, that's hard.
So I fucked that up, and my apologies.
Oh, and then the other thing was,
they had coitus interruptus,
because we were chatting,
and I was going to say I was going to give
a Danny Ricciardo update,
because I had ridden motorcycles with him.
But I guess then we got sidetracked, and I never did.
So all these people who are rightly concerned
about our sweetheart Danny Ricciardo, how's he doing,
were left hanging.
And I'm here to report that he's so happy.
Yeah, he's doing great.
He's so, so happy.
We were riding motorcycles all day long
and we chatted a bunch and he's just very happy.
I'm glad.
Yeah, he's just doing really, really good.
So people should rest assured that Danny Ric is thriving.
Yay.
Yay.
Love to hear that.
Yeah.
Do you want to tell people what Toto texted you?
Yeah.
It was so funny.
Ha ha ha ha ha.
I had texted him to say, hey, people really love the episode
and me in particular, I really loved it.
Thanks for doing it.
And he said, how were the numbers?
You know I am a lap time guy.
Oh, so playful.
Oh, God.
God.
I got to say, I want to say out loud,
that really put a lot of wind in my sails.
That made me so happy to have that episode come out.
It really right-sized my perspective.
As I vocalize on here, it's been a challenging transition.
I've been really stressed.
There's been bad news and challenges.
And this came out and I was like,
oh right, dumb ass, you get to meet people
that you are obsessed and in love with.
Holy lottery.
Yeah, I just was, I was beaming all day Wednesday from it.
Yeah, it was a great episode and so yeah, just so cool.
We get to talk to anyone we want to talk to.
Not anyone.
I still have a list.
We still got Tay.
Liquid Death.
I'm just pointing to objects.
Monkey with huge balls.
No, we already had Machine Gun Kelly.
We did, we did.
Okay, there's another fun update, but this, I'm getting worried
that people are gonna be afraid to text me.
I guess these people should know,
I run it through my analysis.
Okay.
And I would never say anything that was in a text
that I didn't think was just lovely.
You know what I'm saying?
I get worried about it, don't you?
Like, you know, someone's got a private exchange with me,
and then I'm reporting on it.
There's an ethical dilemma here.
Sure.
But sometimes they're so funny
and I think the person would like it anyways.
So I sent Pitt the clip of Toto talking about
me telling him that Pitt said he was a good dancer
and then Toto talking about him coming to dinner.
Yeah.
And then he's,
he said, I made up the thing about him being a good dancer.
And I said,
Oh no.
I said, I can't believe you made that up.
In fact, I don't believe you made that up.
I still believe he's a great dancer.
Yeah, me too.
But he did say, cause Toto was like,
when did he see me dance?
I know.
But then he just had to go,
well, I don't understand how that happened,
but I'm going to take that.
He was just-
He was being funny.
He was doing a yes and.
He was doing a bit.
He was like, you're not going to believe this.
He's also a phenomenal dancer,
but he's just with him.
I believed it.
Yeah.
I think the-
Who wouldn't believe it?
The crux of that story is I'm gullible.
I think he is a great dancer.
Can we talk about Chrisma a little bit?
Sure.
I got the fever as much as I've ever had it,
as hard as I've ever had it.
Let me tell you what's happening.
So, so far from our homework,
we watched Christmas Vacation already,
Home Alone 1 and 2, side note,
I've never heard Delta laugh harder in my life
than the 27 minute set piece in Home Alone 2 where he's hitting the guys with bricks.
Oh sure.
She was laughing uncontrollably for like 27 minutes.
She said at one point, it doesn't get old.
Like they threw a fifth brick or whatever.
And she's like, it doesn't get old.
And I got so much joy out of watching her have
that big of a laugh at something.
So cute.
Okay, so Home Alone 2.
We did Gremlins, another Christmas favorite for us.
Last night we did the Grinch Who Stole Christmas
original cartoon, and I want to go out and say,
for the record, it's the number one Christmas cartoon
to ever be made.
It is the most creative.
We all watched it, and at the end of it-
How many more Christmas cartoons are there?
There's a lot.
You've got Rudolph, you've got the Charlie Brown.
Oh yeah.
You've got the, there's a bunch.
Okay.
But I'm saying, maybe even Christmas anything.
It ends and I said, you know,
Dr. Seuss should really be regarded as like Salvador Dali.
He had such a unique imaginative world
he created in the words and the set pieces.
I mean, that's one of the most creative people to ever live.
Of course, I think he is given his due props.
Yeah. You know, there's a Seuss land.
There is?
Yeah, one of the parks.
There is?
I think.
Yeah, Seuss landing in Orlando, Florida.
In Orlando, Florida, I should go.
You should go, you should pay your respects.
I like when people use the term Seussian.
Did you ever hear anyone use that?
No, but I like it.
Yeah, it's cool, right?
Yeah.
Like Newtonian, or like it's a paradigm.
But it kind of sounds like Sisyphusian,
which is my favorite word.
It's not my favorite word anymore.
You taught me that word, and I thank you for that.
To remind people Sisyphus pushed the rock
up the hill every day.
There's a Buddhist take that like,
that's what people interpret that as a story of,
not wasted effort, but like, you know what I'm saying?
Yeah, yeah.
A fool's errand.
Yeah.
But there's a Buddhist way of looking at it,
which is like, this person had purpose
every single day, all day long,
and was not suffering probably.
Wow. It's a story of suffering.
It was a huge rock.
Well, first of all, he was probably jacked
to be able to do it. So strong, so strong.
But that's an interesting way to reframe it.
That like, no, this person,
every day of their life had purpose.
Yeah.
Probably very happy.
That's a lovely way to look at it.
Yeah.
It's actually Sisyphean.
Sisyphean?
Yeah.
I like Sisyphusian.
Me too, and I maintain it.
Let's keep it.
Yeah.
Okay, so you're in the Christmas spirit.
Yes, and I wake the girls up every morning.
I wake up about 20 minutes before the girls to meditate.
And so now they wake up to me playing from my phone
over the Sonos Christmas music.
Wow, that's nice.
And I want to make a great recommendation
to people who are using Spotify,
and you can make a station.
Go to the Charlie Brown Christmas album,
and then go specifically to the song,
Christmas Time is Here.
Make a station out of Christmas Time is Here,
and it's the best Christmas mix I've ever had.
Ooh, that sounds lovely.
And it's on all the time.
And so, you know, the tree is over-decorated.
Yeah.
You know, we get one tree,
and Kristin gets a tree in the kitchen.
Uh-huh.
And hers is artistic, and this year it's Wicked.
Oh, cute.
Yeah.
And our tree is a throw-up of color.
And I have those old-fashioned bulbs
that the water bubbles up in them.
They're almost impossible to get to sit vertical
on your tree.
I've spent most of my free time positioning all of them
and then I pull the cord and they all fall down.
It's a Sisyphusian task.
Wow, ding ding ding.
I didn't expect it to come around that quick.
I had all these things that I had about presents,
but I knocked a bunch of presents out the other day.
Nice, you used a little bit of my gift guide.
I used your gift guide almost exclusively.
There were good gifts on there.
Complain about your gift guide though.
You make things sell out.
Your gift guide is moving markets.
Yeah, well I pick great items.
Yeah, you do.
I have to say.
You have exquisite taste.
Thank you.
Some of your recommendations were so good that I found myself dancing around on the website
Yes, that's the goal. Yeah. Yep.
There's fun stuff.
So, your tree has colored lights, right?
So many yeah, I have four strands
really long strands,
and four of those bubbly light strands.
And the tree's touching the ceiling.
It's a Clark Griswold, it's too big.
And I had to cut a foot off.
I just wanna talk about lights.
Oh, okay.
Okay.
My apologies, Miss Monica.
Miss Monica, I'm sorry, I get so carried away sometimes
when the spirit moves me. I don't leave my apartment much, so I really enjoy decorating it. Get all those
colors. Makes me optimistic.
I wonder how Hermium, does he have a delivery service? How does he get his tree?
I have a cousin who's not working at the moment, and he loves going to department stores and
plazas and shopping malls and strip malls.
Wow.
And I'll call him on the landline.
That's what I have, Miss Monica.
I pick up the phone and I call, his name is Bert.
Oh.
Yeah, he's my, did I say my brother-in-law or my cousin?
You said cousin.
Yeah, he's my cousin, I just remembered.
Weirdly enough, he's also my brother-in-law,
but it's my stepsister.
Okay, so it's all on the up and up.
Everything's above board, as they say.
I call up Bert and I say, here's what I need, Bert.
Six water weenies, 10 spatulas, and Bert,
it takes him a while, sometimes four or five days,
and then he comes over, and he does charge me a little more,
but that's okay.
Sure, well he's doing a lot of work.
And then I have to call him back up
and ask him to deliver the presents. Wow.
That's okay though.
He charges me for that too.
Okay.
Getting a little taken advantage of, but you seem fine with it.
Okay, great.
Mom?
Remember, I'm not your mom.
Okay, Miss Monica.
Mom Monica.
Mrs. Mom. Color lights.
Yes, the lights, because Kristen, I assume,
on her nice tree has white lights.
Yep.
Yep.
Yeah.
And this is a big thing.
I don't know if it's a, Rob?
Yes.
What color lights do you have?
Well, first of all, do you have the lights you want?
Yes.
Okay.
I do.
I like the yellowy.
White light.
Kind of warm gold, yeah, white lights.
That's white.
I mean, there's shades of white too.
Like we argue about that.
He's trying to walk in the middle and be nice,
but really he has white lights and he likes them.
I have white lights, but they're kind of yellowy.
Yeah, I know what you mean.
There's like a warm and a cool.
Now listen, sometimes you complain
about there being two boys, one girl in this situation,
but you have to admit, Rob is a perfect middle girl.
Like if Aaron was here, it would suck.
Well, yeah.
He disproves my gender stereotypes quite a bit.
Yeah, yes, because Aaron grew up exactly like you,
so it doesn't fit, you just assume it's men
because you and Aaron believe it.
That's right, Monica.
That's right, Miss Monica.
So yes, Rob did not grow up with you and like you.
Oh, he's from the big windy.
So I don't think it's gender,
but I do think some people love
the nostalgic colored lights,
and then other people who care about aesthetics
love the white light.
I could really get on my high horse about it.
I used to have a really strong stance on it,
and it's all my class warfare stuff.
Yeah, which is-
It's so tired.
Is that what you're gonna say?
No, I wasn't gonna say that. Your life does not match that mentality.
But did you see Chris Rock's latest stand-up?
He said, I am rich, but I identify as poor.
Yeah, that's fine.
Yeah, okay.
Yeah, you, you're of the highest class
in this country.
Yeah, well there's a lot of people
with a lot more money than me, but I do have.
You're of the highest class.
And you also hobnob with the highest class.
Yeah, but you know what, I act like myself.
And I have color, here's what I'll say.
The white, all white Christmas tree
is like occasionally I'd see that at people's houses
who had an extra living room that no one went into
and you weren't allowed to go in there,
take off all your shoes,
you know, you'd get in a fucking Intel outfit
to go in the room.
And all of it seems stuffy
and not playful and fun and colorful.
This felt very presentational.
And where's your tree?
But I used to be judgmental of that.
I still don't like it, but I'm not as judgmental.
Cause your second tree is in your second living room.
Okay?
Okay.
You know, I gotta keep you, I gotta just remind you.
I know, I'm spoiled.
I know I'm spoiled.
Yeah, okay.
I'm really spoiled.
It's just to me the class warfare thing.
I would hope you now see.
That it wouldn't be fair for a stranger to hate me
just because I have money?
Yeah.
Yeah.
Yeah, I would feel that way on the other side of it,
but I wouldn't expect anyone to feel that way,
not be on the other side of it, because I get it.
Okay, so I have white lights, obviously.
Yeah, I know that.
I would know that.
Yeah, everyone would know that.
You don't need to tell me that, I know that.
And I'm not judgmental of you.
I'm so glad you're having the Christmas
you've always wanted.
Thank you.
Yeah.
Yeah, Jess and I had pick day and we went to Home Depot,
we just missed you, I guess,
because it seems like the timing.
Yeah, because we were there at like 11 a.m. on a Saturday
and you were there at 11 a.m.
But I got to say, this is my record of all time.
I was so fast and there was no fighting.
This is like first year in a few.
That day is very triggering for our family.
I think it's hard for families that have to decide.
You gotta compromise.
And everyone has their things they care about,
and luckily Jess and I have the same thing.
We don't like bald...
call it bald puss.
You call the tree a bald puss?
Bald pussy.
Bald pussy.
If there's bald spots.
Okay, great.
And we don't like that.
Okay.
And...
You like more of a Brazilian tree?
No, Brazilian is...
That's shaped and full.
Brazilian?
Yeah.
Isn't a Brazilian like you have a landing strip?
I thought Brazilian is...
Clean?
Clean.
Rob?
Do you want me to Google?
Yes, I do.
Just definition of Brazilian wax.
And pictures?
And pictures.
You can do that on your own time.
It removes most or all of the hair from the pubic region,
including the front, sides, back,
and often the area around the anus.
Yeah. Okay.
I'm glad I...
What's the landing strip?
The landing strip's just the landing strip?
Yeah, there's like, you can just get different kinds,
but Brazilian generally means all hair.
Do you think any dudes get a landing strip?
I was just thinking I wanna go do that just as a bit.
I've done that as a joke.
You have?
For Natalie.
Oh!
Oh my God.
That's so funny.
Did she laugh?
Did it make her horny?
No, it was not meant to be.
That's really funny.
["Ring the Bells"]
Stay tuned for more Armchair Expert, if you dare.
Okay, let's take a break from the fact check
to thank our presenting sponsor, Amazon Prime.
Prime has you covered with movies, music,
and everything you could possibly need
to make the holidays perfect.
Whatever you're into, it's on prime.
This is very exciting.
It's holiday party season.
Yes it is, that time of year.
Work parties, family parties, parties with friends.
Party parties.
Parties with your animals.
If you're as popular as Monica,
you're hitting the party circuit.
It's a great reason to shop for new clothes or accessories
and really like spice up your wardrobe, make it fancy.
Prime can help with that,
especially if you decide last minute
you wanna buy something new.
You're set with Prime's fast free shipping.
And hey, what you're buying for holiday parties
depends on whether you're a guest or a host.
If you're hosting, then you're going deep on prime
to find everything you need to make your home feel
fun and festive and perfectly like you.
Oh, tell me about it.
I really like to make my house feel
very me during the holidays.
You could be decorating the outside of the house,
getting some lights, something for the windows,
grab some new holiday towels,
some festive hand soap, oh, I love a good festive hand soap,
candles, you really, you can do it all.
Absolutely, and you can get all those things on Prime.
Oh, and one other thing, Amazon Music is here to help
with the playlist, curating the party playlist,
it's an art.
Amazon Music will get the vibe right.
Listen, what we're saying is anything you need
for a holiday party is on Prime.
Nice sweaters, goofy sweaters for the ugly sweater party,
holiday decor, gifts for the host,
or fun small stuff for a gift exchange at work.
The sky is the limit when Prime's fast free shipping
is at your fingertips.
From streaming to shopping, it's on Prime.
Visit Amazon.com slash Prime to get more out of whatever you're into.
Now we were also so quick, so quick.
In fact, it was almost eerie.
We walked in and we were doing just like a quick look
and Jess just beelined, he knew his daughter.
Like Christmas vacation, there was a beam of light
shining down on it.
Yes, and he knew his daughter and it was the one.
Are you his daughter?
Because I think of you more as his mom.
No, the tree is our daughter.
Okay, okay, that makes perfect sense.
We have, we co-parent.
Okay.
But she lives at my house.
Yeah, so, I gotcha.
So he's a little bit of a da-bi-da, but whatever.
And she's really pretty, she's so nice.
She's, we said, cause last year, our tree was a boy,
and he was a model.
Oh.
Like, he was gorgeous.
Striking?
He was striking and very similar, like perfect.
Angular.
Exactly, very angular.
Not round features.
This girl is, she's not a model, but she's a star.
Oh yeah, that's the kind I like.
Exactly.
And I've been trying some different hats on her, toppers.
Oh, okay.
Hats.
I haven't decided yet.
Is there no part of you that feels sad?
Like what I really, the softest spot in my heart I have
is for Charlie Brown's Christmas
when they get that really bad tree.
Charlie Brown did a bad job and they hated it.
They're yelling at Charlie because of the tree,
but then they decide to love it and it's a good little tree.
It's a sweet story.
And I always am drawn to the shitty tree there
because I think no one wants this tree
and we'd have a great Christmas with this tree.
I have a real, I get emotional about it.
Wow.
Yeah, I want to like rescue this shitty tree.
Oh my God, the way you feel about the trees
is like how Kristen feels about the dogs.
That's right, that's right.
And all because of Charlie Brown, I think.
Wow.
So yes, so the girls have one agenda,
which is to never like the same tree,
I think is their agenda.
And then mom has an agenda.
Mom's very aesthetic.
You know, it's very important to her.
So for her, trees are not dogs.
She wants a pretty one.
Yeah.
She's got something in her mind she's looking for.
My singular goal is when you pull into Home Depot,
you can either park and then go by the tree
and then enter the line to pull up
where they'll put the tree in.
And I want to just pull into the line
and know that they can get that tree fast enough
that by the time it inches up to the front, we'll have gotten a tree.
So my only objective is to get the trees in time
by the time I'm pulling the truck.
Because Kristen stays in the car?
No, in previous years, they go in and I wait in the car.
This year, I went too.
So what'd you do with the truck?
I just parked it and I'm like, I'm gonna run in,
I'm gonna see if I find a tree.
It's not gonna move up that fast,
they gotta load a tree.
I didn't hold anyone up.
Okay.
And then we got the trees by the time I pulled up,
so that was my goal.
Mine's way less aesthetic and way more time management.
Yeah, I don't feel bad for the...
You don't?
I don't.
How could you not?
Well...
A tree that no one wants, Monica?
It's already dead.
Dax, it's already dead.
It got chopped down.
I always get a tree that has a little bit of personality, and by personality I mean
missing parts.
Bald puss.
Miss Monica, I don't know what you just said, but please don't say it again.
You wanna talk about sad that I'm here to buy a big old Christmas tree.
Tell me about your tree.
Does it have a Brazilian?
Ew!
What?
Stop!
What did you put on your tree?
Make him go away.
Make him go away.
I can only take, I can't, I kind of forgot he-
Sounds though like you guys are very Frito-esque
when you're shopping for this tree.
You're just not doing the voice, talking about bald,
I don't even want to say it either.
My God, and you're saying it's your daughter?
This is twisted.
Certainly don't want Jess talking about his daughter
in that fashion.
No!
Last update, it was time for a crop, a harvest.
Everyone already knows that.
I feel like people are gonna have a bunch of judgment
about this, I guess fuck them.
Delta's like, I wanna shave my legs.
Will you shave my legs?
Yeah.
I'm like, okay, people are gonna be like,
you shouldn't shave your kid's legs.
I can already feel that coming.
But I don't give a fuck, she wants me to shave her legs.
Yeah, why not?
If she feels left out, I did it.
Okay.
Monica, her leg hair is also cashmere.
It is.
So we now have two fields in rotation.
And so I want you to see what an enormous-
Are you combining?
Yes.
It's now father, daughter, cashmere.
And I want you to, you remember how much we had just-
Yeah, practically none just two days ago.
Yeah, practically none.
Wow!
Look at the amount of cashmere we now have.
Wow, it's like quadrupled in size.
I was making a joke that we might get a mitten
or a scarf in 10 years,
but I actually think that's a real possibility now.
Look at the amount in there now.
This is really, you know, I-
Do you wanna feel her?
I do, I wanna touch it, but also, last time we touched it.
Some of it disappeared.
That's okay. Now we got two growers.
Wow! There is so much.
Yeah.
Now you have two growers.
We got basically a mink farm.
Are they separated?
There's no real, yeah, I think it's separated on.
Wow. It's so soft.
Yeah, I think hers might even be softer than my back hair.
Oh my God.
But that's got a time limit.
Her leg hair will turn into shitty hair like our leg hair.
Exactly.
But currently she is growing cashmere.
Oh my God.
You think I need to get a work permit for her?
Because she is now kind of actively.
You're probably illegal.
It's like, yeah, it's illegal.
Ah, yeah.
I don't want to out her
because she did such a great job.
Lincoln shaved my back, did a great job.
Yeah.
But she thought she had some cashmere on the razor
and she emptied a little bit into our pouch
and then I discovered, no, some of that was beard hair.
So I had to actually go in and pull.
Now I'm getting embarrassed.
It sounds like a bit, but then you realize,
no, it's not a bit, that happened.
Did you use tweezers?
No, I just, I could feel and I'd pull that out.
I probably lost a lot of really good product.
That's okay.
We live and learn.
This is an R and D situation.
This is only the second harvest, so.
Wow.
Still learning a lot.
It's so exciting.
All right, oh, one more thing.
One cool thing that happened
that I want to put out there in the world,
because I think it's good for me to manifest this.
When Callie and I were shopping,
we went into one store
and I bought some cute little boxer shorts.
Okay. As we were leaving, Callie was in front of me
and someone had held the door open for her to come out.
And like some woman walked in and then Callie walked out
and then this person continued to hold the door for me
and I was like, oh, thank you.
Then I kept walking.
I don't know, he's a mystery man.
Oh, oh, oh, okay.
And he-
It looked like the look on your face
was that it was a famous person.
It was the most-
Gorgeous?
Gorgeous person I've ever seen.
Really?
And not-
Male or female?
Male.
Give me age, height, describe.
He's kind of, now he's like,
but now he's sort of a haze.
Oh.
Like I don't, I don't remember.
I don't like that part.
I know, I know, but part of it was,
it happened so fast, he took my breath away.
Yeah.
And I think it read, you know, it read.
Okay, your face betrayed you.
Yeah, and he smiled,
and I don't remember if he showed teeth or not.
Flirting.
But no, he just like, that's who he is.
Okay.
And I turned, you know, I turned and I said,
oh my God, that guy was so hot.
And she said, I know.
Callie was fucked up too?
Yes, so this is an undeniable situation.
You should have gone back inside to talk to the third woman
who entered.
Well, I think they were together.
Ew.
Well, I don't know.
There's no way to know.
So this is a lost persons report.
Exactly.
If you open a door for Callie and Monica
at the farmer's market.
Brentwood Country Mart.
Brentwood Country Mart.
Yeah.
On Black Friday.
On Black Friday.
Probably around noon.
Okay.
Yeah.
Contact, I guess, comment on this.
I'll read it.
I'll read them all.
Comment or.
Don't, no catfishes.
Exactly.
I guess you'll be able to see the photo though
and you'll know.
Oh, I'll know.
And no one could fake it.
No, because you know, when I walk through the world, I'm extremely unobservant.
I don't notice people. You're blind, basically.
I really am. Yeah.
And speaking of blind,
I got some soap in my eye this morning and it was-
Blinding?
I thought I did some permanent damage.
Of course.
Okay, so anyway, I walk around so unobservant
and yet this person was strong.
Penetrated.
He pulled me out.
It was shocking.
He's like a lifeline.
He was so attractive.
How many more times did you think about him later?
That day, a lot of times.
A ton.
Did you like whip up fantasies?
I know you're prone to fantasies.
I am prone to fantasies.
I didn't actually, I was more just like,
Thunderstruck.
I was just taken.
Love at first sight.
A little bit.
Oh my God, and I don't even believe in that, but like, maybe. Anyway, that was a big mystery.
Yeah.
Wow
And how often have you thought of him since then?
Daily, or once every few?
It's starting to dissipate, and I don't remember him.
I sure hope he reaches out in the comments. Me too.
Also no bullshit, no cat fishing.
Yeah guys, seriously.
Stop cat fishing everybody.
Seriously.
Okay, anyway, so that,
we'll add that to the mystery pile
with the guy I met in New York, the restaurant guy.
Oh right.
That mystery is also.
Catfish is so delicious.
You ever eat a big catfish sandwich, Monica?
I don't give him permission to say my name.
What do you want him to call you?
I don't want him.
Don't leave it, don't let him decide.
Okay, this is for Fei-Fei Li.
Oh, in a Ding Ding Ding,
we just interviewed someone who knows her intimately.
Yeah.
Not intimately, I mean, sexually.
The colleague, we just interviewed a colleague.
And he was giving her a lot of props and reference
that she really deserves.
I loved her so much.
I loved her so much too.
She was a delight.
Now some facts for her.
How long did Einstein live at Princeton?
He lived in Princeton, New Jersey for 22 years
from 1933 until his death in 1955.
He purchased a house at 112 Mercer Street,
which became his home until his death.
The house was for him, his wife Elsa, stepdaughter Margot,
and secretary Helen Dukas.
His secretary lived with him.
I guess so.
Interesting.
I bet it's more like an assistant.
Nowadays we call it an assistant.
You're probably right.
Yeah, I guess secretaries were just assistants.
Okay, oh, we talk about Cambridge Analytica,
which was the whole thing that happened with Facebook.
I encourage people to listen to Acquired,
the podcast Acquired.
They do an episode on Meta, fantastic episode,
and they talk about what happens
with the Facebook Cambridge Analytica scandal,
and a lot of it's very misunderstood.
A lot of what the public thinks,
we're all missing a ton of information.
It's kind of like the Martha Stewart thing.
We all think she traded her company.
Exactly, and that's not what it was.
And she didn't, nor did she even do any insider trading.
She still went to prison.
I know.
Yeah, but yeah, the nefarious activity
was on Cambridge Analytica, not Meta.
Right, but also they thought-
And they were just using existing tools
that anyone could have been using.
But they were using old information
from an old quiz or something that Facebook did a long time ago,
and that's what they used.
They weren't using current information.
And yeah, Facebook didn't sign off on it.
They didn't hand over this information.
Everyone should listen to Acquired, just period.
It's such a good podcast.
It is, it is.
I'm always shocked.
Yeah, if you like a deep dive, that's the show for you.
In the business world, like learning.
I mean, you listen, I mean, they're four hours long.
The Meta one's six.
Meta's six, yeah.
They spend a month researching a company
and then they just tell you everything about the business
and how it came to be and all of it.
And you do leave feeling like you took a course in business.
Oh, big time.
Yeah, yeah, yeah. I recommend Acquired.
And that's it.
That was it?
Yeah.
Okay. I just adore her. I wish her the best.
Me too.
I'm grateful for.
Me too.
Yeah.
That's the line we learned from the Lisa Kudrow fact check
that we say to people that I'm just grateful for you.
I'm grateful you exist.
Oh yeah, yeah, yeah, yeah.
Now we know.
And I'm grateful for her existence.
Yeah, me too.
Okay.
And Toto's a great dancer.
Don't listen to anybody else.
He's great.
Love you.
Love you.