The Ezra Klein Show - ‘We Have to Really Rethink the Purpose of Education’
Episode Date: May 13, 2025

I honestly don’t know how I should be educating my kids. A.I. has raised a lot of questions for schools. Teachers have had to adapt to the most ingenious cheating technology ever devised. But for me, the deeper question is: What should schools be teaching at all? A.I. is going to make the future look very different. How do you prepare kids for a world you can’t predict? And if we can offload more and more tasks to generative A.I., what’s left for the human mind to do?

Rebecca Winthrop is the director of the Center for Universal Education at the Brookings Institution. She is also an author, with Jenny Anderson, of “The Disengaged Teen: Helping Kids Learn Better, Feel Better, and Live Better.” We discuss how A.I. is transforming what it means to work and be educated, and how our use of A.I. could revive — or undermine — American schools.

Mentioned:
Brookings Global Task Force on AI Education
Winthrop’s World of Education

Book Recommendations:
Democracy and Education by John Dewey
Unwired by Gaia Bernstein
Blueprint for Revolution by Srdja Popovic

Thoughts? Guest suggestions? Email us at ezrakleinshow@nytimes.com.

You can find the transcript and more episodes of “The Ezra Klein Show” at nytimes.com/ezra-klein-podcast. Book recommendations from all our guests are listed at https://www.nytimes.com/article/ezra-klein-show-book-recs.html

This episode of “The Ezra Klein Show” was produced by Annie Galvin. Fact-checking by Michelle Harris. Our senior engineer is Jeff Geld, with additional mixing by Aman Sahota. Our executive producer is Claire Gordon. The show’s production team also includes Marie Cascione, Rollin Hu, Elias Isquith, Marina King, Jan Kobal, Kristin Lin and Jack McCordick. Original music by Pat McCusker. Audience strategy by Kristina Samulewski and Shannon Busta. The director of New York Times Opinion Audio is Annie-Rose Strasser. Special thanks to Switch and Board Podcast Studio.
Transcript
I'm going to be doing a little bit of a Here's a statistic I've been thinking about recently.
So in 1976, if you ask high school seniors, have they read some books in last year for
fun?
Around 40% of them had read at least six books for fun in the last year for fun. Around 40% of them had read at least six books for fun
in the last year.
Only about 11% hadn't read a single book for fun.
Today, those numbers are basically reversed.
About 40% haven't read a single book for fun.
If you are looking for this,
you see it everywhere right now.
There are all these headlines about how kids are not reading the way they once did. There are all these stories quoting professors,
even at Ivy League universities, about the way in which when they try to assign the reading that
they've been assigning their entire careers, their students, they just can't do it anymore.
And so the professors are adjusting. They're changing the books, making them shorter,
making them simpler, making the reading just less burdensome.
We're losing something.
We can see it on test scores that over the last decade,
we just see the number of kids reading
at grade level slipping.
And then of course the pandemic accelerated that.
So if you were simply asking, how are the kids doing on some of these intellectual faculties
that we once thought were the core of what education was trying to promote?
They're not doing well.
And then as if we summoned it, as if we wrote it into the script, here comes this technology,
generative AI, that can do it for them.
That'll read the book and summarize it for you.
That'll write the essay for you.
That'll do the math problem, even show its work, for you.
We know GenAI is being used at mass scale
by students to cheat.
But its challenge is more fundamental than that.
Of course, using it that way, we call it cheating.
But to them, why wouldn't you?
If you have this technology,
they not only can but will be doing so much of this for you,
for us, for the economy.
Why are we doing any of this at all?
Why are we reading these books ourselves
when they can just be summarized for us?
Why are we doing this math ourselves
when a computer can just do it for us?
Why am I writing this essay myself
when I can get a first draft in a couple minutes
from Claude or from ChatGPT?
I have a three and a six year old.
And one of the ways that my uncertainty about our
AI-inflected future manifests is this deep uncertainty about how they should be educated.
What are they going to need to know?
I don't know what the economy, what society is going to want from them in 16 or 20 years.
And if I don't know what it's going to want from them,
what it's going to reward in them,
how do I know how they should be educated?
How do I know if the education I am creating for them
is doing a good job?
How do I know if I'm failing them?
How do you prepare for the unpredictable?
My guest today is Rebecca Winthrop, the director of the Center for Universal Education at the
Brookings Institution.
Her latest book co-authored with Jenny Anderson is The Disengaged Teen, Helping Kids Learn
Better, Feel Better, and Live Better.
So, here's my email: ezrakleinshow@nytimes.com.
Rebecca Winthrop.
Welcome to the show.
Lovely to be here, Ezra.
So I have a 3-year-old and a 6-year-old.
I feel like I cannot predict, with AI, what it is society will want or reward from them in 15 or 16 years. Which makes the question of how, in the interim, they should be educated,
what they should be educated towards,
feel really uncertain to me.
My confidence that the schools are set up now for the world they are going to graduate into
is very, very low. So you study education, you've been thinking a lot about education and AI. What
advice would you give me?
So approximately a third of kids are deeply engaged, so two thirds of the kids are not. We
need to have learning experiences that motivate
kids to dig in and engage and be excited to learn. So when friends or relatives ask me the same
question, I usually say, look, we have to think about three parts to the answer. Why do you want
your kids to be educated? What is the purpose of
education? Because actually now that we have AI that can write essays and pass
the bar exam and do AP exams just as well as or better than kids, we have to
really rethink the purpose of education. The second thing we have to think about is how kids learn,
and we know a lot about that. And the third thing is what they should learn.
Like, what's the content? What are the skills? People always think of education as sort of a
transactional transmission of knowledge, which is one important piece of it, but it is actually so much more than that.
Learning to live with other people, learning to know yourself, and developing the flexible
competencies to be able to navigate a world of uncertainty.
Those are kind of the whys for me. But you know, I might ask you: what are your hopes and dreams for your kids?
Under the why before we get to the details of the skills.
Well, I have a lot of hopes and dreams for my kids. I would like them to live happy, fulfilling lives.
I think I'm not naive.
And certainly in my lifetime, the implicit purpose of education,
the way we say to ourselves,
did this kid's education work out,
is: Did they get a good job?
Right.
That's really what we're pointing the arrow towards.
Right.
The fact that maybe they developed their faculties as a human being,
the fact that maybe they learned
things that were beautiful or fascinating, that's all great.
But if they do all that and they don't get a good job, then we failed them. And if they do none of
that, but they do get a good job, then we succeeded. So I think that's been the reality of education.
But I also think that reality relies a little bit on an economy in which we've asked
people to act very often as machines of a kind. And now we've created these machines that can act
or mimic as people of a kind. And so now the whole transaction is being thrown into some chaos.
The skills that I think are going to be most important are how motivated and engaged kids
are to be able to learn new things.
That is maybe one of the most important skills in a time of uncertainty.
That they are go-getters, they're going to be wayfinders, things are going to shift and
change and they're going to be able to navigate and constantly learn new things and be excited
to learn new things, because when kids are motivated, that's actually a huge predictor of how they do.
And we're going to want kids absolutely to know enough content so that they can be a
judge of what is real and what is fake.
But we're also going to want them to have experiences where they're learning and testing
how to come up with creative new solutions to things, which is not really what traditional
public education has been about.
I think sometimes about this distinction between education as a virtue and education as something
that is instrumental.
Education is training.
Right.
Studying the classics was important,
not because it made it likelier that you got into law school,
but because it had deepened your appreciation of beauty.
It deepened your capacities as a human being.
And I think for reasons that make a lot of sense,
in many ways, we drifted away from that.
And I don't know that you build a society off of people just enjoying what they're studying.
And at the same time, I worry now we have pulled people into a conveyor belt, that when
they get to the other side of it, there's not going to be that much there.
And I don't even think you need to imagine AI for that. That's already happening to a lot of people. I think one reason you see a lot of anger
among young people today is that the deal often doesn't come through.
You do all the extracurriculars, you get your good grades, you show up on time, and then you graduate
college and the good jobs and the interesting life you were promised just aren't there.
And so there's something there that feels like
it is getting thrown into question.
If we don't know what the future is going to ask of us,
how can we be instrumental in the way we train people for it?
We can't be super instrumental.
So we have to come up with a new plan.
We did not know collectively, us, the world, that we would have generative AI that could
basically write every seventh grade essay or college essay to get into university, or the whole host of exams that are being administered
and are being passed by AI just as well or better than kids.
So we have to come up with a new plan.
That is not the plan for success.
And I want to push back on something you said.
You said, I don't know if kids just enjoying what they're learning is going to help,
or if people are really going to benefit from that.
Engagement is very powerful.
It's basically how motivated you are
to really dig in and learn.
And it relates to what you do.
Do you show up?
Do you participate?
Do you do your homework?
It relates to how you feel.
Do you find school interesting? Is it exciting? Do you feel you belong at school? It relates to how you think. Are you cognitively engaged?
Are you looking at what you learn in one class, applying it to what it might mean in your life outside or other classes?
And it's also how proactive you are about your learning. And all those dimensions
really work together. In education, that's a very powerful construct to predict better achievement,
better grades, better mental health, more enrollment in college, better understanding
of content, and lots of other benefits to boot. And we need to have kids build that muscle of doing hard things because I worry greatly
that AI will basically make a frictionless world for young people.
It's great for me.
I'm loving generative AI, but I have said several decades of brain development where
I know how to do hard things.
But kids are developing their brains.
They're literally being neurobiologically wired
for how to attend, how to focus, how to try,
how to connect ideas, how to relate to other people.
And all of those are not easy things.
You have in your book these four modes of engagement.
Do you want to talk through them?
Absolutely.
So we found after three years of research
that kids engage in four different ways.
They're passenger mode, kids are coasting,
achiever mode, they're trying to get perfect outcomes,
resistor mode, they're avoiding and disrupting,
and explorer mode is when they really love what they're
learning and they dig in and they're super proactive. So
that's the high level framework. What part do you want to dig in
on?
Well, why don't you go through them? I think passenger mode is
particularly interesting here. So why don't we start there?
So passenger mode is difficult to spot often
for parents and sometimes teachers
because many kids in passenger mode get really good grades
but are just bored to tears.
They show up to school, they do the homework,
they have dropped out of learning.
So passenger mode is when kids are really coasting,
doing the bare minimum.
Some signs of this are your kid comes home
and they do their homework as fast as possible.
Another sign is that they say,
oh, school's boring, it's just boring.
I learned nothing.
Kids are in passenger mode
because school is actually too easy for them.
We talked to so many kids who said,
look, I'm in class and the teacher is going over
the math homework from yesterday and I got everyone right.
And I know the answers and it's 45 minutes of that.
And I understand the kids who don't get it,
they need the help,
but I'm going to shop online. Or I have kids who say, well, I got the homework home and
I know how to do this stuff, so I just put it in ChatGPT and it did my problem set for me
and then I turn it in. So that's when it's too easy.
Another version of why kids get into passenger mode is when it's too hard, school's
too hard. You could have a neurodivergent kid, kids don't feel they belong and so they're
not tuning in. They've missed certain pieces of skill sets that they really need. Knowledge
and education is cumulative in many ways, and they get kind of overwhelmed, and they need particular special attention.
So that's kind of what's going on in passenger mode.
One reason I wanted to start in passenger mode is that when I think about ways AI probably
is now, but can be very harmful, it's the connection with that mode.
Because in passenger mode, what you want to do, and many of us have done passenger mode
at work and many of us have done it at school, in some ways passenger mode was what I aspired
to be at school.
I just wasn't able to achieve it.
But you're reading something you think is boring.
You're reading something you don't want to be reading.
But you want to get a good grade.
So maybe at an earlier point, you would buy the SparkNotes.
Right.
But now you just have ChatGPT summarize it.
And more than that, you can have ChatGPT write the essay.
Kids are getting better at telling ChatGPT,
no, you actually wrote too good of an essay.
Like dumb it down a little bit.
You've basically hired your own, like, fill-in student who can help you coast.
And that will help you get, if you're able to do it adroitly enough, decent grades. But also,
whatever meta skills, forget the knowledge, whatever meta skills are being taught, how to read a book,
how to write an essay, you're not actually learning them.
And that's, I think, when people think educationally about AI, a bit of the fear and something that I believe everybody believes is happening now. So how do you think about that
interaction? I think you're 100% right. I've talked to kids all over the country, and I've seen lots of incidents or cases of highly motivated, highly engaged kids who are using AI really well. They'll write the paper themselves,
they'll go in and use AI for research and help them copy edit. They're doing the thinking.
They've lined up the evidence to create a thesis and they've presented it
in logical order on their own. And that is the art of thinking. And that's why we assign
seventh graders to write essays or 10th graders to write essays. It's not that they're going
to create incredible works of art. It's to train them how to think logically and how
to think in steps. And that is a core component of critical thinking.
So as long as kids are mastering that and the AI is helping, that's a good use.
But a lot of kids are using it to do exactly like you said, shortcut the assignments.
So an example, one kid I talked to said, well, you know, this is a high school kid.
For my essay, I break the prompt into three parts.
I run it through three different generative AI models.
I put it together.
I run it through three anti-plagiarism checkers, and then I turn it in.
Another kid said, I run it through ChatGPT, and then I run it through an AI humanizer,
which goes in and puts typos in and makes it, you know.
These kids are getting good at something.
I'm not sure that's what we want them getting good at,
but they're getting good at something.
Kids will find a way, no matter what.
Kids will find a way.
We cannot outmaneuver them with technology.
So the first response when gen AI came in was ban it,
block it, get anti-plagiarism checkers in,
which are bad, by the way.
Like I talked to one kid who showed me he had this essay and the plagiarism checker flagged
40% of it and he changed two words and then it went away. So we cannot out-technologize ourselves.
So what we need to do is shift what we're doing in our teaching and learning experiences.
I have very personally complicated feelings on the question of AI and education, and just the
question of education generally.
I hated school, hated it, did terribly in it, starting in middle school, going through
high school, failed classes, just found the whole thing impenetrable.
And not because I wasn't smart,
not because I wasn't interested even in things related to it,
just somehow the whole construct didn't work for me.
And I couldn't make it work for me.
It wasn't exactly that I was bored.
I think today I probably could have muscled through it,
but for whatever reason then I couldn't.
But I was voracious outside of school.
I spent three or four nights a week at Barnes & Noble.
I loved reading deeply into things that I was interested in.
And I've related the story before.
And one of the sort of reactions I get is,
well, you should really then recognize
the way school fails kids.
And in a way I do, but it's just not obvious to me at all
that school should be tuned for me.
Like one thing that I recognize,
as somebody who studies bureaucracies,
is that if you just think of US public education,
to say nothing of also private education,
to say nothing of global education,
it's educating a lot of kids.
And its ability to tune itself to every kid is going to be pretty modest.
And what kids need is different, but somehow you have to be orienting towards something
that works for most of them, even if you're not sure how to make it work for all of them.
I'm curious how you think about that.
I am not sure I agree.
I agree with several things.
One, you are not alone.
There are many, many kids currently today
are going through the system and feel like you.
Two, I agree with you that as sort of a bureaucratic system,
it is actually quite miraculous if you think about it.
In every community across our country, kids as young as 3 to 18 at the same time of day
are getting themselves to a place Monday through Friday for a certain amount of days in the
year.
I mean, that is an organizational feat. And the thing I don't agree with is that once you're there, you just have to design for
the mean and the average.
I think there's lots of examples that are relatively big scale, or at least not just
one little school in a corner by one fabulous homespun teacher that do things differently.
And I think it actually just gets down to how we
orchestrate teaching and learning experiences.
Give me one of those examples.
One of those examples of a schooling system
able to educate in a personalized way at scale
that seems to you to be replicable?
I'll give you a couple.
So there's an example of schools in North Dakota
that have created studios for their adolescents.
And what are studios?
They are self-created classes that a student can
design and they have to tell you or tell the teacher what standards
they're meeting. I'll give you an example. We have a great character in the book I've done with
Jenny Anderson, The Disengaged Teen, named Kia. And she was totally disengaged, doomscrolling
in middle school, and then these studios showed up. She got super into it because she was learning history
and science and she decided to design an escape room.
And she had to list out for herself,
these are the standards I'm meeting
for whatever grade she was in, 10th grade I think,
history and science, and she did an escape room
around the assassination of Abraham Lincoln and
John F. Kennedy. But she had to design this escape room. That turned her on like nobody else,
and she got super excited, and she did several of those. And then she actually said she was so
motivated, she went back to sort of normal classes. They're doing that across the district. That's one
small example. There's other examples of schools that, since we're talking about AI, do sort of tech-based education on core subjects for a couple hours
a day, math, science, reading, social studies, and then for the rest of the day, they are
doing projects together on whatever it may be that they so decide. And there's a curriculum,
there's things the teachers want them to learn. It's not a every kid do whatever you want,
but that's super motivating.
There's no reason that we couldn't do that.
With the existing staff and people
and school buildings and infrastructure,
we just have to have the willpower
to decide to do things differently.
I wanna zoom in on something in that story,
which is that when the student you brought
up found the thing that lit her up, she was then able to do better in all the other classes,
even ones that maybe didn't light her up the same way.
This was a little bit of my own experience of life.
For me, it was political blogging of all things, which I found as a freshman in college.
And once I activated, then I became much better at doing things that I didn't want to do or didn't exactly see the point of in even unrelated fields.
I love that.
What's an example?
So you started political blogging and then what happened?
I think what would have been the conventional line on me from the adults who knew me
was: smart kid, can't get it together. Right. Just can't seem to get the homework in.
Can't seem to do things he's not that interested in doing. And can't even seem to do things he is
interested in doing in a way that fits what we want from him. I read every book in English class
and I enjoyed doing the essays. And I'm a good writer. I think I'm willing to say that at this point in my life. And I still did badly
on the essays because it wasn't what they wanted for me in some way or another. And
over time, I just don't have that. I mean, that was the broad experience of my life
that I couldn't fit what I did
to what the world wanted for me.
And now I'm just much better at doing that
in ways that are not related to my course of interest.
I'm not trying to over extrapolate my experience.
It's actually important to me
not to over extrapolate my experience,
but something I've seen you talk about
is this quality of when students find the teacher,
find the subject, find the approach that activates them, that all of a sudden the things that
are not that activating to them become easier, that there is a sort of lock and a key dynamic
to learning. And this is something we talk about around finding your spark.
Kids need to find their spark.
And they may have many sparks and their sparks may change.
But when kids find their spark, for Kia, it was this idea of doing an escape room around
historical presidential assassinations.
For other students, they find sparks in other places.
One of the characters in our book, Samir, absolutely loved local politics and dove in,
getting himself on the school board, ultimately in high school.
Another student, Mateo, was super excited and turned on by robotics, and that's what
really turned him
around.
And when you're motivated, this internal drive, it makes you engage more, you lean in more,
you enjoy it more.
There's a virtuous upward cycle, and there's lots of evidence to show that it often spills
over.
So Kia talks about doing these studios for a couple years, which really helped her
re-engage and care about school. And then she went back and did some high school college
credit courses, which were very traditional structure. And she said she didn't love the
structure, but she had enough motivation to figure out how to bend the class to her interests.
So that's the best case scenario.
It doesn't always spill over automatically.
What you talked about when you said you enjoyed it,
you loved it, you loved English,
but you didn't give the teachers what they want.
It's probably because you're a total explorer
and we do not reward engaging in school
in a way that supports explorers in general.
Some schools do.
And that is what we have to change.
So then this gets to the AI optimist case.
And I take the AI optimist case as something like this.
It's pretty hard to do personalized learning,
even if you have examples that you've seen work.
Because you have one teacher,
it's a classroom of 20, 30 kids oftentimes. But AI makes this
completely different. AI gives you more tutors than there are children. It allows you to have
tutors who adapt to that kid's individual learning style in any way you want it to,
in any way they want it to. If this kid is a visual learner, it can do visual learning.
If pop quizzes are helpful for them, they can do pop quizzes.
It can turn it into a podcast they listen to, if you are more audio focused.
Everything can be turned into a poem, if you absorb information better through the sonnet form.
The idea is that as we get better at this, and as we build these systems and tune them better, although they're
already pretty capable here, our ability to personalize education using artificial
intelligence as tutors will be like nothing ever seen before in human history.
It's a complete quantum leap in educational possibility.
And as such, it allows you to bring every child into their educational utopia, whatever that
is, to spark them, to turn them on, to make them into an explorer.
How do you feel about that more utopic vision?
I think we're on the same page.
Schools exist.
They're important.
They're important for many reasons.
We need to change what we do inside of them, particularly because of gen AI, and we need to do it quickly. In addition to, I would say, you know,
regulating gen AI so it isn't so massively in students' and young people's
hands without being designed for that purpose. I would say those are the two
big things we need to do. But I don't think our goal inside schools, when we're
educating young people, is to have a 100% personalized learning journey
for every kid.
What I think you're talking about is actually
the ability for gen AI to help teachers,
which I think is very real.
I think there's a big difference
and we need to make a big distinction
between AI supporting educators
and doing what they do versus going direct
to young people.
Well, let me push you on this for a second before you go here, because if I'm taking
the position of the AI optimist, what I'd say is, no, I'm not saying that.
I'm saying the AI will be better than the teachers.
Better at what?
If we are saying that AI is going to be better than the median for many people
at many kinds of work, why would we not assume that this system
we will be able to build in six years,
given how fast these things are developing,
won't per kid be better than the teacher?
I'm not saying I believe this, but I
want to make you argue with the AI optimist case.
But the question is better at what? So teachers do many, many things. Kids learn in relationships with other humans.
We've evolved to do that.
I do not think that we will go away from that or we may go away and then we'll be like,
oh my god, that was a huge mistake and ten years later go back.
So there's a question around skill development
and knowledge transmission.
That is one thing a teacher does.
And I think that's what you're talking about.
That is an area where I think technology can be good,
can be really good.
And actually we see it even without generative AI.
There's adaptive learning software, you know,
that helps kids really learn to read,
which is incredibly helpful, especially if you have access gaps. You don't have good teachers,
you have large classes, you have substitute teachers that aren't trained on how to teach
kids to read. So that complemented with things that motivate kids, get them excited, and see
the relevance of what they're doing, which is often in person, could be a great thing to do inside the classroom.
We see private schools doing that.
There's a group of schools that I have not visited, and I don't know up close, but Alpha
schools are doing this.
They do, and they've been doing it for 10 years, actually, pre-gen AI.
They do a couple hours of sort of adaptive learning on key academic subjects,
and then the rest of the time kids are working together to build bridges or learn about financial
literacy or play sports or identify a passion that they want to go learn about in their community.
It's together, it's alone. What we don't want to do is bring AI in and have every kid sitting in front of a AI tutor alone at their desk for eight
hours a day, that's not the future that is going to help our kids.
I guess another way you might think about it is that this changes the job of the teacher
quite substantially.
Absolutely.
So, and I will say, I think I don't believe what I'm about to say.
So I don't want to get yelled at by everybody for every take.
Oh, you're not talking about me.
I'm not talking to you. I'm talking to my beloved audience.
My beloved audience.
Fair enough.
But one thing I've observed is that it seems to me that where AI is going to push
is towards the skills of the manager, the editor,
the supervisor, the fact checker in a way,
and often away from the skills which are right now
needed in more numerous quantities:
those of the worker, of the writer,
of, in this case, maybe the teacher.
So if you think about that world that you were just
describing as one we don't want a second ago,
where you have 25 kids in a class,
they're all staring at a screen.
They're all working with an individualized AI tutor.
Right.
You could imagine a world,
if you think about every one of those screens
as a junior teacher, as an individual tutor,
that there's some master teacher in the room
who the kids can go talk
to, who can like be pulled in to sort of oversee the
learning, to reshape what's happening. There is
testing. There are things that are trying to help us
evaluate how the kids are doing. But the teacher who's
already managing a classroom of students is now also
in a way managing a classroom of helpers, of tutors. I think that would be the kind of vision
you would hear from the more AI-pilled among us. Right. The role of the teacher in traditional
public schools is damn near impossible, honestly. They have to master a certain subject. They have to get kids
to grade level. And usually we have a wide difference of grade levels in school between
three and four different grade levels. So they've got to differentiate and figure out who needs what,
the bored kid who's the passenger, the struggling kid who's also the passenger, both of them silent
and quiet and you don't even know. And they've got to manage classroom dynamics. Like kids have to not hit each other
or disrupt each other or ruin the furniture. And they have to increasingly be social workers.
Kids are not doing well, lots of mental health problems, they've got to spot that, they've
got to help it. They also have to be relationship managers, they've got to work with parents, etc.
So it's very hard for one teacher to do this all.
Absolutely, I think the wave of the future is a different model where you have multiple
people and one of those could be an AI tutor, helping support our kids' growth and development. The interaction with AI can help with skill development,
knowledge acquisition, but that is one slice
of what happens in a classroom,
and it is one slice of what it really means
for kids to be educated.
Kids are learning all sorts of things in a classroom.
They're learning how to self-regulate emotions in a group. They're
learning how to understand different perspectives from kids who are different from themselves.
They're learning how to ask for help when they need it. There's a whole bunch of things
that kids are learning that is much more person-to-person that we want to maintain, I would argue. Here's where I actually am.
I think we've just been going through a catastrophic experiment
with screens and children.
And right now, I think we are starting to figure out that this was a bad idea.
And schools are banning phones.
My sense is that they are not relying very much on
laptops and iPads. There was a big vogue for a while of every kid getting their own laptop
or tablet. I think that's beginning to go away if I'm reading the tea leaves of this
right. And so I feel a bit better about that as a parent of young kids. I really feel badly for the parents whose kids have been navigating this over the past
10 or 15 years, let's call it.
And right now I see AI coming and I don't think we understand it at all.
I don't think we understand how to teach with it.
I don't think the studies we're doing right now are good studies yet.
There are too many other effects we're not going to be measuring.
I think there's the sort of narrow thing that a program does and then what it does for a kid to
be staring at a screen all the time in a deeper way. I believe human beings are embodied.
And if you made me choose between sending my kids to a school that has no screens at all
and one that is trying the latest in AI technology, I would send them to school with no screens at all in a second.
But we're going to be working through this somehow.
And what scares me, putting aside what world my kids graduate into, is them moving into
schools at the exact time that they don't know what the hell to do with this technology.
And they're about to try a lot of things that don't work and probably try it badly.
And I wonder, as somebody who's tracked this, what you think the lessons of what I consider,
at least, the screens and phones debacle of the 2010s or the 2000s have been.
I agree with you 100%.
It was a massive, uncontrolled experiment, and our kids were the guinea pigs. We just
had a wait and see approach. We cannot take a wait and see approach again. And I think
that there's lots of lessons. I would say, first off, do not use generative AI unless
you really know what you're using it for. There is a real sense of FOMO among educators,
parents, young people even, that there's this thing happening out there and I should use
it because it's the newest thing. I saw that with groups who were working on student wellbeing.
And they had done teacher training around wellbeing curriculum for teachers and they
said, oh, we need to train parents how to do it. So their idea was let's use
gen AI. It'll be great because parents also do need to reinforce wellbeing
messages that teachers are giving in school, which is true. And what we'll do
is we'll create an app. And so this is what they had suggested. Ezra, imagine
you sitting down around the dinner table, you pull up your phone and you have an app, and your kids have their phones, and you say, okay, how are you
feeling today? And you're looking at your phone and they're telling you how they feel.
And then you click through and ask, you know, why are you feeling that way? Like mediated
through a phone. It's crazy. It's crazy. Like we've lost our mind. Like that we need AI
to talk to our kids. So if there's
not a real problem you're trying to solve, don't use it. That's number one. Number two, and
I really do believe this, any company that wants to work with kids in schools should
be a benefit corporation. Because legally, you have a lot of companies who are creating
perhaps really good stuff if used
well, but they have to maximize profits.
They can't maximize social benefit and well-being.
One thing that worries me is the way in which this might widen, and maybe already has widened,
the inequality between parents who can pay for private schools and parents who can't.
And what I mean by that is that private schools can just adapt more quickly.
They don't have to go through legislatures and deal with boards,
and they're just a little bit more independent.
They can take the screens out, they can put them in, they can limit what comes in.
Whereas the public school systems tend to be somewhat more slow moving.
I just knew living out in the Bay Area, a lot of tech people who were paying money
to send their kids to private schools that had banned the products they made starting many years ago.
And the rest of everybody was sending them to public schools that had not done that.
And when things are very, very fast moving,
being able to be fast moving is really important.
So somebody who cares a lot about public education,
what should the orientation of the public schools be?
How do they sort of not seem to parents
who think there's something that their kids should be getting out of this?
Don't their kids need to know how to use AI?
So they're going to need to attract parents on that level, but also how do
they not end up flat-footed if this is turning out to be a disaster?
This is a really tricky question.
And you point to something that is a real issue, which is around the deep
equity issues that have already emerged.
So think about the schools that ban AI
for a kid who has no access to AI at home
versus a kid who goes home and has full access
to all the AI tools.
That right there is a huge cleavage in our country.
It also, there's a huge equity gap in terms of language.
Large language models work off of language that is written down.
There's a lot of languages that aren't written down that much.
They have very little written down.
And so there you're seeing a global gap across the globe
between sort of African and indigenous languages
or in communities versus English speaking
or other large languages.
So equity is a huge one.
Your question about sort of public versus private,
I would say to public education systems,
do not have FOMO because
that is what the gut instinct is when a new technology comes.
I'm missing out.
I have a fear of missing out and I need to adopt it and I see this.
So don't have FOMO.
Don't use it unless it's a real problem you want to solve.
Do give it to the adults in the school building.
Give it to teachers. Have them use it and
figure out how it will help them today. Then give it to sort of
really novel school leaders to think about how they could maybe
restructure the teaching and learning experiences. What are
the things that AI can do? There's so much that AI could
actually do to help make public schools work better.
Bus schedules, calendaring, school meals, cafeteria, I mean, assessment input. There's
so much time that could be freed up.
Let me make the FOMO argument, or the argument that will be used to give people
FOMO.
The argument goes something like this.
If AI is a very potent technology that's going to be integrated into virtually everything
in the future, not literally everything, but quite a lot, then not just your literacy,
but your competency in it becomes paramount.
You're not going to be replaced by an AI.
You're going to be replaced by a person who knows how to use AI.
And so what you need to learn is to use the AI.
You need to learn how to manage it, how to prompt it, a sense of what it can and can't
do.
And there's no way to do that other than relentless familiarity and experimentation and exposure.
And so a kid who goes to some Luddite school, where when they're young the toys are made out
of wood and when they're older the books are all printed on paper and there's not a
gen AI in sight, is going to lose out.
And it will be like having not taught them mathematics.
Right.
Or having not taught them how to drive, or something like that,
or how to type.
Right.
How do you take that argument?
I think it is 50% right.
And I think the 50% depends on the age of the child.
I absolutely 100% think you should send your kids
to the Waldorf School with the wood blocks
when they're young. You know, we know that in early childhood, the more screen time they have,
the less language acquisition they have. We know that when infants are learning language,
they learn a lot of language from human to human contact. And if you put the same sentences on a screen,
they don't learn it.
Our neurobiology is not going to change in five years.
So we have to work with that; that's the only confine
I think we really have to work with,
and everything else I think we can reimagine.
But it's true that when kids get older,
you do want to teach AI literacy.
When kids understand, and this is true for social media too, when kids sort of learn,
oh, these big companies are trying to addict me, they're doing it for free,
but what they get with my attention, and me staying on it longer, is how they make money.
You tell that to teenagers, actually, there's been great research on this,
and they get pissed off.
I think we need to do the same with AI literacy.
Like this is how it works.
It's not some magical thing.
It's not another human being.
So when kids get older, we need to teach them about that.
And then they need, when they get older,
they need to start playing with it,
playing with it, using it.
But my huge caveat is with AI that is designed for kids. Right now,
there is a spring fling race by the large AI labs to get students to sign up.
You've got chat GPT giving two months free of GPT plus, then you got XAI come in two months free for Super
Grok and then Google not to be outdone is like, well, you can get a year free and I'll
give you two terabytes of storage.
And these are largely for college students and Google just made Gemini available for
kids through Parents with Family Plan.
And they are racing to get allegiance of young kids.
This is terrible because those products are not designed for children and for learning.
I guess then there is, to go back to your equity point, the fact that it is the kids with the least access to all
kinds of enrichment materials, to tutors.
We know what rich kids in urban centers get.
And then what you're getting in parts of America that are rural and don't yet have broadband
or don't have wide access to broadband to say nothing of, you know,
a kid in Nigeria, in rural Nigeria,
that is where at least a well-structured
gen AI tutor might be able to make a difference really fast.
You've talked a bit about a study in Nigeria that I never quite know how to
how seriously to take these studies yet but but why don't you say what it what
it did and what I found. So I think that AI has real potential for very specific
use cases, particularly around access gaps. And in Nigeria, what was done was, after school twice a week, an AI tutor helped kids learn
English.
And it was for six weeks, which is not long, it was June, July, I think, and it was a randomized
controlled trial.
We're still waiting for all the evidence to come through, but 0.3 standard deviations,
which is pretty good, equivalent to maybe two years of average sort of English learning.
We see that difference with other technologies too. It doesn't have to be gen AI. It can be
rule-based AI. It could be predictive AI. We've seen sort of similar benefits, for example,
in Malawi teaching literacy and numeracy to kids with offline tablets where teachers have maybe 80 to 100
kids in a class and each kid is having sort of a personalized adaptive learning experience.
That is hugely beneficial as well.
So that's one use case.
Another use case that I think is really great is neurodivergent kids.
Super helpful.
There's all sorts of kids that have different learning differences, that struggle in school,
don't have access to the specialists that they need, that would benefit greatly from
being in a classroom where they could have a little assistant to help them navigate.
My youngest son has dyslexia, and the sort of read-and-write tools,
text to speech, speech to text, have been game changing for him.
There's also use cases here in the US.
You see AI being used and experimented around supporting
wellness advisors who kind of fill the gap for
school counselors in rural school districts, for example, where they don't have
school counselors. The wellness advisor is an actual person, but
AI is boosting that person's ability to have a helpful
conversation with a kid. And it's bringing, through tech,
a mental health resource into a community that didn't have one.
So there's lots of use cases actually, if done well, contained well, designed well,
and we humans have our hand on the steering wheel.
Ethan Mollick, who's an AI expert,
he's got this idea that has been influential for me
about the best available human.
Is AI better for you in a certain purpose,
not than the best human, but than the best human
available to you at a given moment?
Exactly.
So yes, having a professional excellent editor like my editor at the New York
Times would be better, but most people don't have that available. So AI is better
than the best available editor to them. There's a lot more demand for therapy
than there are therapists. So oftentimes AI is, you know, and practically where
it's going, even for me sometimes,
it's a better therapist than the best available therapist
I have available at a given moment.
It certainly seems plausibly true in education too.
There's all kinds of times when you are confused
by what you are reading, what you are learning,
and you're in a big class and it's embarrassing
to ask 55 questions, or there isn't even time
to ask 55 questions, and you don't want to seem stupid.
But if you could contain the system somehow,
and that seems more plausible here
where there's a fundamental prompt at the core of them,
then if we got that right,
in a lot of these use cases, it could be really powerful.
Absolutely. The key is what you said, contain the system.
We can't just bring commercial tech into
our schools and hope it will solve these problems.
It has to have guardrails.
We have to make sure that the data that it's being
trained on is legit and not
going to create harmful prompts for kids.
We've seen terrible things with commercial AI companions,
with young people developing relationships
and being really manipulated emotionally.
But you can put guardrails, it's totally possible.
It's just where, who, what the,
frankly it gets back to the incentives.
It gets back to the business model, which is where you, you know,
regulation and government could and should step in.
So yes, if contained is the question.
So then let me ask you about the other impulse somebody might have,
which is not that you're going to be replaced by somebody who knows how to use AI,
but that in a world where we have AIs,
the most important thing for human beings to be is as human as possible.
And that what we need to do is return to more classical education, reading the great books,
developing the attentional faculties that a lot of data and anecdotes suggest that even very elite students are losing,
to read a long book and think about it, to write a long essay, to be educated in the way that was considered high civilization education 70 years ago, and you might get it at a St. John's or a U of Chicago
or certain private schools today.
AI is gonna be everywhere.
Schools should be a place not where we learn
how to partner with machines,
because the rest of society
is gonna tell you how to do that.
Schools should be a place where we develop
specifically human faculties,
such that we
are capable and flexible and attentive in moving through a world that we just cannot
predict.
We 100% want kids to have the capacity for deep attention, and you're thinking about
your own kiddos who are young, and I'm thinking about my own teenagers who are 13 and 16. And I
see the undermining of attentive faculties from when my 16-year-old got his phone. For
a long time, he didn't want a phone, because I'd been droning on and on for years, because
he has me as a mother, about addiction and opportunity costs,
and just that it's okay to enjoy it a little bit,
but you can't sacrifice sleep and physical exercise and in-person communication.
Then he did get his phone, and he struggles with it, and he says,
“Mom, this is really hard.”
It's eroding his ability to do his homework or to follow something
he wants to do. The only thing that it doesn't seem to distract him from doing is playing
the piano because he loves playing the piano. So anything that we can do to actually ensure
young people are developing the muscle. And it's not just attention;
attention is the entry point.
That's the doorway that gets you through.
It's actually reflection and meaning-making,
which is what you get from deep reading
and reading full books,
which a lot of young people struggle to do today.
You also can get it from other means.
You could get it from long, Socratic
dialogues in community with diverse people over time. But it has to be an experience
where you reflect, you think about meaning, you think about different perspectives, and
it changes how you see the world. But what do you think about this idea
that school should be a rare screen-free oasis
in a child's life?
I've sometimes imagined a school
that I could send my kids to.
I'm not saying it exists just in my head.
Yes.
Where what they do is they go in
and somebody is watching them and helping them read books
and think through math and there's long periods and they have a certain amount of exploratory
capacity in that.
You can choose between different books.
But the idea that maybe one space in their life would just be a place that is trying to encourage in them that capacity for
meaning making, for deep attention, for deep contemplation. It seems to me to be more
valuable than it seems to be to other people to just have a teacher sit there and watch kids read
for an hour and a half at a time. And then there's a discussion.
Then to do a lot of what we do in school. This idea of schools is explicitly counter
to the trends of the moment
because they need to develop things
that the moment will not naturally develop.
How do you think about that?
I think that's right.
If I had to choose for my own kids, and I do,
we would have a school that has no phones for all the reasons we
know and Jonathan Haidt has done a great job on sort of catalyzing that movement here in
the US and bringing it from across the globe to our schools. We should have cell phone
bans in school, bell to bell. Don't have it at recess, because that's where you start interacting and playing with kids.
And I think we should make school a place where kids can actually interact with each
other, develop human-to-human socialization capacities because there is massive commercial
tech the minute they leave school that is vying for their attention.
And make sure to do some high quality AI literacy. AI literacy is way, way different than using AI
to learn. AI literacy is, what is this? How was it made? What are the risks? What are the benefits? And let's talk about how our ethics around this new tool
and how to incorporate it into our lives
with an adult instructor talking about how it works
and what it is.
That's AI literacy and that's important.
I hope you're right.
I've been in general very skeptical
of how much literacy will do.
But I guess this goes back to the point
you were making about- I mean, there is a question how much we will do,
but your question is, will it make a difference?
I'm as phone literate as I think you can almost be.
I've been writing about this for years.
Yeah.
I'm functionally extremist on this issue.
And still the only way for me to
modulate my own use to the point I would like is
to use a device that hobbles my phone, the Brick, every time I touch it to the RFID chip.
And if I don't do that, all the literacy in the world, I have known Jonathan Haidt for many,
many years.
He has been on this show.
I've read The Anxious Generation.
Yep.
Yes.
It doesn't do me that much good, because that's just not how the brain works.
Any more than knowing that I shouldn't eat so many Oreos keeps me from eating them if
they're on the table in front of me.
And I think you bring something up that's really important, which is these things need
to be regulated.
It's ridiculous that they're out there being used by kids.
And it's ridiculous to say, Ezra, it's your willpower that should be the deciding factor.
It's ridiculous for adults, it's ridiculous for kids. These are incredibly seductive technologies.
This is a really tough one for me, because you do want kids to be fluent in the new technology
of the time, and you do want them to have an ethics and awareness about it.
You don't want them to be seduced by it. The large AI labs are perfectly capable,
perfectly capable, if they wanted to, of creating a gen AI
product that is designed for kids that will not
be as seductive.
I was just thinking about that.
I think they are, but I also wouldn't overstate how well they even understand
what it is they are doing.
They don't fully understand the systems they're making now.
Relentlessly, the kids are more capable and ingenious than, you know, the eight
or 40 or a hundred developers on any given project.
When you're building something that has a small number of people, maybe hundreds of people, building
it, and then it's used by 40,000 kids, I think our experience is that they are clever in
ways, typically, that you are not.
I do think that over time we can create things that are curbed.
It's just that I'm not sure we even know exactly what we are
targeting.
What we are creating.
Well, I would say they have to change how they're developing
the products.
You can't create an AI that'll be great for kids and teachers
and teaching and learning without having teachers
and kids and education experts and child development experts
in the development process with you,
and so few are. So I think about what the Dutch government is doing. They're doing a partnership
with sort of the teacher unions and the academics and the tech companies, and they're having a
little lab to figure out, you know, what would AI look like in schools. But any of that sort of
bottom-up experimentation is the way to go before you roll it out.
Because most AI developers, although they might be good people, they're not child development
specialists, but if they change the way they develop their products, they could.
So then I want to go back to where we began, which is, you know, you've got young kids
now, they're going to be going into school in the age of gen AI.
How should you think about their schooling?
So we can't really predict the shape of society in 15 or 20 years.
I don't think that's a question we could answer on the show.
If we could, we should probably be investing, not podcasting.
But what we have in education now is constant markers that are supposed to tell us as parents how well our kids'
education is going.
And that's basically grades and maybe to some degree counselor reports.
And the idea is if they get good grades and they seem happy and well-adjusted, then at
the end of that process, they'll go to a good college or go to a trade school and get a
good job.
And it's going to be a pretty straight line.
All A's equal good job.
The future is foggier.
What they will need to know is maybe a little foggier.
What then should a parent be trying to watch in the meantime?
How do you think about whether or not your kid's education is going well if
you're a little suspicious that the grades designed
for and maybe even not that well designed for the society we have had are not going
to correlate all that well to the society we will have.
And I think as a parent, you yourself, but also other parents out there, are right to be suspicious, because I think that linear line is going to be much more complicated as the years go on with AI
in our world. So what I would think about is a couple of things. One, getting back to
the research I've done with my co-author and colleague Jenny Anderson. Grades don't show you how much kids are engaged.
I mean, schools are not designed to give kids agency.
Schools are designed to help kids comply.
And it's actually not really the fault of the teacher.
Teachers are squished from above with all sorts of standards and squished
from below with parents, you know, putting a lot of pressure on teachers about their
kids' performance and outcome. And what you really want is some feedback loops that are
beyond just grades and behavior to know, is my kid developing agency over their learning?
And what I mean by that is, are they able to reflect and think about things they're
learning in a way that they can identify what's interesting,
and they can have the skills to pursue new information.
That right there is,
I think, going to be the core skill.
It is the core skill for learning new things in
an uncertain world, which is, I think,
one of the number one things we think about.
In addition to that, I would say make sure kids are learning to interact with other human
beings.
Any school that has them working with peers, but even connecting with community members,
our social networks are getting smaller.
There's going to be a premium on human-to-human interaction as more and more skills get automated
and done by AI, which are the more knowledge cognitive tasks.
The sort of interpersonal caregiving, teaching skills are going to continue to be important
for some time.
I'm not sure for how long, but for some time.
And then the last thing, which may seem silly to you, but I increasingly keep thinking about,
is think about speaking, listening and speaking as the missing piece of literacy alongside
reading and writing.
We're going to need to show our merit and our credentials more and more through what the British call oracy
skills.
I think we've lost the art of listening and speaking.
I think that's a good place to end.
Thank you for speaking and listening with me.
Always a final question.
What are three books you'd recommend to the audience?
So, the first one is Democracy and Education by John Dewey,
which is over 100 years old.
And we are now seeing through lots of great neuroscience
that his observations around the teaching
and learning experience and what makes for a good teaching
and learning experience were right.
He has some great discussions around the importance
of reflection, not just ingesting knowledge, but reflecting on it, making meaning, figuring
out how to do things with it. And I love it because we didn't talk about this as much,
but the role of schools in our society is more than just your and my kids' education
and getting a job, even though that's what we care about most as a parent. They are about creating a democratic society or not. So that's an oldie but goodie. I love
it. John Dewey. The second book is by Gaia Bernstein. It's called Unwired, Gaining Control
Over Addictive Technologies. She's a law professor at Seton Hall University. I really enjoy this book because it gives a really good overview, particularly around
kids and young people, of the incentives that commercial tech has and what are some strategies
for resisting that and getting to a better place.
And the last one, it's called Blueprint for Revolution, how to use rice pudding, Lego
men, and other nonviolent techniques to galvanize communities, overthrow dictators, or simply
change the world by Serga Popovic, who was the student sort of leader, Serbian student
leader that started a movement to overthrow Slobodan Milošević and now is doing quite a bit of work on nonviolent
protest against authoritarianism. And to me, this book is sort of like the updated version
of nonviolent activism. He really gets media. He really gets social media. And I just think
it's incredibly relevant today.
Rebecca Winthrop, thank you very much.
Thank you.
This episode of “The Ezra Klein Show” is produced by Annie Galvin. Fact-checking by
Michelle Harris. Our senior engineer is Jeff Geld, with additional mixing by Aman Sahota.
Our executive producer is Claire Gordon. The show's production team also includes Marie
Cascione, Rollin Hu, Elias Isquith, Marina King, Jan Kobal, Kristin Lin and Jack McCordick.
We have original music by Pat McCusker, audience strategy by Kristina Samulewski and Shannon
Busta.
The director of New York Times Opinion Audio is Annie-Rose Strasser, and special thanks to
Switch and Board Podcast Studio.