Your Undivided Attention - The Man Who Predicted the Downfall of Thinking
Episode Date: March 6, 2025

Few thinkers were as prescient about the role technology would play in our society as the late, great Neil Postman. Forty years ago, Postman warned about all the ways modern communication technology was fragmenting our attention, overwhelming us into apathy, and creating a society obsessed with image and entertainment. He warned that "we are a people on the verge of amusing ourselves to death." Though he was writing mostly about TV, Postman's insights feel eerily prophetic in our age of smartphones, social media, and AI.

In this episode, Tristan explores Postman's thinking with Sean Illing, host of Vox's The Gray Area podcast, and Professor Lance Strate, Postman's former student. They unpack how our media environments fundamentally reshape how we think, relate, and participate in democracy, from the attention-fragmenting effects of social media to the looming transformations promised by AI. This conversation offers essential tools that can help us navigate these challenges while preserving what makes us human.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X: @HumaneTech_

RECOMMENDED MEDIA
"Amusing Ourselves to Death" by Neil Postman
"Technopoly" by Neil Postman
A lecture from Postman where he outlines his seven questions for any new technology
Sean's podcast "The Gray Area" from Vox
Sean's interview with Chris Hayes on "The Gray Area"
"Amazing Ourselves to Death" by Professor Strate
Further listening on Professor Strate's analysis of Postman
Further reading on mirror bacteria

RECOMMENDED YUA EPISODES
"A Turning Point in History": Yuval Noah Harari on AI's Cultural Takeover
This Moment in AI: How We Got Here and Where We're Going
Decoding Our DNA: How AI Supercharges Medical Breakthroughs and Biological Threats with Kevin Esvelt
Future-proofing Democracy in the Age of AI with Audrey Tang

CORRECTION: Each debate between Lincoln and Douglas was 3 hours, not 6, and they took place in 1858, not 1862.
Transcript
Hey everyone, it's Tristan, and welcome to Your Undivided Attention.
The late, great media theorist Neil Postman liked to quote Aldous Huxley, who once said
that people will come to adore the technologies that undo their capacity to think.
He was mostly talking about television. This was before the internet or personal computers
ended up in our homes or rewired our societies. But Postman could have just as easily
been talking about smartphones, social media, and AI.
And for all the ways television has transformed us, our politics,
our eating habits, our critical thinking skills,
it's nothing compared to the way that today's technologies are restructuring
what human relationships are, what communication is,
or how people know what they know.
As Postman pointed out many times,
it's hard to understand how the technology and media we use
is changing us when we're in the thick of it.
And so now, as the coming wave of AI is about to flood us
with new technologies and new media forms,
it's never been more important to have critical questions to ask
about technology's influence on our society.
And Postman had seven core questions that we can and should ask of any new technology.
And I'll let him tell you in his own words.
What is the problem to which a technology claims to be the solution?
Whose problem is it?
What new problems will be created because we have solved
an old one? Which people and institutions will be most harmed? What changes in language are being
promoted? What shifts in economic and political power are likely to result? And finally,
what alternative media might be made from a technology? Now, I think about these questions often,
and it may not surprise you to hear that today's episode is one I've been wanting to do for quite a long time,
since Neil Postman has by far been one of the most influential thinkers for my own views about technology.
His ideas have been so clear-eyed and prescient, starting in the 1980s,
about the role of technology in shaping society, that I wanted to dedicate a full hour to exploring them.
So today we invited two guests who've thought deeply about Neil's work.
Sean Illing is a former professor who now hosts The Gray Area podcast at Vox,
and has often written about and discussed Postman's relevance to our current cultural crisis.
We also have Lance Strate, a professor of communication
at Fordham University. He was actually a student of Postman's at NYU and spent his career
developing the field of media ecology that Postman helped create. Sean, Lance, thanks for coming
on Your Undivided Attention. Glad to be here. Thank you. So I'm just curious, you know, for me,
Neil Postman has been such a profound influence on our work. So in 2013, when I was kind of having
my own awakening at Google, that there was just something wrong in the tech industry. There was
something wrong about the way we were going to rewire the global flows of attention and something wrong
with the scrolling, doom-scrolling culture that I saw on the Google bus. And I, you know,
used to be someone who really deeply believed just in this kind of, you know, tech is only good.
We can only do good with it. It's the most powerful way to make positive change in the world.
And it was this friend of mine, Jonathan Harris, who is an artist in Brooklyn, who first introduced me
to Neil Postman's work and, you know, his books, Technopoly and Amusing Ourselves to Death,
and I just could not believe just how prescient and just precise he was in his analysis.
And I have been wanting to bring Neil Postman's, you know, just really critical insights
to our audience, who include a lot of technologists for such a long time.
So I'm just very grateful to have both of you on and hope we can have like a really rich
conversation.
So just to sort of open with that.
That's great.
I think I got Postman-pilled back in 2016 or 2017,
and it's, I mean, I came up as a political scientist, political theorist.
That was my education.
And we didn't really encounter any of this stuff.
Right.
But once I sort of internalized the media ecological way of seeing things,
it really kind of changed how I understood all of politics.
It's pretty profound.
What was your entree into Postman's work and what you see as kind of his critical insights?
In 2016, I was invited by a former classmate of mine to give a talk.
at Idaho State.
This is sort of right in the beginning
of the Trump era and all the chaos
involved with that.
And I gave my little talk,
and then I went for a hike with my buddy,
who's a media theorist, and we got to talking.
And at the end of that,
he sort of introduced me to Postman and media ecology.
And that was sort of the germ of the book
that we ended up writing together,
The Paradox of Democracy,
which came out in 2022.
But before that,
I'd never really encountered media ecology, Neil Postman, and for me, the value of these
great media ecologists is that you really force us to stop looking at media as just a tool
of human culture. And instead, to see it as much more as a driver of human culture, and this
changed the way I looked at the political world. I mean, what you discover when you look at the
history of democracy and media is that all of these revolutions in media technology, the printing press,
the telegraph, radio, film, TV, the internet, it's not so much that these technologies are bad, it's that
they unleash new rhetorical forms and new habits and new ways of thinking and relating to the world
and each other, and that's very disruptive to society and the established order. And we're sort of
living through that. I could go on, but I'll pause and let Lance speak.
Lance, how about you?
How did you first get into this work and starting with your being a student of Postman's?
Well, I mean, I could go back to the 70s as an undergraduate in a class on educational psychology.
Postman's first big book, Teaching as a Subversive Activity was on the reading list.
And that was when he was still following McLuhan with the argument that we need to adjust
ourselves to the new media environment.
Just a note for the audience,
Marshall McLuhan is another very influential media ecology thinker from Canada,
who famously coined the idea that the medium is the message.
And you'll hear his name throughout this conversation.
But I first read him, I guess, in '79, with Teaching as a Conserving Activity,
which was also when I first met him.
And that's where he did his about-face,
although maintaining the media ecology outlook,
but arguing that we needed to counter the biases of television
because we're inundated with it.
When Postman introduced the idea of media ecology,
he gave it a very simple definition:
it's the study of media as environments.
And once we understand that,
then it's no longer just a tool that we choose to use or not use
and we have complete control over,
but rather it's like the environment that surrounds us and influences us and changes us.
And when we look at democratic society and democratic politics, that was shaped,
modern democracy was shaped, by a typographic media environment, and television is reversing
so many of those characteristics, and there is a real question about what will survive of the various institutions
that grew up within the media environment formed by print culture.
So there's just already so much to dig into here.
So let's set the table a little bit for listeners.
Let's start by talking about Neil Postman's book,
Amusing Ourselves to Death,
which is really a critique of television
and how the medium of television and taking over society
and transitioning us from Lance what you were just talking about,
of a typographic culture to a television culture,
would completely shift,
and transform public discourse, you know, participation, democracy, education.
Does one of you want to take a stab at kind of the CliffsNotes version of Postman's argument
before we dive into specifics?
Well, I mean, it really is the shift from the typographic era to the television era,
and that that has undone a lot of key elements of American culture.
You know, as you may know, I did a book that followed up on Amusing Ourselves to Death
called Amazing Ourselves to Death.
And I don't think Postman quite made it overt in Amusing Ourselves to Death, but he has four
case studies, you know, and they're the news, politics, religion, and education, and how each
one has been transformed in a negative way by television.
And what I tried to explain is that what Postman hit upon there are the four legs that
the table of American culture stands on. Politics, democratic elections, obviously. Journalism,
the First Amendment and the way that makes democratic participation possible, absolutely. And,
often overlooked, but religion forms the kind of moral and ethical basis that our republic was
founded upon. And then education as the promise that people will be literate, like the
bottom line of education is reading, writing, and arithmetic that people will be literate enough
to be able to govern themselves, to get access to information and think rationally and make good
decisions.
Sean, do you want to add to that?
There's so much here.
I mean, when people talk about, you know, typographic culture versus televised culture,
let's just zoom into what do we really mean?
Because so much of Postman and Marshall McLuhan is essentially a kind of holding up a magnifying
glass to the invisible. When we say it structures, you know, the way that we think, like,
what do we actually mean by that? What's the phenomenology of reading text on a page that's so
different from watching this podcast in a video right now? Well, for me, I mean, the point in all
of this is to get us to really see how every medium of communication is acting on us by imposing
its own biases and logics. And they are different. You know, I mean, Postman talks about,
so, you know, you have the printed word. What is it to read a book?
What is the exercise of reading? It's deliberative, it's linear, it's rational, it's demanding. What is TV?
It's visual, it's discursive, it's entertaining. It's all about imagery and action and movement.
What is social media? It's reactionary, it's immediate, it's algorithmic. It kind of supercharges the
discontinuity of TV and monetizes attention in new and powerful ways, right? And once you
have this media ecology framework, you look at the eras of politics that coincided with these
communication technologies. You can see it in the language of politics. You can see it in the kinds of
people that win elections, how they win those elections, how they appeal to people. You can see it in
the movements and the dominant forces at the time. Like I was saying, it still blows my mind that
I made it through a graduate education in political theory and we never managed to read any media
ecology, because it really is, especially in a free and open society where people can speak
and think and persuade one another, it's a kind of master variable that's not often seen as that,
but it should be. So let's dive into that just for a second. So people think, okay, we live in a
democracy. You have candidates. Those candidates debate ideas. They have their platforms.
They talk about themselves. And then voters are sitting there and they kind of take in all those
arguments and they make a rational choice. And that's just democracy. And democracy is democracy.
It hasn't changed over the last 200 years. So let's just explain.
Sean, what you were just saying, maybe, Lance, you want to do this.
In what way does media define the winners and losers of our political world?
I mean, Postman gives so many examples.
Well, I think we have to start with the fact that democracy was founded on the idea that people have enough access to information to make decisions.
But it also presupposes that people will talk to one another and be able to speak in a
rational way. I mean, Postman's kind of wonderful illustration is of how people went to listen to
the Lincoln-Douglas debates for hours upon end. And you can imagine there was a carnival-like
atmosphere, but still that people were willing to sit and listen for, whatever, six hours of
debating going on, whereas today everything is reduced to these soundbites, you know, these 10-second
sound bites. And, you know, Postman points to two key technologies in the 19th century that start the ball
rolling away from typography and ultimately come together with television. One is the telegraph,
because just by speeding things up, we have no time to think and reflect, and that really is
harmful. Just the speed at which we're moving, right? Which we see today, where, you know, in this
moment we feel overwhelmed, and there's like a new story every few hours, some new thing happening,
and we don't know what to do. And the other thing is the image. The photography of the 19th century
becomes the dominant mode of communication. So between the two, it's all about appearance
and personality that's communicated over the televised image, and this rapid turnover that
favors celebrity and fame over substance. Yeah, can I just say something real quick? The telegraph
is such a good example, a practical example, of McLuhan's, you know, the medium is the message:
how the medium itself, the technology itself, doesn't just influence content,
it really dictates what it actually means. You know, I was going back and I was reading Thoreau,
actually, when I was researching my book.
And Thoreau was talking about the telegraph
as a kind of proto-social media
arguing that it was actually changing
what constituted information.
That with the telegraph,
it became a commodity to be sold and bought, right?
We get the birth of the penny presses
and tabloid journalism.
And for him, that was sort of the end of the idea
that information was something
that was definitionally important
or actionable.
It just became another source of entertainment.
It became a consumer product.
So much of our work in this podcast and at CHT,
obviously it's like there's this question of,
why does any of this matter?
Like, why are we here talking about this?
And it's because technology and media
are having a bigger and bigger influence
on constituting our culture.
People always say, you know,
if culture is upstream from politics,
then now technology is constituting
the culture that is upstream from politics.
I was just at Davos in Switzerland, and I would say the most popular question being asked,
and it was like right around the inauguration day, January 20th,
at basically all the dinners I was at, people said,
what do you think will matter more in the next few years?
The choices of political leaders or the choices of technology leaders and companies,
and especially when you tune into AI.
And so I just want to ground this for listeners of like, why are we even talking about this?
It's because technology is going to structure what a human
relationship is, what communication is, how people know what they know, the habits of mind.
So I just want to just make sure we're returning to kind of set the stakes of why this is so
important because so often I think the thing that's problematic for me about Postman is it just
feels so abstract. McLuhan, the medium is the message. It doesn't hit you about how significant
that idea is. So I just want to return, Lance, to the thing you were saying about the Lincoln-Douglas
debates in the 1800s. I think most people don't know; we kind of race past it. They debated for three
hours each, I believe: one guy took three hours, then the next guy took three hours,
and then there was like an hour rebuttal. Can you imagine seven hours of political debates
that are long-form speeches in front of live audiences? And just what a different notion of the
word democratic debate. So here we are, we're using this phrase democratic debate,
but the meaning of what constitutes those two words has completely shifted between the year
2025 and the year 1862.
And so let's just dive into, I think, another aspect of why this matters, which is the power
that media confers in the way it sets up what kinds of people win or lose.
Sean, you look like you're trying to jump in.
What's interesting is that, you know, for Postman, the TV era was all about entertainment,
right?
So like everything that unfolded on or through that medium had to be entertaining because
that's what the laws of TV demand.
But this era, where TV is still around, it still matters, but not nearly as much.
There's much more of a convergence with other mediums, like the Internet and social media,
which are now more dominant, really, culturally and politically.
And on these mediums, it's not about entertainment so much as attention.
The attention economy is mastered now, right?
So in the TV era, politicians really had to be attractive and likable.
They had to play well on TV.
Now they just have to know how to capture and hold attention,
which means leaning into spectacle and provocation and performative outrage or virtue, as the case may be.
They dictate a different kind of political skill set to win.
One of the reasons why both Postman and McLuhan are so prescient,
or at least why people think of them that way, you know,
given that what they were talking about was largely television
and yet it seems to apply so well to today,
and for many people, really, it seems to better fit today, is that their analysis was based
not just on the specific medium of television, but on the idea of electronic media generally.
But I think entertainment was Postman's way of getting at the larger point, which is that it's
trivial, it's not serious, and it's what catches our attention; it's a larger set of dynamics.
Entertainment was just a kind of way of pinpointing it, but it really is that non-serious trivialization, a
different kind of entertainment. So I just want to name a pushback that I got, I remember,
speaking to these arguments in the tech industry when I was at Google in 2013, which is some
people might say, well, why is that a problem if people like to amuse themselves? People like
amusement. Don't we all need some amusement in the world? What would you say
to that? Or what would Postman's argument be against that?
Well, Postman wasn't
against amusement.
He said television is great. The best
thing about TV is junk.
He loved TV, especially
sports. We
actually bonded together as Mets fans,
although his real love was the Brooklyn
Dodgers, but in their absence,
it was the Mets.
He also, and he loved
basketball and all of that.
I mean, sports is one of the great things
that television can provide.
It's awful for politics.
It's awful for religion.
And it really has degraded religious, you know, participation and presentation by putting it on television.
And also, you know, through social media and all of the other advances that we've seen.
And it's bad for education.
I would go back to what you were saying earlier about distraction,
which is a really important word.
I think that's more closely pegged to the role of technology here,
fragmenting our attention, pulling us around like greyhounds chasing around a slab of meat.
I mean, I was talking to Chris Hayes the other day,
who was on my show, and he has a new book out about attention
and the fragmentation of attention
and really sort of the death of mass culture in any meaningful sense, right?
And I was asking him, well, I mean,
isn't democracy on some level
a kind of mass culture?
And if we can't pay attention together,
if we can't focus on anything together,
then what the hell
does that make of our democratic
politics,
right? I mean,
that's what concerns me,
right? I mean, I remember,
you know, reading
McLuhan, who
would talk about
media and time, and he was so obsessed
with electric media because it
flattened time and it made everything instantaneous. And he would argue that this sort of
scrambled society's relationship to time. And, you know, radio and TV and now the internet
create this landscape where everything unfolds in real time. But, you know, in a
print-dominated culture, where you're consuming weekly or monthly magazines or quarterly journals or books,
that facilitates a kind of deliberation and reflection that you don't get when everything is so
immediate and frenzied. And in a democracy where the horizon of time is always the hell of the next
election, the next news cycle, that kind of discourse makes it very hard to step back and think
beyond the moment. It makes it very difficult to solve collective action problems, and all the
most important problems are collective action problems. Totally, yeah. I think, to sort of give my
interpretation of what you're both saying, it's that there isn't a problem with people
having amusement in their lives or having entertainment, it's about whether the media
systemically structures the form of all information in terms of its amusing capability or its
entertainment capability. And that that systemic effect makes us confused about whether we're actually
consuming information or getting educated versus we're really just being entertained. And he says,
you know, the basic quote: television is transforming our culture into one vast arena for show
business. And that was for the television era. When I think about the social media era, and I think
about Twitter or X, I think, you know, social media is transforming our culture into one vast
gladiator arena for basically drama and throwing insults and, you know, salacious tweets
back and forth.
Another sort of key concept that Postman is critical of is the information-action ratio. And I remember
this actually in the tech industry, that so many people believed, and I used to really believe, that so many
problems really had to do with people just not having access to the appropriate information,
which is all about information access. I mean, I had a tiny startup called Apture that was
talent-acquired by Google, that was all about giving people contextual access to more information. I remember it.
Do you remember that? Okay. Yeah. Yeah, it was good. Yeah, well, thank you. I mean,
it was motivated by, I think, the good faith version of this, which is that if people don't have,
imagine, you know, right when you're encountering something that you have no reason to be
interested in, the perfect most engaging professor, guide, lecturer, you know, museum curator
showed up and held your hand and suddenly just told you why this thing that you're looking at
is the most fascinating thing in the world. And that's what this little Apture thing was.
It was basically providing instant contextual, rich information that was supposed to entrance you
and deepen your curiosity and understanding about everything. And it was driven
by my belief, which is very common in the tech industry, that it's all about, you know,
driving so much more information access. And if we only just gave people more information,
then that would suddenly make us respond to climate change or respond to poverty or do something.
And so I'd love for you to articulate: what was Postman's kind of critique of information glut
and the information-action ratio he speaks of? Well, you know, I mean, what he would say is that
in the 19th century, not having enough information was a problem. But we solved it.
We solved it long ago, and that creates new problems because we just keep going and going and going.
I mean, I would say, you know, think about how most of human history not having enough food was a problem,
and today we are wrestling with issues of obesity because we solved that problem a long time ago.
We've got plenty of food, but we just keep going and going and going.
So, I mean, this was actually one of McLuhan's points, is that you push things far enough,
and you get the reverse.
You get it flipping into its opposite.
So information scarcity: by solving it, we create a new problem of information glut, and since most of it is information we're powerless to do anything about, it leaves us with irrelevant information, leaving us feeling impotent, powerless, which, again, I think a lot of people are feeling particularly right now.
Yeah, I always found with those kinds of arguments there's a tendency to conflate information and truth as though
they're the same, and they are not the same. I don't know how anybody can look at the world right now
and say that this superabundance of information has been a boon for truth. And to the point that
Lance is just making, it's this combination of being constantly bombarded with information,
some of it true, a lot of it bullshit, a lot of it terrible, being bombarded with that,
and also the simultaneous experience of complete impotence in the face of that. We've also engineered
an environment that elevates the lies, it elevates the falsehoods, it elevates the distractions,
it elevates the things that stimulate our more base primal impulses. And in the contest
between diversions, amusements, provocations, and dispassionate truth, I think we all know
who's going to win that fight 99 times out of 100.
And I would think it's really important to distinguish between information and knowledge.
And knowledge is something that we largely got from books.
And information is something that we're inundated with through the electronic media,
and it doesn't really have to be true or false.
And that's why, in a way, the distinction,
while valuable in some contexts,
between misinformation, disinformation,
and just information, is not that important,
because when we have information glut,
anything goes.
You can't tell what's what,
because it's not relating to anything out there.
I think it's a critical point that you're making
because even, let's say,
we solved the misinformation and disinformation problem. Boom, it's gone. It's all gone from all the
airwaves. You're still just bombarded by information glut, and information that doesn't give you agency
over the world that you're seeing. The companies profit from redefining and reorienting, or
restructuring, what agency means in terms of posting more content on social media. So I see
the social cause that's driving me to emotion, and then I hit reshare and think that I've
done my social action for the day. I think Malcolm Gladwell wrote about this like 10 years ago.
So the kind of failure of the tech solution is: I'm going to reshare this content.
What I'm really doing is actually driving up more things for people to look at and keep getting addicted to on social media.
So I'm perpetuating the money-printing machine that is the social media company.
I want to actually get us to AI because so much of this conversation was really motivated for me about how do we become a more technology critical culture, which I think is what Postman was all about.
It's like, what does it look like to have a culture that can adopt technology in conscious ways,
aware of the ways it might restructure community, habits of mind, habits of thought, education, childhood development,
and then consciously choose and steer or reshape that technology impact dynamically,
such that you get the results you would want by adopting that technology?
And in doing that, I think I want to turn at this point in the conversation to his other book, Technopoly,
which he wrote several years later,
whose subtitle is The Surrender of Culture to Technology.
And I think this is actually the heart of what I'm...
I mean, I think that Amusing Ourselves to Death is a very accessible thing for most people,
and the race to the bottom of the brain stem
and social media as an extension of TV.
I think Technopoly really gets to the heart of what does it mean
to have a society consciously adopt technology in ways
that it leads to the results that it wants.
And what does that relationship look like?
So how would we set the table of the argument
that Postman is making in Technopoly?
either of you. Yeah, I mean, this book was very interesting. In a lot of ways, his idea of
technopoly is really like a more accessible expression of Heidegger's critique of technology.
Technologies are things we use in the world to get things done or improve our experience in the
world. And then gradually as we move into the modern world, technology becomes almost a way
of being. As Postman says, we became compelled by the impulse to invent.
It's innovation for the sake of innovation.
It is a blind mania for progress disconnected from any fixed purpose or goal.
And that's sort of what Postman is calling technopoly,
where our whole relationship to the world is defined by and through technology.
Technology is this autonomous self-determinative force
that's both undirected and independent of human action.
And we're almost a tool of it rather than the other way around.
Here's Postman in his own words.
Well, in the culture we live in, technological innovation does not need to be justified, does not need to be explained.
It is an end in itself because most of us believe that technological innovation and human progress are exactly the same thing, which of course is not so.
Postman was talking about the personal computer as a quintessential technology of
technopoly. I mean, my God, what would he make of AI, which by any measure is and will be
far more immersive and totalizing than personal computers? I just want to briefly add the quote
that Postman cites from Thoreau, since we've mentioned it multiple times, that our inventions
are but an improved means to an unimproved end. I think this really speaks to what you're
speaking about, Sean, which is Postman's critique that we deify technology. We say that efficiency
and productivity and all the new capabilities,
whatever they are that technology brings,
are the same thing as progress,
that technology progress is human progress.
And it's never been more important to interrogate
the degree to which that's true and not true.
And this is not an anti-technology conversation,
but it's about how do we get critical about it?
Lance, you were going to jump into that?
Well, first I'd say that Postman would say
that Heidegger was a Nazi
and should not be mentioned anymore,
but that the big influences
on Technopoly were Lewis Mumford, who was one of the great intellectuals of the 20th century
and a key media ecology scholar, and then Jacques Ellul.
And it definitely is this argument that, particularly in America, it's not about the stuff.
It's not about the gadgets.
It's about a whole way of looking at the world.
And that efficiency becomes the only value that we make any
decisions on, which means that it's almost impossible to say no when somebody goes,
here's a more efficient way to do this. You can do it faster, do more with it, and we almost
never say no. And you must have seen this new thing about mirror genes or whatever. The, you know,
mirror bacteria. Yeah, well, they can create organisms with mirror-image DNA, which our immune systems would
have absolutely no defense against, and so we shouldn't do it. Well,
somebody's going to do it. I mean, you know that somebody is going to do it, because once we have that
capability, nobody puts a stop to it. You know, Postman did know about AI, because that's been
around for much longer than this sudden emphasis on it. And Joseph
Weizenbaum, who was somebody that Postman knew,
was one of these sort of pioneers in artificial intelligence.
He did the ELIZA program, and in his book, Computer Power and Human Reason,
you know, he introduces the word ought that we've forgotten to use, O-U-G-H-T.
You know, ought we do this. Not can we do this, but ought we do it.
And that has just vanished from our vocabulary.
And, you know, he argues that we need to reintroduce it.
You know, I always think of that hilarious Jon Stewart joke, that the
last words a human being will ever utter will be, you know, some dude in a lab coat who says,
it worked. I mean, Tristan, I would ask you a question. I mean, you were part of this
world in a way I am not. You talk to these people, the people who are building
AI, who want to build AGI and whatever else. I mean, they are acutely aware
of how potentially destabilizing it can be.
Why do they persist in that?
Is it just the simple, well, if we don't do it,
China's going to do it or whoever's going to do it,
and so, therefore, we've got to be first?
Same thing with the nukes.
It's actually related to what Lance is speaking about,
that we don't have a collective ability
to choose which technology roads we want to go down
and which ones we don't.
And if we just say it's inevitable,
someone's going to do it, and better that we,
the good guys, who we think have better values
than the other guys,
do it first,
so that at least we know what the dangers are and can try to defend against the bad guys.
And I think that the thing that, you know, Lance, you were just speaking about with the
mirror bacteria is a perfect example, because the reason that Postman's questions here,
about how do we consciously make decisions about what technologies we want and don't want to do,
rather than just, because we can, we do it, matter so much is that AI is about to exponentiate the
introduction of new capabilities into society. So it's just, it's going to be a Cambrian explosion
of brand new text and media and generative everything that you can make.
You can make law.
You can make new religions.
You can make, you know, as we say, language is the operating system of humanity,
from code to law, to language, to democracy, to conversation.
And now generative AI can synthesize and decode and hack the language,
either of conversation in the form of misinformation,
hack code in the form of hacking cyber infrastructure,
hack law in the form of overwhelming our legal systems or finding loopholes in law.
And so as we're unleashing all these new capabilities, it is more important than ever that we get an ability to consciously choose, do we want to do mirror bacteria?
But then the challenge is, as technology democratizes the ability for more people to do more things everywhere beyond global boundaries, our problems are international, but our governance is not international.
We have national governance responding to global interconnected issues.
And then we can see the political headwinds are not really trending in the direction of global
governance, which is looked upon as a kind of conspiracy of people who are out of touch with the
national interests of the people, which is a very valid critique. So yeah, Sean, I'm sort of wanting
to play with you here on what's your relationship to this question that you're laying out?
I don't know. I mean, I'm just constantly thinking of what are the trade-offs going to be.
I mean, you just think about the explosion of the Internet and the trade-offs involved there.
You know, one consequence of that, well, there are a lot of incredible benefits. I love the interwebs. I use them
every day. But one of the consequences is the complete destruction of gatekeepers, of any kind
of boundaries at all on the information environment. So we lost a capacity, society lost a capacity,
to dictate the stories society was telling about itself. And, you know, digital just exploded all that.
The internet is like this choose-your-own-adventure playground, and it unsettles and undermines trust.
And a lot of people might say, well, good, these institutions, the elites were corrupt and untrustworthy to begin with.
Okay, fine.
But we tend to underappreciate how much what we take to be true is really just a function of authority.
Most of us haven't observed an electron or a melting glacier.
We take it to be true because we believe in the experts who tell us these things are real.
And we believe the video clips on the evening news of glaciers melting.
But if that trust is gone and the info space is this hopelessly fragmented thing,
riddled with deep fakes and misinformation and consensus reality isn't possible anymore,
then where does that leave us?
I will say I think there's actually a way to get to a good world.
It's just we have to distinguish between the Internet being a problem
versus the engagement-based business models that profited from drama,
derivatives, you know, the amusement culture, the tweetification culture, and personalized
information bubbles, which are incentivized.
So it's important to recognize the reason we have personalization. It's not just that you can
choose your own adventure, that's also true, but the mass reinforcement of personal
information bubbles is actually incentivized by the business models, because it's better
to keep you coming back if I give you more of the thing that got you interested last time.
And so we can split apart the toxic thing of the engagement-based business models from the
internet, and then I think you could say is there a different design of internet protocols and
design of these Metcalfe monopolies, meaning these network-effect-based social media places where
there's only a handful of them, could they be designed in a different way that actually do
reward the kinds of mediums that actually enrich and bring out the better angels of human nature,
and that's still the optimist in me that believes that it's possible to do that.
Lance, I see you sort of nodding and also maybe skeptically nodding your head here, so feel
free to jump in. Well, I mean, I think Postman would question whether more technology is the answer,
since every new innovation solves some problems but creates many more, which we then solve with
more technologies, and it just keeps expanding and expanding and expanding that way. You know,
when I teach my students media ecology, I try to emphasize, let's think about what are the appropriate
uses for this particular medium, and then what's inappropriate. And, you know, if we can start with that,
the internet, or various aspects of it, was great for certain things. It empowered people who
were, you know, kind of in minorities, and brought together people who were having difficulties in a lot
of ways. I can speak just in terms of my own family, with having raised an autistic child, that
parents of autistic children were largely unable to, like, go to a self-help group in person
because your hands are full, and being able to communicate over a discussion list or group
online was, you know, very valuable. So, you know, this is where we face this problem of trying
to evaluate the costs and benefits. I actually feel like there is a vision of a world that would
work. And I agree with you, Lance, that it actually, it takes asking what are the appropriate uses
of a technology and the actively inappropriate uses and then consciously designing our social
structures, our social norms, our culture, like, not designing, but, you know, practicing
cultural values that allow us to say, how do we reward those appropriate uses and
anti-reward the inappropriate uses? Now, I want to just move a little bit away from admiring the
problem, because there's a tendency to kind of rehash all these things.
I think Postman is unique in offering, I don't know if I'd call it solutions, but a form of taking an active and agentic stand on technology.
And he has this famous lecture series where he outlined seven questions that we can ask of any new technology.
And he said that these questions are a kind of permanent armament with which citizens can protect themselves from being overwhelmed by technology.
You know, the first is, what is the problem to which this technology is the solution?
What is the actual human or social problem for which that technology is the solution?
It's a very basic question, but it's a very powerful one.
So anyway, we can go into some of the others, but I'm just curious if either of you have a reaction to this
or as we move into more of a solutions-oriented posture.
Sean, what's your sense of this?
I think it's a great question.
I just go back to what we were saying a minute ago.
How do we answer it?
What is a mechanism for having that conversation?
You know, science is very good at giving us more of what we want.
It cannot tell us what's worth wanting in the first place.
And the problem is I don't know how as a society we have that conversation together about what's worth wanting
and then have a conversation about how to go about getting it.
I just don't know.
And the problem with some of these new technologies like AI is it's not even clear what they're going to do.
So it's very hard to talk about the tradeoffs that might be involved.
But I don't know, it's not a very good answer because I don't have one, I guess.
Well, and it's interesting because I think that, so one of the things that actually excites me about AI
is the ability to use it to more quickly augment society's ability to see the downsides and
externalities and play out simulations of various new technologies.
because one of the things that we have to get incredibly good at
is actually foreseeing the negative unintended consequences
before they happen.
So imagine inventing plastics,
but actually knowing about forever chemicals
and then taking a left turn so we don't go down the road
of creating more pollution
than we have the capacity to clean up.
And the same thing with social media.
And that's one of Postman's other questions:
whose problem is it?
So if it's the problem of not being able to generate content at scale,
whose problem was that?
That's his basic second question.
The third question is what new problems will be created
by solving this problem with this technology?
So in the case of generative media,
we will create a new problem of people
having no idea what's true,
because now anybody can create anything
and flood the information airwaves.
And then he asks which people and institutions
will be most harmed by the adoption of this technology.
So, for example, gatekeepers,
or the idea of trustworthiness, or having,
you know, any kind of authority or expertise, is suddenly going to be eliminated by the fact that
there's a flood of information, kind of a denial of service attack on democracy, through all
this stuff that's coming. And then he has this really important, subtle question that he asks,
what changes in language are being promoted by this technology? And I'm curious, Lance, if you have
some examples that Neil has given on that one, because I think it's such a crucial one that's
very subtle. Well, sure. And I think it's actually a very important one. And you're right that it does
sort of take a left turn from the other questions. But what's often missed when folks just
look at, like, Amusing Ourselves to Death and Technopoly is that Postman's grounding was in the
study of language. And he was, he started out in English education. And he was also very much
associated with general semantics, which in a large part is about our use of language and trying to
understand our misuse and how that changes our thinking. I mean, I think for me a great example is
community. And when you think about the use of the word community, in a real community, people
are together and they don't all share the same interests and viewpoints, whereas sharing the same interests is what we mean when we talk
about online community, virtual community, and that's where you get that siloing effect.
You know, in a real community, people have to negotiate with people who are very different
from themselves and find a way to live together. And you can't just like pick up and leave
where you live. Whereas on the internet, you can just, you know, click a button and you've left
that community. And you find one that's more to your liking. So that meaning of the word
community has changed drastically by that usage. And, you know, you could also connect
that back to a kind of Orwellian quality, because that was the idea in 1984, and it's
expressed in the appendix, that we can change the meaning of words and change the way people think.
That may not be happening all that intentionally, as it was under a totalitarian system.
And it actually did happen under Nazi Germany and in the Soviet Union.
But it's still happening and it's still changing the way we think.
I think it's an excellent point.
And it feeds back into real community.
So when people are in real community, their expectations have been formed by these online experiences and these new definitions for words.
Sean?
I guess I've done a lot of technology bashing here.
And I just want to say, not all of our problems can be laid at the feet
of technology. I mean, it is also true that over the last three, four decades, we have stopped
as a society investing in social infrastructure, community centers, libraries, third spaces,
where people can actually get together and talk and be with one another and engage their
community and not just be home alone ordering pizzas with the app so that they don't have
to engage with another human being in the entire process, right? So my worry,
is that these technologies have pushed society in a more solipsistic direction.
It's pulling us more inward, à la the movie Her. I feel like that's where we're going,
where people are just, they're going to be in relationship with chatbots.
They're going to be, you know, at home using VR technology or whatever,
and they're going to stop going outside and doing things with other people.
And so we have failed on both fronts.
And there are policy solutions that could counterbalance some of this
if we invested in those things.
And we haven't,
or we stopped, and we should again.
I agree.
I just wanted to name one other example of language change that is happening without
us really reckoning with it, which is Elon's redefinition of saving free speech when he takes over
Twitter to protect people's ability to reach millions of people anonymously inside of a news feed
that rewards the most salacious inflammation of cultural fault lines in the cultural war.
And, like, a system that just rewards the toxicity of inflammation
on cultural fault lines everywhere,
and then saying that that's about free speech,
it's like a kind of Newspeakian turn
on what freedom of speech was really meant to protect
in its original essence, as defined by the Founding Fathers,
and it had nothing to do with,
or it certainly did not foresee a world
where a single person could reach 200 million people
every day with their thumb as many times as they wanted to,
and that's a different thing than the deeper ideas.
And so I just think that's a question
of language. Just imagine a society that is actually asking that question. Imagine a sort of Postman-informed society, and every time there's a new technology rolling out, their immediate first thoughts, instead of being entranced by it and deifying the technology and welcoming it with excitement and using it, they first ask: what is the problem for which this technology is the solution? What are the new problems that are going to be created by this technology? What are the changes in language that it's actually hiding from us about the way it's reconstituting things? So I just feel like that's a vision of
society. I'm reminded of, I think it's the opening chapter of Technopoly, where he talks about
the story of, was it Socrates, and Thamus, and where it's really about
what is a conscious adoption strategy of technology, where I think in that story they're actually
talking about, should we adopt the written word, and they're sort of talking about that as a choice
and noticing all the things that's going to give, and also what it's going to do, and
which things it's going to undo in the society.
And I just feel like that's so within reach,
to have cultures that actually are critical of technology,
in which Postman is part of the curriculum
of political science courses at every university
and part of undergrad education.
And it's all the more important because technology
is so central in the fundamental shaping forces
of the entire world.
So maybe I'm just a dreamer, but this is the place.
Can I ask you a question,
Tristan?
Do you think it's the responsibility of the people?
building these technologies to ask themselves these questions, or do you think it's the responsibility
of the public to ask and answer these questions and then impose their solutions on them?
It's all the more important that the people building it have a critical understanding of what it will do,
because their being in the driver's seat, at the control panels of how it's going to roll out,
means that it's even more important that they're contending with these questions
than it is for the regular public. And I think the regular public needs to contend with it as much as possible too.
Lance? Well, I mean, the history of invention shows that inventors pretty much are wrong about what their technology is going to do. And so they're the last people. I think Arthur Koestler called them sleepwalkers. You know, television's a great example, because when television was introduced, especially in the post-war period, all of the write-up of it is, it's going to bring culture into everyone's home. They'll have opera
and ballet and classical music.
And it's going to be wonderful for democratic politics
because we'll be able to televise political conventions
and people will see, you know, debate and discussion
on political issues.
You know, and they couldn't be more wrong.
And I, you know, I think there's a great spirit of play
that comes with invention.
It's just, you know, to see, you know, what can be done,
what we can do.
But I don't even know if an
AI program, I mean, you mentioned this before, Tristan, but I don't know if it can adequately foresee all of the consequences, because you introduce a change into a highly complex, interdependent system. It's going to change something, and that's going to change other things, and they're going to interact with one another. It's a complex system, for sure. Yeah. And to be clear, I want to say a couple of things. I agree that,
we don't, inventors don't have a good track record of foreseeing the consequences of their inventions.
I do think that there are tools one can use to much better foresee what those consequences will be.
In 2013, how could I foresee that the attention economy would lead to a more addicted, distracted,
sexualized society?
It's because the incentives at play help you predict the outcome.
And I think we need an incentive-literate culture that follows the Charlie Munger quote, you know,
if you show me the incentive, I'll show you the outcome.
If we can understand what the incentives are, you can get a very good sneak preview of the future.
I don't think it's an easy thing to reach for,
but I think it's something that we need more of
if we're going to be a technology-enhanced society
and actually make it through
because we're quite in danger now.
Sean?
Yeah, look, even if the answer to these questions
is, you know, in the words of Nate Bargatze,
nobody knows, we should still be asking them.
That would at least be a start.
And that's just not something that we've done or are doing.
I think one of the real needs is to really reinforce literacy and that this is ultimately what's being threatened
because that is the foundation of democracy and it's the foundation of the Enlightenment.
Postman's last book was Building a Bridge to the 18th Century, which wasn't saying that we should go back to the 1700s,
but that we should retrieve from that era the literacy, the typography,
the Enlightenment, and the respect for science and democracy that existed back then,
that we need to reinforce those elements of the media environment
that the electronic media are really doing away with.
And when you say, what is the problem that AI is going to solve?
And I actually mentioned it before.
I mean, information glut is one of the problems that it's there to solve.
But I think one of the problems is that reading and writing are hard.
They're hard to do.
Anyone who has written a book will tell you that.
What could be more unnatural than sending a five-year-old to sit still for hours on end?
But that's what you need to learn how to read and write.
And so what are we doing?
I mean, and we've been doing this for a long time now.
We're developing technology to read for us and to write for us.
I mean, that's what AI, voice synthesis and voice recognition, that's what it's all doing.
So we don't have to do it ourselves.
So the way to at least try to mitigate this is by reinforcing those aspects of the media environment that we still have that are under assault today.
Yeah, I would just say that in a lot of ways, the problem of our time is this misalignment between
our interests and our incentives. And the tragedy, really, is that we have built tools that have
undermined our capacity to alter our incentive structures in healthy ways. That is it.
If our whole damn problem could be distilled, that's it. I don't know what to do about
that, but that's the challenge ahead of us, and we've got to figure it out. Completely, completely agree.
If incentives control the outcome, then governance is normally the ability to change what those incentives are.
You pass a law or a policy and you build social norms and consensus in order to get that law or policy passed to change and say, hey, you're not allowed or you can't profit from this thing that would be highly profitable, like whether it's underage, drugs, sex trafficking, whatever the thing is.
So I completely, completely agree.
I know we're basically out of time here, and I just want to close with this quote: no medium is excessively dangerous if its users understand what its dangers are.
It's not important that those who ask the questions arrive at my answers, or Marshall McLuhan's.
This is an instance in which asking the questions is sufficient.
To ask is to break the spell.
And that just feels like what we're doing here: let's arm ourselves with the questions to protect ourselves from getting further overwhelmed,
and also let's be honest about the nature of what's coming.
Questions are our most important medium.
That's from language, and that's the way that we start to think about things critically and deeply.
Well, no one's going to listen to a three-hour Lincoln-style speech to save us, so we just need a kick-ass meme.
That's going to bring us all together.
That's your job to find a tweet for this one and create some memes that are going to go viral.
We'll tweet our way through it.
No worries.
Sean and Lance, I just wanted to thank you for coming on Your Undivided Attention.
That's great.
My pleasure.
Thank you.
So a thought I'd like to leave you with.
There's a quote from the introduction of Amusing Ourselves to Death that has always stuck with me,
where Postman compares two dystopian visions for the future.
The first, presented by George Orwell in 1984, of surveillance and Big Brother,
and the other, presented by Aldous Huxley in Brave New World.
Postman wrote, what Orwell feared were those who would ban books,
while what Huxley feared was that there would be no reason to ban a book,
for there would be no one who wanted to read one.
Orwell feared those who would deprive us of information.
Huxley feared those who would give us so much
that we would be reduced to passivity and egoism.
Orwell feared that the truth would be concealed from us,
while Huxley feared the truth would be drowned in a sea of irrelevance.
Orwell feared that we would become a captive culture,
while Huxley feared we would become a trivial culture.
As Huxley remarked, the civil libertarians and rationalists who are ever on the alert to oppose
tyranny failed to take into account man's almost infinite appetite for distractions.
And it was Postman's fear that it would be Huxley, not Orwell, whose prediction would come true.
Your Undivided Attention is produced by the Center for Humane Technology,
a non-profit working to catalyze a humane future.
Our senior producer is Julia Scott.
Josh Lash is our researcher and producer, and our executive producer is Sasha Fegan.
Mixing on this episode by Jeff Sudaken, original music by Ryan and Hayes Holiday.
And a special thanks to the whole Center for Humane Technology team for making this podcast possible.
You can find show notes, transcripts, and much more at HumaneTech.com.
And if you like the podcast, we'd be grateful if you could rate it on Apple Podcasts, because it helps other people find the show.
And if you made it all the way here, let me give one more thing.
Thank you to you for giving us your undivided attention.