Big Technology Podcast - Is Social Media Making Our Society Stupid? — With Jonathan Haidt
Episode Date: July 20, 2022

Jonathan Haidt is a professor of ethical leadership at NYU's Stern School of Business and author of a number of books, including The Coddling of the American Mind. His recent story in The Atlantic, "Why the Past 10 Years of American Life Have Been Uniquely Stupid," sparked a debate about whether social media was bad for society, and how we know for sure. Haidt joins Big Technology Podcast to discuss why he thinks social media is indeed responsible for our "structural stupidity," digging through the research and answering critics' objections. Stay tuned for the second half, where we actually discuss some solutions.

You can read Haidt's article here: Why the Past 10 Years of American Life Have Been Uniquely Stupid

You can review the collaborative Google Doc here: Social Media and Political Dysfunction: A Collaborative Review

And here's my story on the Retweet button: The Man Who Built The Retweet: "We Handed A Loaded Weapon To 4-Year-Olds"

Please review the podcast to help us get more great guests!
Transcript
LinkedIn Presents
Hello and welcome to the big technology podcast,
a show for cool-headed, nuanced conversation of the tech world and beyond.
Jonathan Haidt is our guest today.
He's a professor of ethical leadership at New York University's Stern School of Business,
the author of The Coddling of the American Mind, a very popular book that you may have heard of, likely have heard of.
And he wrote a terrific story in the Atlantic called Why the Past Ten Years of American Life have been uniquely stupid.
We're going to get into that and much more as we discuss the role of social media in our society and how much it's contributed to how bad things have gotten.
Professor, welcome to the show.
Thanks, Alex.
It's a pleasure to be talking with you.
Great having you here.
Let's start with this Google Doc that you put together.
It's called social media and political dysfunction.
And it's pretty fascinating how it has some arguments for why social media has been bad
and then some arguments for maybe the role of it as being overblown.
So can you tell us a little bit about the origination of this document and sort of, yeah,
how it came to life and what you learn from it?
Sure.
So I'm a social scientist and I have this naive belief that if we just do the right
experiments and studies and gather all the data, we can figure out what things will lead
to the best outcomes.
And this is something that has, I guess, been thought by many people, particularly on the left for the last 300 years.
And it actually doesn't have a very good track record.
And the reason is because we all suffer from confirmation bias.
We all are very good at finding evidence to believe whatever we want to believe.
And so what I found is when I get into a debate, you know, my natural impulse is to just say why I'm right and find evidence why I'm right.
but I'm a great fan of John Stuart Mill
who said he who knows only his own side of the case
knows little of that.
Okay, this is a very roundabout way of saying.
I found that the best way to actually make any progress on something
is you try to get all the evidence on all sides
and you invite people in who are your critics
and you cite your critics.
And I guess this is a very, you know, Jewish or Talmudic way of thinking.
It's, you know, you actually, you know,
you canonize the whole argument, both sides of it.
So anyway, I did this first for the debate around whether social media is harmful to teen mental health.
And I hope we'll talk about that one later.
I started that one in January of 2019, soon after The Coddling of the American Mind came out.
And I was focused on, you know, is social media the cause of the collapse in teen mental health that began in 2013?
And then more recently, as I've been turning my attention to whether social media is behind the collapse of democracies, or I should say the decline of democratic efficiency
and quality around the world.
So I started collecting evidence there.
I teamed up with a sociologist named Chris Bail,
and we've been collecting all of the evidence there.
And so it's been getting some press because it's actually really interesting
what you see when you go through it.
It's now like 120 pages long.
It's long.
Yeah.
But the cool thing about it is that I began with the question,
Is social media destroying democracy?
Well, how do you operationalize that?
How do you measure that?
And it turns out that you can break it down
into at least seven different questions
for which there is a lot of data.
So it's things like,
is social media creating echo chambers?
Is it fomenting political violence?
You know, do nasty, angry content, tweets, posts,
do those go further?
So that's what the document is trying to do,
is collect all the evidence on seven different questions.
Right.
And so there's studies that show, yes, you know, it is responsible for a lot of these ills.
And some of the studies that show, okay, maybe not.
And you got, you come to the conclusion, and maybe you can correct me if I'm wrong on this one,
but you took a look at the available evidence and you came to the conclusion that the past 10 years of American life,
which we know has been, you know, fairly ridiculous, is largely, you know,
downstream of some of the things that we've seen in social media.
So can you talk a little bit about your journey of reading through these papers,
the pro and the con, and then coming to your conclusion that you did in that Atlantic story.
Yeah, sure.
So it's interesting.
Yeah, you said, is downstream from, that's actually a good phrase.
You know, there's a, you know, a lot of people talk about the way politics is downstream from culture.
You know, culture will cause things to happen in politics.
But there's a growing recognition that politics and culture, especially culture, are both
downstream from technology.
And it's already very well established in political science research.
that cable TV had a huge impact on our culture and our politics because you go from broadcasting,
where there were three networks, a temporary period of a few decades.
Everyone's got the same message.
Everyone is on the same page.
When you get cable TV, and especially like Fox News in the 90s, you get narrowcasting.
So we all understand that that had a huge change on our information ecosystem.
Now the question is, what happens when we go from narrowcasting of cable TV, you know,
hundreds of stations aimed at targeted sub-audiences.
When we go from that to microcasting, which is, you know, millions and millions of individuals
broadcasting, well, putting out stuff on Twitter and Facebook, Instagram.
And so everybody kind of agrees that, yeah, this has big effects, but the research community
has had trouble coming to any conclusion.
Now, this is typical.
Does television promote more violence?
It's taken us decades to figure that one out.
And actually, I don't even know the answer because, you know, the studies were
all over the place, and that's not my field.
I haven't reviewed that.
My fear was that we were going to do the same thing with social media.
Mark Zuckerberg repeatedly cites two studies to say, no, there's no problem,
because, you know, look, the most polarized people are actually the elderly,
and they watch cable TV and they don't consume, you know, they're not that much on Facebook.
So he cites that one study, and then there's another one he cites about how polarization
has gone up in some countries and down in others.
So Zuckerberg himself keeps citing these two studies by a team of researchers at Stanford.
And I found this a little frustrating because there are a lot of studies out there.
And so again, when you put them all together, well, here, let's just go through it.
I would invite listeners to go to jonathanhaidt.com slash social media.
I've put all my papers, everything I've done on that one page.
And if you then click on the thing that says social media and mental health, a collaborative review, you can then scroll down,
you look at the table of contents,
and we have it organized, as I said,
it's seven questions.
So I'll just read them here,
and then we'll dive into one of them.
So the first question,
is there an association between social media use?
Oh, wait, I'm sorry.
I'm on the wrong Google Doc.
I got so many of them now.
Okay, let me change that.
I need to be on the one that says,
oh, social media and political dysfunction,
a collaborative review.
That's the one that we're talking about here.
Okay.
Yeah. So now we scroll down to the table of contents.
Question one. Does social media make people more angry or affectively polarized? That is, does it make people hate the other side more? Does social media create echo chambers? That's number two. Number three is, does social media amplify posts that are more emotional, inflammatory, or false? Question four, does social media increase the probability of violence? Question five, does social media enable foreign governments to increase political dysfunction in the United States and other democracies? Question six, does social media decrease trust?
Question seven, does social media strengthen populist movements?
So in each of those seven questions, Chris and I have organized the studies into those
that indicate the answer is yes, there's a problem or no, there's no problem.
And also sometimes there are mixed studies.
And what I want to point out here is that for any question in the social sciences,
you have to operationalize it in a particular way.
You have to define what are we, you know, what's the thing we're measuring that we think is
causative, causal, and then what's the thing we're measuring that we think is the result?
So there's a lot of decisions that go into that.
So let's dive into just one, which is echo chambers.
This is the one that Facebook mostly talks about.
So there are a number of studies showing that actually, if you look at the average Facebook
user and you say, is Facebook taking this user and putting them in an echo chamber where
they only hear one thing, or is Facebook exposing them actually to far more than
they would ever get if they weren't on Facebook, if they just watched TV. And Facebook is right
that the average Facebook user actually is exposed to more stuff. More stuff crosses their eyeballs
than if they weren't on Facebook. And that was surprising to you. Well, not necessarily. Well,
okay, it's, I, yes, I had a gut feeling that echo chambers were real. But here's where it gets so
interesting in the social sciences is it depends how you operationalize it. And so if you
operationalize it as, let us look at the average user, you know, we'll look at, we'll look at 10,000
users, and we'll take the average, and we'll see whether they have more or less stuff coming in.
And if that's the way you do it, then Facebook is right, and you'll see several studies in there
that show that the average user actually is exposed to lots of stuff.
Now, first of all, does that mean that they get less polarized?
And as several writers have pointed out, including Zeynep Tufekci, I hope I'm pronouncing her name right?
Just because stuff crosses your eyes doesn't mean that it's going to be depolarizing because a lot of, you know, if you're on the left, you're exposed to right-wing stuff, but mostly because other people have said, can you believe this bastard, this racist, this horrible person? Look what he says. So even if you accept that operationalization, it doesn't mean that it's actually having a beneficial or mind-opening effect. But, and here's the really cool thing, when you look at the studies closely, what you see is that there are multiple operationalizations in the
in the pool of studies.
And one way of operationalizing it is, well, what about extremists?
What happens if you're already far on the left or far on the right?
And then you go on Facebook or Twitter, other social media platforms.
And there, the answer is yes, yes, you do get sucked into whirlpools of other extremists.
And so what's the question we're asking?
Facebook takes one operationalization and declares victory, not just
on echo chambers, but on everything; they say it's not bad for democracy.
But there are so many different possible paths of harm, and we have to check them all out.
And it turns out that on this one, echo chambers, the extremists do get more extreme.
And they're the ones who are killing people.
They're the ones who are shooting up churches and synagogues.
They're the ones who we should be most worried about.
And there it is a problem.
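[Editor's aside: a rough, illustrative sketch of the operationalization point above. The ideology scale, the exposure model, and every number are invented for illustration and are not drawn from any study in the Google Doc; the sketch only shows how an average-user measure and an extremist-subgroup measure of "echo chamber" can point in different directions.]

```python
# Illustrative only: synthetic data showing how two operationalizations of
# "echo chamber" can give opposite impressions of the same platform.
import random

random.seed(0)

# Each hypothetical user has an ideology score in [-1, 1] and a feed share of
# cross-cutting (opposing-side) content. The asymmetry below is an assumption:
# moderates see fairly mixed feeds, extremists see far less cross-cutting content.
users = []
for _ in range(10_000):
    ideology = random.uniform(-1, 1)
    cross_cutting = 0.45 - 0.35 * abs(ideology) + random.gauss(0, 0.05)
    users.append((ideology, max(0.0, min(1.0, cross_cutting))))

# Operationalization 1: the average user.
avg_exposure = sum(c for _, c in users) / len(users)

# Operationalization 2: only the extremists (|ideology| > 0.8).
extremist_feeds = [c for i, c in users if abs(i) > 0.8]
extremist_exposure = sum(extremist_feeds) / len(extremist_feeds)

print(f"average user's cross-cutting exposure: {avg_exposure:.2f}")
print(f"extremists' cross-cutting exposure:    {extremist_exposure:.2f}")
# The first number can look healthy while the second is much lower.
# Which question you ask determines which answer you get.
```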
And this one question is the most favorable of all the questions to Facebook.
But the other six questions, the evidence is more on the negative side.
There aren't, sort of, you know, so many studies saying that there's not a problem.
So the good thing about a document like this is it gets you out of the ridiculous state that we're in otherwise, which is, well, wait, but I heard there was a study debunking that.
Oh, yeah.
Well, no, no, I heard that there was a study proving it.
Like, there's thousands of studies.
You have to look at them in a holistic way to make sense of what's happening.
Right. Okay. And so now that we spent some time like talking about the process and stuff like that, can you just like as briefly as possible talk about like in your view, what is the stupid that we've been living in in the past decade? And then why exactly is social media responsible for that?
So the stupid, yeah, thank you for giving me the chance to really clarify that. Some people told me I shouldn't use the word stupid in the title because it seemed insulting. So let me... I think it was a great headline. Yeah, but sorry. Go ahead.
Yeah. Actually, the Atlantic, it wasn't my title. The Atlantic A-B tested, and that's the one that won.
Okay. Technology, man.
Yeah, technology. A lot of good things about it. A lot of powerful things about it.
Yeah. So Americans are getting smarter over the decades. IQ is rising in our country and around the world. It's not that Americans are getting stupid.
What I argued in this Atlantic essay was that we have this new thing called structural stupidity.
And just as you can have structural racism, even if nobody is a racist,
You can have structural stupidity, even if no one is stupid.
And so we see this in universities.
This is where I really got onto it.
I started graduate school in 1987, and universities have always been like these walled gardens
where you can entertain ideas, and it was actually good to be provocative.
And our patron saint is Socrates, who said he was the gadfly of Athens.
And of course, they put him to death for it eventually.
But we always valued people who questioned received wisdom.
We thought that was a good thing to do.
But suddenly in 2014, 2015, we had this new thing happening where if you questioned received wisdom around sacred issues, especially race, gender, immigration, a few other topics, if you questioned received wisdom, the social consequences are so severe that you learned quickly not to question received wisdom.
And then you would have just people saying and doing all kinds of stupid things.
So I saw that in the academic world.
just people failing
to make the most basic
social science observations
about base rates
about other causal factors.
And that started in 2015.
It really hit us hard in 2015.
We got really stupid.
And we implemented all these policies
that backfire
and we just keep using them.
Mandatory diversity training,
you know,
all these things we did
in response to protesters
that don't work.
That makes things worse.
And then around 2017, 2018,
these ideas leaked out
from universities into journalism, a bunch of other areas,
in part because you guys hire people from our elite universities.
They bring these bad ideas with them.
But what they're really bringing with them
is that a small number of young people use social media
not to make arguments, not to respond to arguments,
but to destroy anyone who contradicts their sacred values.
And that's what makes a group stupid.
If journalists refuse to cover the other side,
and you hear this among young journalists,
why should we give a platform to the other side?
I'm not going to interview people on both sides of this question, then journalists get stupid.
And so that's what I meant by structural stupidity.
And now, why is social media responsible for the structural stupidity?
I mean, the...
Oh, because...
Yeah, well, okay, go ahead, and then we can talk about it.
I mean, you know, look, if we lived in a place where people could just literally stab you with a knife,
if you said anything wrong, we'd all be walking on eggshells and we wouldn't ever want to offend.
But we don't live in that world.
If somebody stabs you with a knife, then they're going to go to jail.
so that's not an issue.
But having a relationship ruined or damaged,
being publicly shamed or called names
is really scary for people.
The Romans observed that people were often less afraid of death
than they were of social slandering and social ruination.
And so what Twitter in particular,
Twitter's the worst here,
but it's also on Facebook and Slack and other channels.
What Twitter did is it essentially gave everybody a dart gun,
and anybody could shoot a dart at anyone else.
Like imagine if you literally,
if I were to question anything about police violence
or the gender pay gap or anything,
if I were to question that, instantly
I would get shot with 20 darts,
like physical darts penetrating my flesh,
that really, really hurt.
I'd be very careful what I said about any of those topics.
And that's what happened.
Social media gave everybody a dart gun.
And as a result,
because most people don't want to shoot anybody,
but the far left, the far right,
trolls, a few groups of people, shoot moderates in their institutions, and they shoot the leaders
of the institutions. So that's why we're seeing, we keep saying, why don't the university president
stand up for academic values? And they rarely do. Why is that? Because they get shot full of
darts. Right. And I thought something that was interesting in your story was talking about, like,
who's actually, you know, the ones that are sharing on social media. I think this is, yeah, this is
straight from your story. Let's see. Who's using, who's using what you call, you know,
these dart guns. Let's see, the group furthest to the right, known as devoted conservatives, comprised
6% of the U.S. population. The group furthest to the left, the progressive activists, comprised
8% of the population. The progressive activists were by far the most prolific group on social
media, 70% had shared political content over the previous year. The devoted conservatives
followed at 56%. So talk about who's actually shaping the conversation. And by the way,
Like, before we... well, why don't you go ahead, and then I'll ask my follow-up on that. Sure, yeah.
So most people, and what I've seen throughout this, I began getting involved in this
issue when campuses started going crazy in 2014, 2015, most people are reasonable, decent, don't want to
hurt anyone. Most students want to learn. The problem is that a small number of people
who want to display their morality and want to change the world
use methods that they think will be effective, but that end up hurting people and backfiring for
their cause. And so those numbers you gave, that was from the Hidden Tribes study by a British
group, More in Common. They studied the American electorate, also the British electorate.
What they found through cluster analysis is seven groups of people, and the one on the far
right, you know, to be polite they didn't say this, but you can see it in the report:
They're authoritarian.
They are the more narrow-minded.
They are the more racist.
They're the whitest of all the groups.
So they are the second most politically active group.
And if you ever cross the right, you're going to get death threats.
It's very scary to cross that group on social media.
But the group furthest left, the progressive activist, that's the second whitest of all the groups.
It's actually very interesting.
This is mostly young white people who have a kind of politics in which, again, they don't learn to make arguments.
They don't persuade, they attack and intimidate.
And there's a role for that, I suppose, in politics and the public square, but not in universities, not in newspapers and journalism.
So it really, it's the, you know, the right has really no, I've never met anyone on the far right practically in my life.
But in universities, there are a lot of people, the progressive activists, who are sort of given, what's the word, not free rein, people sort of are very careful around them.
and it has a really chilling impact on what we say,
and therefore it prevents us from doing our jobs.
In fact, I'd like to cite an article I read last year,
headline is the man who built the retweet.
We handed a loaded weapon to a four-year-old,
the button that ruined the internet
and how to fix it, by this guy, Alex Kantrowitz.
That's right.
And this is one of the points of my article
that I didn't know until I teamed up with Tobias Rose-Stockwell,
who is a tech writer.
Social media was not at all toxic in 2004, you know, when Facebook came out, and MySpace.
And if you're sharing photos of your dog, there's no problem.
It really was the move to the news feed.
And then especially the implementation of the retweet button, which became the share button, and then also the like button.
But things just became much more engaging, algorithmized, and viral.
And that couldn't, that didn't start until 2009 with the retweet button.
So you tell the story of this engineer, Chris Wetherell.
And, you know, they thought that they were giving more people voice.
And he's a progressive.
He thought this is going to be great for, you know, for black people, for women, for all the groups that don't have voice.
This really lets them get their voice out there.
But what happens?
They're exactly the groups that get swarmed and mobbed.
They're exactly the groups that get most harassed.
So I think this hyperviral, these changes in architecture in 2009, that's what I focus in on.
That's where I think the real problem is.
Yeah. And I really like how you put it, how we moved from connection to performance. And, you know, that's something I wish I thought of when I was writing this story, you know, speaking with Chris about what the retweet and the share button has done to social media. But it did change the way that people use social media from, you know, this is the way we connect with people, share something that our small group will be interested in. And then in the name of engagement, you know, it became performance. How can I reach the most people? And often you do that by stoking outrage.
Well, that's right. And so I'd like to make a point here. This always bugs me, when people say,
oh yeah, sure, you know, you're all critical of social media, but thank God we had social media,
what would we have done without social media? To which I say, yeah, can you imagine if all we had
was the telephone and texting and Zoom and Skype and multiplayer video games and WhatsApp
and all these other ways that we can communicate, and all we were missing
was platforms in which you post, you write content, and then you wait and hope that others will
like it, retweet it, comment on it. It's the performative platforms. That's what we're talking about.
We're not talking about the internet. We're not talking about communications technology. Communication is
good. We're talking about a small number of platforms where people do the labor, they create the
content in exchange for prestige. That's what I think is destroying a generation in terms of mental
health. And that's what I think is most damaging to democracies.
So let me see if I can sum up what the argument is here, really, is that social media is
being used very effectively for outrage. And it's the people on the furthest, most extreme
parts of their parties that end up being the ones that want to stoke the outrage the most, hence
creating this kind of stupid moment in U.S. politics where we're being ruled by the extremes.
Yep. That's right. Let me add two things
to that. There are two other groups that use this very effectively. One is trolls. And these are
men. They're almost always men. I've never heard of a female troll. They enjoy harassing,
burning things down, causing trouble. It's like the Joker in the Batman series. So they're
men who are extremely disagreeable. They get banned from platforms. So each man used to just be able
to be an asshole to a hundred people a year. Now they can be an asshole to millions of people a
year. And that just turns people off. It makes people shut up. It makes people... And the fourth group
is Russian agents. The Russians began what they called active measures against the US, sort of in the
1920s under Lenin, but with a big increase in 1957 or 59, they authorized this program to really
try to mess us up, divide us, make us hate each other. So they used to have to send people over
from Russia to do that. But once social media came along, they were studying it. Obviously,
it's not the Soviet Union anymore, but of course, Vladimir Putin is a KGB
agent, and we're his enemy. So they literally turned on the switch in 2013, and here I'm drawing on
Renée DiResta's work. They turned on the switch in 2013 to really go live with their efforts to mess
us up. This is before Trump did theirs. And so social media has been a gift to America's
enemies as well as to America's assholes. Oh, that's pretty nice. Maybe we'll put that on that's
a cold quote. Yeah. So then I have some questions to ask you about this. First of all,
if we're going real stupid, how do you explain the fact that Joe Biden, who's as centrist as they come,
you know, sort of a right-leaning Democrat, is the president of the United States right now?
And crushed Trump in the popular vote.
Yeah, yeah.
No, that's right.
So this is like, I think one of the most important things, one of the most important points in my Atlantic essay,
which I think nobody has picked up on or commented on, is the asymmetry between the stupidity on the right and left.
So if you're on the left, you see the
stupidity of the Republican Party, and it is undeniable, and it is criminal, and it is beyond
anything we've had in American history. I mean, they tried to steal an election. They stole
a Supreme Court seat with McConnell. The Republican Party is horrible. I hate them. I want them to
fail over and over again, and I've been trying to help the Democrats since 2004 to do it. There is no
contest here. The Republican Party is the stupid party. They shoot their moderates. They don't
have any more moderates other than Liz Cheney and a couple, you know, but they used to have more
than three, and now they're down to whatever, just a couple. The Republican Party is a stupid
party, no doubt about that. Now, if you talk to conservatives, however, what you see is a big
asymmetry that liberals don't get. It's not the Democratic Party that's stupid. The Democratic Party
has moderates and the far left, and guess who wins? Usually the moderates. So the Democratic
Party is a functional party. They have debates. The moderates usually win. There's a big
asymmetry. The stupidity on the left is the cultural left. The stupidity on the left is the progressive
activist. I'm not calling them stupid as people. I'm saying because they use intimidation rather
than persuasion, the far left now in America does not persuade people. It wins by intimidation.
And you can do that. You can take over a school board. You can take over a company. You can cause
your company or your school or your school board to say all kinds of crazy things until
election day. Election day is the one day every other year when it actually matters what most people
think. And this is why the Democrats can't win. This is why I'm almost giving up on them. I mean,
I've been really hoping that the Democrats can beat the Republicans three elections in a row,
and that would cause the reform. There are smart reformers in the Republican Party who want to make it a
middle-class party. I want them to win. But I finally realized the Democrats can't win three
elections in a row because the cultural left, now empowered by social media, is so good at Pyrrhic
victories. A Pyrrhic victory is one where you win the battle, but the cost is so high that
your side ultimately loses. And so what the Democrats have, what the progressive activists have done
on police reform, on education, on COVID, on so many things, on COVID restrictions,
what they've done is they keep winning victory after victory that is so offensive to people
in every ethnic group. Let me make this clear. The Hidden Tribes study and others find that
majorities of everybody hate wokeness, black, white, male, female, even majority of Democrats
don't like wokeness. But because of social media, a small number of progressive activists
are able to prevail. So back to your question. Yeah, Joe Biden won because the Democratic Party
is not insane. But the Democrats are likely to get wiped out because most people really hate
what the cultural left is doing in their schools, their companies, and their country.
Right. And it goes without saying, Biden is old as hell and it doesn't seem like there's a strong bench
behind him. That's right. That's extremely concerning to me. So yeah. Yeah. The picture of politics
in the U.S. is pretty ugly right now. It's a mess. So I do want to, you know, kind of ask you this,
like this one thing we haven't covered yet and then we're going to go to break. But how do you
balance the need between like needing to have accountability for folks and then the fact that
sometimes, you know, people you might not like are the ones that are bringing this accountability
to people.
Like social media can be good for a number of things, can make good connections, and it can
definitely, when there are evils, like, it can help, you know, call out those evils and
make sure that, you know, they don't have quarter because before this, you know, I'm not saying
this whole call-out culture movement has been good, but I'm trying to at least make the argument
and have you respond to it.
Before this, there were some really nasty people running a lot of very bad,
powerful organizations, and it's a lot more difficult for them to do that, and I think that's a good
thing. Yes. Let's talk about accountability. So I'm a professor in a business school. In fact,
my formal title, as you said, is professor of ethical leadership. One of the things that we always
cover in my classes here at Stern is whistleblowing. It's very important to have a procedure so that
if there are violations of rules, if there's exploitation, if there's crime, you have to have
a way to report it, you have to have accountability. If you don't have accountability in an
organization, you'll have corruption. So I totally agree. Now, let's think about how you should
have accountability. Is Twitter helpful in accountability? Is a world with Twitter, is that a world
that has more accountability? Well, in a very crude sense, yes, because anyone, rather than just
putting an anonymous tip in the anonymous tip line, anyone can call out anyone publicly from behind
a fake name. Is that really accountability? Well, that is accountability that is as unaccountable
as you can imagine, where anyone with a grudge against anyone can call out anyone, accuse them
of anything, no context, no proof, no accountability for the caller-outer. So I think what we're
seeing is that this is a nightmare. This does not bring us a world of justice. This brings us a world
of fear. I also often hear what you said, like, oh, you know, without Twitter and Facebook,
How could we have had the Me Too movement?
How could we hold anyone to account?
Well, you know, blogs, YouTube videos, email chains.
Like, it was really, like, if Twitter and Facebook had never been invented, it's really easy.
If you want to call out wrongdoing, it would be really easy without those.
And at least there's more accountability if you have to write a blog post or even if it's anonymous.
You know, the fact you have to put any effort to write a blog post.
Whereas, you know, 240 characters:
no accountability, accuse whoever of whatever you want.
So I think that the principle, the need for accountability is a good one, but if we think
consistently about it, we want accountable accountability, not witch hunts, where whoever
makes the most accusations gets the most prestige and the most safety from being accused
themselves.
Right.
And a lot of the big stories of the Me Too era didn't really originate on Twitter.
They resulted from reporters actually getting sources to come on record.
Exactly.
That's right.
As long as journalists are doing their job.
And people have access to journalists.
And now, of course, it's easy for journalists to get the word out.
So, yeah, we've had accountability.
And tech has helped with that.
Jonathan Haidt is with us.
Haidt is with us.
Got it right this time.
Professor of Ethical Leadership at New York University Stern School of Business,
author of The Coddling of the American Mind.
And you could check out his story why the past 10 years of American life
have been uniquely stupid in the Atlantic.
It's also linked in the show notes.
We'll be back right after this.
Hey, everyone.
Let me tell you about the Hustle Daily Show,
a podcast filled with business, tech news, and original stories to keep you in the loop on what's trending.
More than 2 million professionals read The Hustle's daily email for its irreverent and informative takes on business and tech news.
Now, they have a daily podcast called The Hustle Daily Show, where their team of writers break down the biggest business headlines in 15 minutes or less and explain why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
And we're back here for the second half of Big Technology podcast with Professor Jonathan Haidt.
Welcome back, Professor.
You spoke in the first half about Russia's influence here.
I was, as a reporter, like, pretty close on a lot of these stories after the 2016 election.
I'm beginning to think that it might make sense to minimize the impact that Russia and, you know, potentially China can have.
There's a section in your story, why it's about to get much worse.
But, you know, we haven't really seen, you know, that much evidence after the platforms
were caught flat-footed with the Internet Research Agency in the 2016 election, which was definitely
bad and impactful that they've been successful and actually working to manipulate the American
public. And I think the arguments that you make about outrage and, and the loss of the center
are so strong that I do wonder if we need to think about, like, AI-powered Chinese propaganda
and, you know, Russian disinformation, when we're just not seeing it as a, you know,
concrete problem that we're dealing with today. So can you explain a little bit
about why you bring them into the argument? Yeah. So, so this question, this is question five in
the review. Does social media enable foreign governments to increase political dysfunction
in the United States and other democracies? And if you go there, you'll find that there are
a number of studies giving evidence that they have been active and they have been working on
it. And you'll find some studies in section 5.2 is where we collect the ones that are contrary
that seem to say that, well, actually the reach may not have been that big and most of the
people who encountered this stuff already believed it. So if you operationalize it as the effect
on the average user, what percentage of, let's say, Facebook users are actually influenced by
Russians? And it's, you know, I don't know what the number is, but it's not 10%. It's...
So, you know, it's a very small number.
So you say, oh, it's a small number.
I guess we don't have to worry.
That's one way to operationalize it.
But look at it this way.
The story that AIDS was created by a government lab in, I think, Fort Detrick, Maryland,
this was one of the Russians' many successes in the 20th century.
They put it into an Indian, a small Indian newspaper, which I think they funded or fronted.
It was a little English language thing in India.
That's where the story started.
And then other newspapers reported it.
That's the source. And now other newspapers report these other newspapers, which are more credible.
And before you know it, around the world, there's this idea that AIDS was an American conspiracy.
But how many people read that Indian newspaper? Probably seven or 70, whatever.
So you might say, you know, look, the Russians put, like, only 70 people saw it.
So come on, it doesn't mean anything.
So given the incredible virality of modern social media, the fact that only a small number of extremists are directly exposed to the,
the Russian fake information is not the end of the story.
When you trace out how stories flow and what happens, then they can have a big impact.
And the fact that we now live in a world in which any random kid, some 18-year-old in
upstate New York or in Texas, can be motivated by things they read on an extremist site to go shoot
up a black supermarket.
Yeah, very few people go to those sites, but
those that do can have a huge impact.
So again, I think it's important to know that the direct Russian posts don't literally
reach lots and lots of Americans, but it doesn't mean they're not having an impact.
The big finding, the failure to find Russian collusion just means Trump wasn't working with the
Russians.
I do believe that.
But they have been trying since the 50s and even the 20s to do exactly what social media is doing.
They have been trying to make us believe that America is a deeply
racist, anti-Semitic country. And, you know, I think, and that's just the Russians. The Chinese
have much more focused interests. Of course, they want us to think well of China. They want to
suppress anyone who criticizes China. And they're probably in the long run a lot more competent
and capable and persistent than the Russians. So I think it would be naive to conclude that there's
not a major national security risk.
Yeah, the risk is there.
The question is, have the platforms done a good enough job at handling this?
Because after the blow up with the Internet Research Agency, like the ability to message,
the ability to buy ads in the political campaign is far more restricted than it has been.
Although I did read this story.
Yeah, there was a story recently that Mark Zuckerberg doesn't seem to care about this as much
as he does about virtual reality.
Yeah, I read An Ugly Truth, a biography of Facebook.
I forget the name of the authors.
Sheera Frenkel and... Yes, Cecilia Kang.
Yes. Thank you. So I read that. And that covers that in great detail. And so if you imagine, if you imagine a major arm of Russian intelligence that is devoted to this full time with a lot of funding. And then you have this one office in Facebook headed by a guy who seems, you know, very, I forget his name, but, you know, whatever the head of security was, he seems to be working very hard. And he's got a bunch of employees.
He's famous. He's gone now, though.
That's it.
Yeah.
Okay.
He works at Stanford.
So it's kind of comical and sad if you see this dedicated branch of the Russian military.
And up against them, the defense is this guy and, you know, a few dozen, what, maybe
there's a hundred, I don't know how many people he has.
But they're working on lots of problems.
And this is not their top priority.
So, no.
I, you know, yes, of course they took some easy steps.
But there is no way they can get a handle on this.
This has to be, and I think they've called for this, this has to be a sort of government-wide
partnership, coordinated between the CIA, the FBI, NSA, Facebook, Google.
They all have to be working together.
And as far as I know, they're not, or at least they're not working in a coordinated way.
Right.
But the question underlying a lot of this is, whose responsibility is it?
Is it the platform's responsibility or is it our responsibility?
And I think that-
Like, as citizens?
Is it us citizens' responsibility to not get hacked by the Chinese and Russians?
No, I'm talking about like the broader discussion that we're having.
like, you know, we alone, well, we can practice good OpSec, you know, but we're kind of
powerless when it comes to going up, you and I, against China or Russia. But I'm talking about
this broad discussion of the yields of social media, what it's turning our society into.
Oftentimes, this discussion leaves out the fact that, you know, there's a supply and a demand
problem that, you know, there are people who are responding to these incentives and pushing it as
well. I'm curious from your perspective when you think about the problems with social media,
I mean, how much of this is on us to look inward and engage in healthier behaviors? I mean,
it's kind of interesting even looking at it in contrast with The Coddling of the American Mind,
where it's like, this is something that we are doing versus this is something that social media
platforms are doing to us. I know social media was part of your book, but I'm curious how you think
about the balance. Yeah. So in the social sciences, we talk a lot about, like, the prisoner's
dilemma, commons dilemmas, there are all kinds of situations that you can engineer in which each
person pursuing their self-interest ends up with the group much worse off than otherwise.
And so, you know, in the prisoner's dilemma, you know, if two prisoners committed a crime
together, and if they're caught, and if they both agree to keep quiet, then the DA can't
get them on major charges, but the DA says to each one, if you turn evidence against the other
one, I'll go easy on you. But if they both turn evidence, then you're both screwed. And so it's
very hard to get out of a prisoner's dilemma. Now, a commons dilemma is a prisoner's dilemma
generalized to n people, and n can be a million, it can be seven, whatever. So the clearest
case is Instagram for 10-year-olds. So no parent wants their 10 or 11-year-old on Instagram.
But yet, sixth graders, in my experience, both my kids, when they started sixth grade in New York
City public schools, they said, Dad, everyone's on Instagram. Can I have an account? Now, how did
that happened? Well, it's a
commons dilemma. Each
kid says to his mother,
mom, everyone's
on Instagram. I need to be on Instagram.
Now, none of us want the kids on
Instagram, but we don't want our kid cut off.
So this is a trap that they've set for us
and they know about and
they won't do anything about it.
Because if they do, if they do
keep kids off below 13,
then the kids will just go to TikTok.
So, you know, I understand why
Instagram doesn't do much about
underage use. But it's a trap. And so, no, I don't put this on the parents or on the kids.
The platforms have set a trap. Most of the kids are falling into it with disastrous results
for their mental health. This is on the platforms. And the platforms, because of the competition
between platforms, can't solve it. So the way, the classic way that economists tell us to solve
these dilemmas is with government regulation. So I'm a big fan. Senator Bennet of Colorado has a bill
to actually have a new regulator of social media.
When radio and television came in in the 20th century,
we had no way of dealing with them.
And so we created new agencies, the FCC, for example.
We had to have a new government regulator
that had expertise and had some jurisdiction
to do something about radio and television.
And I think the same thing is happening now.
There is no government agency
that has jurisdiction over social media.
It doesn't go out over the airwaves,
so it's not FCC.
There's antitrust issues.
That can be regulated, but that's a small part of the problem.
So Bennet's bill would actually create a regulator that could understand these traps and recommend ways out.
So until we have, as long as people are in a trap like that, I don't put any blame on the people.
I'm very reluctant to tell adults what to do if they're not being tricked.
But I'm very reluctant to have the companies telling my children, or at least luring my children into traps without my knowledge or ability to stop it.
So I do think we need government regulation.
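[Editor's aside: a minimal toy model of the commons-dilemma dynamic described above, with made-up payoff numbers. It is only meant to show why each family's best response can be to join the platform even though everyone joining leaves the whole group worse off, which is why the discussion turns to regulation.]

```python
# Toy commons dilemma (illustrative numbers only): each family's best response
# is to join once enough peers are on, yet the all-join outcome is worse for
# everyone than the no-one-joins outcome.

def child_payoff(joins: bool, share_of_peers_on_platform: float) -> float:
    # Assumed payoffs: staying off costs social exclusion in proportion to how
    # many peers are on; joining avoids exclusion but carries a fixed cost.
    exclusion_cost = 3.0 * share_of_peers_on_platform
    platform_cost = 2.0
    return -platform_cost if joins else -exclusion_cost

for peers_on in (0.0, 0.5, 1.0):
    join = child_payoff(True, peers_on)
    stay_off = child_payoff(False, peers_on)
    best = "join" if join > stay_off else "stay off"
    print(f"peers on platform: {peers_on:.0%}  join={join:+.1f}  stay_off={stay_off:+.1f}  -> {best}")

# Once roughly two-thirds of peers are on, joining is each child's best
# response, so everyone ends up at payoff -2.0, worse than the 0.0 everyone
# would get if no one were on the platform. No single family can fix that.
```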
Right.
And that's children's mental health, but we're talking about the degradation of democracy.
Like, here's another study that I think Professor Brendan Nyhan had, where, and
this is from "How Harmful Is Social Media?", a story in The New Yorker...
It's a story in a New Yorker.
He said he found that almost all extremist content is either consumed by subscribers to
the relevant channels or encountered via links from external sites.
Now, of course, we understand that the folks who are extremists, like, you know, they are
the ones that are, like, committing lots of the crimes, like you mentioned
earlier in the show. However, like, you know, again, like, is that a social media problem
or is that an extremist problem? So there are always extremists. There are always conspiracy
theorists. There are always people who are so motivated by their political values that they
want to kill other people. And social media makes it easier for them to find each other.
So, you know, I divide things up into the growing pains of any new technology and the harms
that are not necessary, or the harms that are things we really need to focus on.
So put it this way.
The Internet, it's so important to always distinguish between the Internet and social media.
So the Internet is the greatest boon to mankind since electricity or fire, I'm not sure which.
And, of course, the Internet makes possible all sorts of bad things as well.
Those I consider to be growing pains.
When automobiles came out, a lot of people got killed, but we gradually got better regulation of automobiles, and now the death rate is, you know, way lower than it was even 30 years ago.
And the same thing is going to happen with the Internet.
So if it's, you know, the issue of extremists finding each other, that is intrinsic to the Internet.
And especially as things get encrypted, there's no way to stop that.
But at the same time, there are tools, there are things that really, really help extremists
for no good reason. And Facebook and other platforms, their unwillingness to verify people or vet them,
their willingness to allow anyone to create hundreds of fake accounts and use them however they want.
That's really unnecessary. Now, I say however they want. Obviously, if you make death threats,
you'll get shut down, but you just open another account. So I think that the original idea,
let's just let everybody connect with everybody, it'll be great, it'll be like John Lennon,
imagine there's no distinctions, no walls, no nations, but that's working out really,
really badly. So badly, in fact, that I think that American democracy is likely to fail catastrophically
at some point in the next decade or two. It's very hard to predict the future. But I think
the way we're headed is a very, very bad direction. And so I think we have to revisit these
original libertarian and progressive assumptions about if we just let everybody, you know,
everybody online to do whatever they want, unverified, fake name, it's going to be great.
Okay. And, you know, this is a show where we like to talk about solutions, not just the
problem. So I'm glad you gave us this beautiful segue into the solutions part of it. One of the
things that you bring up in your piece is authenticating people so you can't just be anonymous.
Now, I brought this up recently and I kind of got hammered over it. I think it's a good idea.
And I guess that's sort of, this is going to prove your point. But the counterargument to
this is that if you do have to authenticate people, then you're going to end up with a lot less
activists. You'll end up with authoritarian governments demanding that data. I actually wrote
another story that same year as the retweet story about how two gentlemen with ties to Saudi Arabia
worked for Twitter and ended up passing along information to an organization close to Crown Prince
Muhammad bin Salman in Saudi Arabia. So you do run that risk if you end up tying
every account to a phone number or a license, you know, whether, you know, whether that's
public or not. So what's your thought about that? Sure. So first, thank you. You're one of the
first interviewers who has understood that authentication does not mean using your real name in
public. So if you go to, if listeners go to the Google Doc and go to section 11.2 user
authentication, we lay out how there are five different levels of authentication.
Level zero is nothing, that's what we have now.
Anyone can do anything as much as they want.
Level one is what Elon Musk tweeted, authenticate humans.
You just have to prove you're a human.
You have to pass a hard captcha before you can start an account.
So that's at least something that would cut down on the bots.
Level two is authenticate unique identity once and untraceably.
And this would solve the problem you're talking about.
You just have to prove that you're a person once,
and then whatever this third party authentication thing is,
passes back to Facebook or Twitter, whatever.
Like, yes, this is a real person.
And maybe they're, you know, what I'd like to see is this is a real person,
and they're over 18, and they're in a particular country.
But there's no record kept of it, just that they passed this check.
So now there's nothing, you know, even if that thing got hacked,
all it would be is, yeah, there's a person.
But, you know, I mean, it's easy to tell that Jonathan Haidt is a person,
and in America, like, there's no big secret there.
So level two, authentication would be so much better than what we have now.
And we have examples here, humanID.org and Worldcoin.
So there are multiple, look, the tech industry is brilliant.
They come up with a lot of different ways to solve these problems.
And so humanID.org and Worldcoin, those are two that would work for level two authentication.
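[Editor's aside: a rough sketch of what "level two" authentication could look like in code. This is not humanID.org's or Worldcoin's actual protocol, and every class, field, and value below is invented; it only illustrates the idea of a third party attesting "unique human, over 18, in this country" once, while retaining no identity records itself.]

```python
# Sketch of "level two" authentication: a verifier attests that a signup is a
# unique, over-18 human, then keeps only an unlinkable salted hash so the same
# person cannot get two attestations. Illustrative only.
import hashlib
import os


class Verifier:
    """Third party that checks identity once and keeps no identity records."""

    def __init__(self) -> None:
        self._salt = os.urandom(16)
        self._seen_hashes: set[bytes] = set()  # hashes, not names or documents

    def attest(self, government_id: str, age: int, country: str) -> dict | None:
        digest = hashlib.sha256(self._salt + government_id.encode()).digest()
        if age < 18 or digest in self._seen_hashes:
            return None  # underage, or this person already has an attestation
        self._seen_hashes.add(digest)
        # The platform only ever sees this attestation: no name, no ID number.
        return {"unique_human": True, "over_18": True, "country": country,
                "token": os.urandom(16).hex()}


class Platform:
    """Social platform that accepts attestations, never raw identity."""

    def __init__(self, verifier: Verifier) -> None:
        self.verifier = verifier
        self.accounts: dict[str, dict] = {}

    def sign_up(self, handle: str, government_id: str, age: int, country: str) -> bool:
        attestation = self.verifier.attest(government_id, age, country)
        if attestation is None:
            return False
        self.accounts[handle] = attestation  # pseudonymous handles still allowed
        return True


if __name__ == "__main__":
    platform = Platform(Verifier())
    print(platform.sign_up("@anon_activist", "ID-12345", age=34, country="US"))  # True
    print(platform.sign_up("@sockpuppet_2", "ID-12345", age=34, country="US"))   # False: same human
    print(platform.sign_up("@sixth_grader", "ID-99999", age=11, country="US"))   # False: under 18
```

The point of the sketch is that the pseudonymous handle survives; only bot farms, duplicate accounts, and children are screened out.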
Now, level three authentication is what you're concerned about.
This is where you have authentication by a third party who
keeps the information.
And so that, I understand, could cause problems.
And so I'm not, I'm suggesting that we look into this.
I'm not saying that the federal government should mandate level three for any platform
that gets Section 230 protection.
But I would like us to consider that the federal government mandate that if you want
the benefit of, you want the benefit of Section 230 protection and this incredible
freedom from lawsuits that only you and the gun lobby have.
If you want that, you have some minimal responsibilities to keep off bots, Russian agents, and children.
And so I think level two authentication, it wouldn't solve the problem, but it would make it a lot better.
What do you think about that?
I like that.
I think that's actually a much more nuanced solution.
You're going to get creamed for saying you liked it, but thank you.
No, I think it's, look, I, it's crazy sometimes to be talking about social media.
I'm sure you know.
If you're in the process of inquiry, just trying to figure things out, you can have the
mob come after you, as you know, I have and I'm sure you have as well. But hey, if we don't
have these conversations, then, you know, we're not going to end up in a good place. So I'm glad
we're doing it. I think that's a good point. Yeah. All right, let's talk about one last thing
before we go, which is virality. You know, it's I, you may or may not have mentioned this in your
story, but you obviously cited the retweet story for, I mean, folks can go and read. I'll drop it
in the show notes. But, like, my thought has always been, at a certain point, you got to make people copy
and paste the links as opposed to just hitting retweet. That will create much more thoughtfulness.
It's surprising to me that this hasn't caught on at all.
After having read the story and written about this dart throwing that you mentioned in your story,
what's your view on virality?
How do you think it needs to be changed, if at all, to bring down this stupidity a little bit?
Yeah. No, I mean, virality, yeah.
So virality is what brought down the Tower of Babel, is my argument.
That, you know, social media wasn't the whole
problem. A lot of the polarization problem predates social, predates social media. But the ability
of anything to go viral at any moment means that people are walking on eggshells. Even seventh graders
are walking on eggshells because any little thing they say could blow up within their school or
beyond their school. So virality is the problem. And so my one thought about that, because yeah,
I agree with you. I mean, it would be great if we put friction in, make people copy and paste. But in a
competitive marketplace, if customers don't like that, they'll go to a market that, you know,
that doesn't require those hassles.
But I like to think about it this way.
At present, on some platforms, particularly Twitter,
the more of a jerk you are, the more successful you are.
I mean, obviously in some subsets, that's not true, and in some subsets it is.
But the hypervirality of Twitter shapes them.
You can see people losing their minds.
And I can see professors who used to be reasonable people.
They get trained in almost a behaviorist reinforcement mechanism,
and they turn into jerks.
Well, that's because Twitter incentivizes being a jerk
in order to go viral.
What I'd like to see is an alternate platform compete with Twitter
in which the incentive is to not be a jerk.
The incentive is to be productive.
And at that point, you know, then maybe like all the, you know,
there still would be, you know, I'm not going to,
we can't put Twitter out of business,
but the only people left on Twitter would be like watching, you know,
wrestling, what is it, professional wrestling?
People who want to watch professional wrestling can go to Twitter.
But people who want to keep up with the news could then move to this alternate,
you know, the anti-Twitter site.
And one way to do that, I'll just float this idea on the big tech podcast,
and I hope people who know better will tell me.
One idea which I talked over with Reid Hoffman,
and he seemed to think, had some promise,
is what if you had multiple ways of coding users,
not individual tweets, let's say,
but you code individual users for how much they are trolls,
or troll-like. So people who just attack a lot, there's a lot of obscenity, there's no depth,
no complexity, no nuance, they're trolls. So suppose you have AI rate this, you have crowdsourcing
rate this, and you have expert ratings. You have three different ways to do this, and you can,
you know, play them off each other to figure out. Anyway, the point is you can get a rating of
trollishness. And suppose everybody gets a trollishness rating from one to five. And when you
sign up for Twitter, you have a switch, you have a switch set at four, which means,
you will see everybody who's four and below.
But if someone's a five, you don't see them and they don't see you.
This is very important.
They don't get to see my tweets because they disappear.
Why should such people be in the public square?
If Twitter is essentially the public square,
we shouldn't have people who are just stabbing people and shooting people and urinating on people.
They don't belong in the public square.
They're nuisances.
So some people might lower it from a four to a three or a two.
And if we did that, then the incentive would be to be,
to not be a total asshole.
That's what I'd like to see.
That's the way that I think we could create much better environments
that social media could be a much more constructive part of our democratic deliberation.
Because right now, I think it's really bad for democratic deliberation.
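[Editor's aside: a sketch of how the "trollishness switch" floated above might look in code. The scores, the equal weighting of AI, crowd, and expert ratings, and the mutual-visibility rule are all assumptions made for illustration, not anything Haidt or any platform has specified.]

```python
# Sketch of the trollishness-switch idea: each user gets a 1-5 trollishness
# score blended from three hypothetical signals, and sets a threshold for the
# highest score they are willing to see. Visibility is mutual: if either side
# rates above the other's switch setting, neither sees the other's posts.
from dataclasses import dataclass


@dataclass
class User:
    handle: str
    ai_score: float      # hypothetical AI rating of trollishness, 1-5
    crowd_score: float   # hypothetical crowdsourced rating, 1-5
    expert_score: float  # hypothetical expert-panel rating, 1-5
    threshold: int = 4   # the "switch": highest trollishness this user will see

    @property
    def trollishness(self) -> float:
        # Simple unweighted blend of the three signals; a real system would
        # calibrate them and play them off each other, as suggested above.
        return (self.ai_score + self.crowd_score + self.expert_score) / 3


def mutually_visible(a: User, b: User) -> bool:
    """One reading of the rule: both sides must fall at or below the other's switch."""
    return a.trollishness <= b.threshold and b.trollishness <= a.threshold


columnist = User("columnist", 1.5, 2.0, 1.0)
troll = User("troll", 5.0, 4.8, 5.0)
edgy_but_ok = User("edgy", 3.9, 3.5, 3.0)

print(mutually_visible(columnist, troll))        # False: the troll disappears for this user
print(mutually_visible(columnist, edgy_but_ok))  # True: sharp-elbowed but still visible
```

Under this sketch, the incentive flips: pushing your blended score above most people's switch setting simply removes you from most people's public square.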
Yeah, I agree with that.
I think it's kind of tough to figure out exactly what trolling behavior is. So much of this,
one of the biggest problems with social media, is that so much of it is in the gray.
Like, I even wrote a tweet this weekend about how I liked yellow cabs and how people might want to give them a shot outside of Uber.
And, you know, it's kind of like, oh, it was a joke, it was part critique of the, you know, messianic visions of these tech companies, partly a troll.
So anyway, but I think it's a good, it's a good inclination.
There's got to be a way to improve what we're seeing right now because the things that we've talked about, the fact that the extremes are controlling these platforms, that is just a whole
home for outrage and trolling. And people can probably, you know, feel this for sure. But when
you go on Twitter, you almost always end up leaving, feeling worse. There's got to be solutions to
this stuff. That's right. Yeah. So I'd like to end. I know we're just about out of time. So let me
just end. You know, I'm so pessimistic, but here's my optimistic vision. If we go back to the late
90s, early 2000s, you know, we all thought that social media, the internet first and then social
media, was going to be an incredible boon to democracy. And of course, it can be. And if we think
about all the range of possibilities that there are, if you think about almost like a complex
dynamical system with a complex topography, it's all these different states, there is a
configuration somewhere in the future in which technology gives us the best democracy that
has ever been. That is possible, I believe. I don't know if we'll get there in 10 years or in 50 years.
I don't think we'll be there in five years. But there is a future in which technology gives us
the best democracy, the best discussions, the best society that is
imaginable. Now, the obstacles from here to there, we don't know how high the hill is or the
mountain. We don't know if we can get there. But that's my hope that the tech community, which is
so creative and so brilliant, that's my hope that more of them will put their minds to that
long-term vision of how can tech give us the best democracy that has ever been, rather than
what might happen is pulling the rug out from under us and having American democracy collapse.
Yeah, I hope so too. And I can say that, you know, speaking with the listeners to this show, the people who read Big Technology, the folks at the tech companies that I speak with, there's a lot of positive intent. And one of the things I try to remind people who listen or read is that we're really in the early innings here. I mean, Facebook's 20 years old. The internet's going to be with us for a long time. But like you mentioned, I hope we go in the good direction versus the bad. And I'm optimistic too. I think we'll get there. It's just, this is, you know, what happened after
the printing press, right?
30 years war.
Yeah.
So are we in that?
A hundred years war, I think.
I can't remember which was which, but it's, yeah, it's going to be more, maybe more
than 30 years.
Yeah.
Are we in there or, well, the nice thing about technology is the cycles move faster.
So hopefully that's the optimistic note we leave on is that the cycle moves faster.
Professor, thanks so much for joining us.
Do you want to shout out that URL people can go to one more time just so
they're able to peruse and read all this stuff?
Yes.
All of my writings, my Google Docs, all of that is available
at jonathanhaidt, that's H-A-I-D-T, dot com slash social media.
Okay, great.
Well, Professor Jonathan Haidt,
thank you so much for joining us here on Big Technology Podcast.
Thank you, Alex.
Thanks, everybody for listening.
Thank you, Nick Guatney, for turning the audio around, doing the editing,
making this sound great.
Appreciate you.
Thank you, LinkedIn, for having me as part of your podcast network.
And thanks to all of you, the listeners.
Appreciate you coming back
week after week.
Stay tuned.
Next week, we'll be back with another show
with another tech insider or outside agitator.
Thanks again, and we'll see you next time on Big Technology Podcast.