The Paul Wells Show - Encore: Author Max Fisher on the social media chaos machine
Episode Date: July 9, 2025
New York Times writer Max Fisher talks about his book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Relying on international reporting, leaked corporate documents and social science, he lays out the case that the problem with social media is not just about amplifying the wrong messages; it's that social networks are designed to bring out the worst in everyone. This episode originally aired on November 2nd, 2022.
Transcript
Hi, this is Paul Wells and here's an encore episode of the Paul Wells show from the early days.
What if Elon Musk isn't the problem?
What if there's something deep down in each of us that just wants to watch the world burn?
Maybe social networks are wired to bring that side of us out.
That would be scary, wouldn't it?
I certainly didn't think of social media as something that could have this effect to drive real world violence.
And this was one of the first demonstrations to me at least that it could, that it was doing it at scale,
and that it was doing it with this machine-like consistency.
I'm Paul Wells. Welcome to the Paul Wells show.
This week, how social networks put society in danger.
What you're about to hear may come as a relief,
a discussion about social media that doesn't
mention Elon Musk.
This is easy to explain.
I interviewed this week's guest, author and New York Times columnist Max Fisher, in the
Before Times, last week, before Elon Musk closed his deal to buy Twitter, and before
just about everybody on Twitter who doesn't agree with Elon Musk started to freak out.
It seems so long ago. Those were innocent times, weren't they? But I'm glad this conversation isn't
about Musk, because Max Fisher's book, The Chaos Machine, The Inside Story of How Social Media
Rewired Our Minds and Our World, isn't about Musk either. Fisher's claim isn't that social
media networks might fall into the wrong hands, and it's not only that networks might amplify the wrong voices.
Fisher's argument is broader and bolder, that even when they're used as designed, social
media networks can bring out the worst in each of us.
I wasn't paying too much attention when I started reading this book.
Seriously, at this stage of the game, who needs a book to tell us social networks have
a dark side?
But I found the way he builds his argument tremendously compelling.
He's built a career at the Atlantic Monthly, the Washington Post, Vox, and now at the Times,
explaining big global phenomena through the lens of social science. So he brings psychology to bear,
as well as on-the-ground reporting in places like Germany and Myanmar.
What he found is worth your attention.
Internal memos at Facebook from analysts who say the platform exploits the human brain's attraction to divisiveness. The way likes and shares give your brain little chemical jolts that turn your
phone into the world's most ubiquitous slot machine. The strange theories that pop up around
the world about mysterious outsiders who want to do horrible things to local children, theories that will be familiar in North America as the basis of QAnon,
and the case of Dirk Denkhaus, a soft-spoken German firefighter who started out making
jokes online about anti-immigrant sentiment and then started to believe it and ended up trying
to start a fire in a residence for refugees. You'll notice that I spend a lot of my time here just teeing Max up
so he can tell his own story in his own words.
I don't spend a lot of time cross-examining him.
It's a complex argument in a lot of ways,
and I wanted to make sure he tells it clearly.
But I think you'll also find he builds a compelling and disturbing case.
After the break, I'll bring you my conversation with Max Fisher.
Hey, Max Fisher, thanks for taking the time to join me and talk about this book. Thank you so much for having me.
I have to say that this book was a bit of a hard sell for me because when someone says
social media is bad for you, it's almost, at this point, like saying the Pacific Ocean is moist.
Uh, it feels like something that I should
probably already know.
And I wonder whether part of the challenge
in writing it was the challenge of getting
people to take the premise, uh, as news and as
something they had to learn about.
It's a great point.
And it's actually the thing that initially kept
me from writing about social media at all. It's not usually my beat. I'm not a technology reporter. I'm an international
correspondent for the Times. So I cover, you know, wars, social change, political change.
But I came around to thinking around 2017, 2018, a time by which there was a kind of ingrained
conventional wisdom that social media is not great, that it makes us more polarized, that it makes us more extreme, but also how impactful can
it really be. Because when I was traveling around the world, I'd start to see one instance after another
of social media having a much deeper effect than I thought possible at that point. And it's seeming
to change not just political views, but the way that people behaved and thought on this massive scale.
And I thought, this is something that actually seems pretty
important, and I should probably try to investigate and
understand. And that became this years long project to try to
measure something that had not been, I don't think, measured
up to that point, which is, what is the actual aggregate effect
of social media platforms
on users, on our minds and on the world?
And I came away from that thinking,
this is really a lot bigger than I thought it was
and that its effects are much more substantial.
And also thinking that we now finally can empirically measure
and can empirically, scientifically show
what those effects are and how it works.
And it felt very urgent and important to me
to get that message out to people,
even understanding that at this point, we kind of think that we know the story.
I thought I knew the story and I, I'm telling you that it's, it's a,
it's a lot bigger than I certainly thought it was.
You're telling the story from a kind of a, a fantastic platform,
which is that your gig at the New York Times is that you're the interpreter.
You are someone who sets big stories into context.
Before we get into the argument of the book, how do you interpret that role and how do
you see that obligation?
And I have to assume that to some extent you had to invent that beat.
The idea behind the beat and the column, which I've been doing for about six, seven years now, is that the
kind of big thorny analytical questions that we are confronted with when we read the news generally,
and especially international news questions like, why do wars happen? Why is the far right on the
rise? Why is democracy declining? Why is North Korea the way that it is? Why is Russia the way that it is? That these are questions that can be answered and explored
in a more fulsome, nuanced, and often empirical and scientific way than was the case 10 or
20 years ago. There's been huge advances in social science and political science, and I've been really lucky to
have the platform to use those as tools to try to understand and explore and demystify some of these
really big questions for people, which, as you say, is not something that initially brought me to
social media, but that started to feel like a good application of it.
In the book, you say that one of the points when you decided you're going to spend a lot of time
looking at the effect of social media was when you're talking to a woman in Mexico,
a researcher in Mexico named Gemma Santa Maria, who was investigating a bunch of strange occurrences.
Can you kind of tell me the story of those occurrences and the lessons that she drew from
that? I started hearing in 2016, 2017, 2018
from people like Gemma Santa Maria, based all around the world, these eerily similar stories
of a particular rumor or a couple of rumors that would spread like wildfire on social media and
end up provoking or being linked to these very similar incidents
of violence. So the incidents in Mexico that Gemma, this Mexican researcher, had noticed popping
up in a few different totally disconnected places. It was always the almost identical
rumor, which was some version of there's some sort of outsiders, ethnic minorities, just people from outside the community
who are coming through our town to,
and this is gonna sound crazy,
to kidnap children and to harvest their organs
and harvest their blood.
And it sounds nuts on first blush.
And in fact, of course it is,
but what was happening she found,
and then we found working with another reporter
tracking these incidents around a few different countries, was that what
would happen is that someone would post some version of this rumor, usually on
Facebook, sometimes on WhatsApp or on Twitter. And there's billions of posts
every day on these platforms, the systems that govern what you see on them can tell
any story they want by picking out certain kinds of content. But these systems, these algorithms would identify these rumors as something that was
particularly engaging to other people. And that's what the systems are designed to do. They're
designed to present you with whatever kinds of content, whatever series, whatever sort of
presentation and order that will get you to spend more time on the platform and get you engaged with the platform yourself. What they discovered is that the systems would take these posts, which would come
from just some random small account somewhere in, for example, a rural Mexican
town with a version of this claim, and push that post in front of lots of
users, and push versions of it over and over again, such that if you are on the
platform, you see this over and over again, and it starts to feel like something that is this siren ringing out from your community as a whole,
even though maybe only one or two people believe it. That is a particular kind of conspiracy that
for whatever reason just hooked into people's brains. They would engage with it, spend a lot
of time with it. The algorithms would learn to push it to more and more people. And that over days
and weeks and months of the system cultivating over and over and over this ludicrous claim and bringing
people into it and encouraging people to interact with it, people would eventually
come to think that the rest of their community believed it and that would
make them want to believe it, because they didn't see that this is the choice of
the Facebook algorithm. They would think, oh my god, my community is so upset about
this, there must be something to it. And then one day, out of nowhere, there would
be, you know, one of the examples in Mexico, like just a father and son coming through town
to pick up some supplies to build a fence, I think was one of them. And this community
that had been whipped up into this paranoia and fear and hysteria by what they were seeing
on social media would go out and grab these people and kill them,
often quite brutally. And something that I was noticing around this time talking to people like
Hema in Mexico and lots of different countries is that this same phenomenon and often this same
rumor was popping up over and over again in one place after another. The system would identify
these small accounts
that had just, for whatever reason,
landed on some version of this and would push it out
and it would have this incredible effect on people.
And that to me was this very powerful demonstration
that first of all, what these platforms
are pushing out to us is dangerous.
And it's not just dangerous in the sense of like,
oh, it's like reading a newspaper article that is particularly salacious, particularly divisive. There's something about
the engagement-based format of this, the fact that it pulls you in, the fact that it presents
information to you as coming from your community, even though it is, in fact, coming from the social
media platform; there's something about this that has an especially powerful pull on people and effect on them. And these systems have learned something really deep and dark about what is in
human nature and how to cultivate it in this really consequential way. I certainly didn't think of
social media as something that could have this effect to drive real world violence. And this was
one of the first demonstrations to me, at least, that it could,
that it was doing it at scale, and that it was doing it with this machine-like consistency.
And what was especially striking about this one particular rumor, and it's possible that you're
already seeing where I'm going with this, because maybe it sounds familiar to you, is that this same
rumor is the same thing that would eventually form the core of the QAnon conspiracy theory.
So you say that what you heard in Mexico or what you heard from this researcher in
Mexico sounds crazy, except it sounds, mutatis mutandis, exactly like QAnon.
People on the other side are coming to hurt our children.
Exactly. They're coming to hurt our children and they are coming to hurt them in this very
specific way where they're going to harvest their organs or harvest their blood. To be clear, this is just the start of what became
a very long investigative journey to understanding how the platforms had cultivated QAnon. This
was not the one piece of evidence for it, but it was a really strong suggestion that
this wasn't something that just emerged organically from wackos who were already on the internet. That there was something
specific about this that the platforms pulled out and had learned to cultivate in large numbers of
people up to the point of radicalizing them into real-world violence. That was the start of this
years-long investigation that I went on, not just to track this one particular rumor, but all of the ways that
social media is cultivating things, and not just in the QAnon believers, not just the anti-vaxxers,
not just the extremists, but in all users who are on the platforms, the ways that it has learned to
pull out and cultivate certain emotions in us and especially certain behaviors because that's really
what the platforms want to do. They want to train behaviors into you that will be useful for these companies
because they lead you to spend more time in the platform
so they can sell more ads against your time there
and make their tens and hundreds of billions of dollars
in advertising revenue.
So there's a lot there to unpack.
Let's start with that.
I mean, these companies are not saying,
let's make people believe that there are baby murderers
coming for their children.
The primal impulse of these companies
is the primal impulse of just about any company,
which is growth.
And it's the idea that what drives growth online
is engagement, the amount of time that people spend
on a given platform.
And then you get into what drives engagement.
That's exactly right.
And it's like you say, and it is important to say this, that there
was no one sitting in Silicon Valley saying, you know what would be a great
idea is to cultivate this baby-murdering conspiracy that will lead to, you know,
violence in Mexico and Indonesia and Guatemala and Malaysia and cultivate
QAnon in the United States.
What they're saying is, let's design automated systems that use this technology called machine learning. And
what machine learning basically is, is a type of program that is constantly fine-tuning
itself. And the way the algorithms work on Facebook and YouTube and Twitter is that they
essentially use
every post that appears on the platform, which is billions of posts if not per
day then per week, testing what wins engagement and what kind of engagement
it wins and how it does it, and what sequence of posts wins engagement and what
doesn't. And it's running these tests constantly to teach itself how to
promote content that will keep you engaged.
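To make that constant-testing loop concrete, here is a minimal sketch in Python of the general technique being described: a ranker that keeps re-scoring posts by observed engagement and promotes whatever wins. It is an illustration under simple assumptions, not any platform's actual code; the class, its methods, and the numbers are hypothetical.

```python
# Minimal sketch of an engagement-maximizing feed ranker (illustrative only;
# not any platform's actual code). Every impression is a small experiment:
# the system records whether the user engaged, keeps a running engagement
# rate per post, and surfaces whatever currently scores highest, so content
# that reliably provokes reactions gets shown to more and more users.

from collections import defaultdict


class EngagementRanker:
    def __init__(self):
        self.impressions = defaultdict(int)  # times each post has been shown
        self.engagements = defaultdict(int)  # likes/shares/comments observed

    def score(self, post_id):
        # Estimated engagement rate, plus a small bonus so barely-tested posts
        # still get tried: the "constant testing" described above.
        shown = self.impressions[post_id]
        if shown == 0:
            return 1.0
        return self.engagements[post_id] / shown + 1.0 / (1 + shown)

    def rank_feed(self, candidate_posts, k=10):
        # Show the k posts currently predicted to keep this user engaged.
        return sorted(candidate_posts, key=self.score, reverse=True)[:k]

    def record(self, post_id, engaged):
        # Feedback step: every view updates the counts, regardless of what
        # the content actually says.
        self.impressions[post_id] += 1
        if engaged:
            self.engagements[post_id] += 1
```

Nothing in this loop inspects what a post says; it optimizes only for the reaction the post gets, which is the dynamic Fisher goes on to describe.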
Now, if you're in Silicon Valley,
an engineer working on these platforms,
you believe that you are, you know,
helping to serve up content that people like to click on.
So therefore it must be good for them,
or at least it's value neutral,
and it will make you some money.
What they did not realize
when they started to develop this
technology is that it would sync up with and very quickly learn to identify and exploit some of the
gravest weaknesses and cognitive blind spots and deepest instincts in human nature that we have
learned through thousands of years of socialization, of social
norms and social institutions, to manage and control and at times suppress. And these systems
have become incredibly adept at pulling those out of us at this industrial scale to keep us engaged.
Now, even if the people of the platforms didn't want to cultivate that, they now have all the
evidence to know that that's what their systems are doing.
And in fact, internal research at some of these companies has even identified this,
and internal research presented to the executives has said, in these blaring, five-siren headlines,
these alarm bells saying, this is what our systems are doing.
It's cultivating hatred, division, us versus them, tribalism, conspiracy theories, basically all the worst
stuff in human nature. And at every point, they were largely overruled by the executives,
whose overriding impulse, as you say, was to make money, and they continue to be
very effective at doing that.
So you quote some researchers at Facebook who in an internal 2018 presentation said,
quote, our algorithms exploit the human brain's attraction to divisiveness.
And you say you went and visited some folks, like some of the senior folks at Facebook,
and they were really interested in talking about malign exploitation of the platform or bugs in the platform, but when you put it to them
that the platform itself is the problem,
they seemed not to have heard you.
It's a wild experience.
A lot of the people who work at these companies,
especially people who are higher up,
a lot of them are really smart.
And I know that's not gonna sound surprising.
They work at a giant company.
A lot of them are engineers, of course they're smart,
but a lot of them are very civic minded.
I was talking to a lot of people who work on, as you say, things like exploitation by hostile
governments or extremists on the platform.
A lot of people who came from the human rights world, who came from working in things like
the State Department or the Pentagon, have real backgrounds in this, are really passionate about it
and would be really smart and thoughtful
until we got to anything related to this premise
that the platforms themselves are not just a passive conduit
for bad behavior or for harm, but are cultivating it,
encouraging it, inculcating it into users very assiduously and very effectively,
which, at the time I was first having these conversations with them in 2018,
was something that was just first being established by outside researchers.
It wasn't until a couple of years later that we knew, thanks to Frances Haugen,
a Facebook employee who leaked a lot of these documents,
that Facebook's own researchers were saying this.
But you would say this to them and it would be like they didn't understand or they thought
that's the most ridiculous thing I've ever heard. How could it possibly be doing that?
A lot of the people who are higher up there were just not willing to grapple with the idea that
they were basically working at the cigarette company, that they were designing products whose value, the commercial value, comes from being addictive in a way that is harmful. And if
you want to believe that you're saving the world, which is what a lot of the Silicon Valley internal
ideology says, it says we are out here elevating humanity to the next stage of human evolution,
you don't want to believe that at the end of the day,
you work for Marlboro. I want to get at what spending a lot of time on these platforms does
to all of us, rather than the pathological cases, except the pathological cases are
so extraordinary that let's detour through a couple of them on our way to what they do
to you and me.
What happened to Dirk Denkhaus?
Oh man, Dirk Denkhaus.
Dirk was a firefighter in a small town in Germany.
And when I went to this small town in Germany in 2018, the thing that everybody was eager to tell me about Dirk is that he was a nice enough guy.
He was never that political, not necessarily the sharpest. He was in his early-to-mid 20s at this point, and he
started spending just an enormous amount of time on Facebook and WhatsApp and we think also on
YouTube. And he did what we now colloquially refer to as fall down the rabbit hole, which is that he
started just following the content that the platform was serving up to him and that he found
most engaging and that the platform had learned to artificially amplify the reach of because it
tended to pull in people like Dirk and get them to spend a lot more time in the platform.
And this was content that initially started as what is colloquially called irony posting or
edgelording, which is basically just things that are a
little bit more extreme just to get a rise out of you, just to be offensive, just to
be kind of in your face.
And then as he spent more and more time with this, it went from ironic expressions of hate
or extremism or like a joking, you know, Hitler or a Nazi meme to more earnest and much more
sincere and much more legitimately hateful.
And after months and months on the platform, he, as has happened to a lot of people who fall down
these rabbit holes, started to lose a sense of differentiation between the jokes and the actual
hate, started to lose his grip on, frankly, the gap between reality as he was
experiencing it in a small town and what he was experiencing online, which was an overwhelming wave of hatred towards
refugees who were at that point coming into his small town in Germany in very large numbers.
This is happening to him around 2016, which was of course the big refugee
crisis in Europe, large numbers of Syrians and Afghans being resettled in Germany. And
one day, in what
appeared to outsiders to be completely out of nowhere, but which to people who knew Dirk was something
he had been driven towards over months on the platforms, he and a friend who had
gone through a similar journey climbed up on top of this refugee resettlement house, basically, in
his town, that had a bunch of refugee families in it. And thankfully he failed, but he tried his best to burn down the
house with all of these families within it. And this became this case that got a
lot of attention in this corner of Germany because a lot of this was
happening at this point. I talked to some police inspectors who were there,
members of the community who were there, and they said, you know, we started to notice around 2016, 2017 that all of a sudden this would happen over and
over that people who were basically fine, who were not political would start spending a lot of time
on Facebook, start spending a lot of time on YouTube, and they would get pulled into this
extremism and hate until they got to the point of acting out. And there was actually this really fascinating study that a lot of this reporting
led me to that tried to measure the impact of time on social media and the rate of vigilante
violence against refugees, and they had very complicated metrics for getting to this, because
how do you isolate the effect of Facebook? What they basically looked at is towns where overall Facebook usage but not internet
usage is higher, significantly higher than the average. They found that there
was something like 20 or 30 percent more attacks on refugees, and what this led
them to is this theory that when a community as a whole, not just an
individual, is spending more time on social media, that the conspiracies and hatred that it
pushes out in the aggregate, even if it's making everyone just 10% more hateful,
that you get some subset of people, the Dirk Denkhauses, people who are kind of on
the margins and on the fence, who are a little bit more susceptible, that they
end up lashing out. But at the same time, someone like Dirk Denkhaus is the tip of the iceberg for what they found to be much deeper
social change of hatred towards refugees, or towards any cultural outsiders, that was driven
by the platforms, because that is a kind of sentiment that we know from a lot of other
research is very effective at getting people to spend more time online.
So once again, in a context where there were
an awful lot of newcomers in towns in Germany
after the Syrian refugee crisis,
this guy who had no particular history of animus,
he wasn't a racist from way back,
he started to front online.
He started to kind of joke about this xenophobic sentiment,
and he started to buy his own BS essentially. That's the story that you're telling.
Yeah, because he, I mean, first he was just passively consuming it. There's quite a few ways
that platforms train sentiments into you. And one of them is that he found that if other people
who he knew online posted, or if he were to post, some version of xenophobic hate online, something that is,
you know, we hate refugees or the Syrians are destroying our culture or Islam is against Germany,
that that would win much more engagement. And it's easy to say, well,
what does that really mean? You just get some more likes in a post. But the thing to understand is
when you spend a lot of time on these platforms, as the median user does, when you see that other
people get what looks like social approval for expressing a certain kind of sentiment, or
especially if you start to get social approval for expressing that sentiment,
that your mind is so sensitive to that perception of, oh, this is what my community wants of me,
this is what my community considers to be right and wrong, this is what my community
especially fears or is rallying against, which is, you know, let's say the quote-unquote threat of
Syrian immigration to quote-unquote our culture,
that is something that starts to feel internally truer to you. It becomes an internal sentiment
that you yourself start to chase because it feels so real to you. And that is how you get this
training: the Dirk Denkhauses of the world, when they start to see that hatred basically goes viral and
gets a lot of attention, then they start to see that in their own interactions and then they start
to feel it themselves. But the reason that it is getting so much interaction is not because their
broader community is hateful. And this is something that was really fascinating: Dirk's community
is actually a very welcoming place for refugees. It just feels like it is something that wins a lot
of approval because the algorithms on these platforms
identify that sentiment as something that is
potentially engaging, and it juices up
artificially how much reach it gets and how much engagement it gets. So Dirk Denkhaus and all the people like him
were really under the impression that their community wanted them to burn down a house full of refugees. And it absolutely did not, but it was something
they were tricked into by these platforms.
You interview one young woman in Germany
who is involved in a lot of these online
conversations and you say, well, what happens?
Do you get in fights over this sort of stuff on
Facebook?
And she's kind of confused and she says,
everybody thinks this way.
Because in the chat groups that she's involved in,
that's how it looks: everyone thinks this way.
Right.
Yeah.
This was even a different town.
This is a different part of Germany, because it kind
of popped up all over the place.
Um, where to walk around the town itself, it's a
very welcoming place towards refugees.
They had a big refugee center.
They had, uh, all these community events to welcome folks from West Africa and from the Middle East and
Central Asia. But then the people who were spending a lot of time online were convinced
that their community hated refugees and in fact hated them to the point of wanting this vigilante community violence against them. And that is how effective
these platforms can be at manipulating your emotions and sentiment, which is explicitly
what they are designed to do. And that's always what they've been designed to do. This is something
that they used to not even hide in Silicon Valley. They would talk very openly about this idea. They
called it persuasion, which is a kind of Orwellian term for training users
to believe or think certain things
that will make them wanna come back to the platform.
It was only once we started to see the consequences of this,
which run up to and including genocide,
that they started to say,
oh no, actually we're just a neutral vessel.
You've mentioned genocide, which brings us to Myanmar,
where between 2012 and 2015, membership
in Facebook grew 80-fold.
And then what happened?
The Myanmar genocide was something that erupted for lots of reasons.
I don't want to pin it all on social media.
It's like any major political event, no matter how prominent the link to social media, it's not going to happen without lots
of things happening. There's pre-existing tension between the country's Buddhist majority and a
Muslim minority that lived in the Northwest of the country. And for various reasons, that had
been increasing somewhat in the years leading up to this
sudden, very deliberate explosion of social media use, which is something that
the Obama administration was really assiduous about, bringing in Silicon
Valley companies and saying, we want to put everyone on Facebook. At one point
the state newspaper said a person without a Facebook page is like a person
without an identity. There's this
real move to say, let's get everyone on social media because that will be liberating for
us as a society. But in fact, what happened, and I was in Myanmar both before and during
the genocide so I could kind of see this transformation, is that as people started to spend more time online, they were just saturated with a very, very extreme version of
hatred and tension that had already somewhat existed in society and conspiracy theories about
this Muslim minority that had already somewhat existed, but that were so ramped up and were
delivered so effectively, and that especially so often had these kinds of calls to action, like Dirk Denkhaus felt
he experienced in his time on Facebook, or that people in these villages in Mexico
felt that they experienced, that it played an enormous role, according even to the United
Nations, in driving this bottom-up, grassroots, organic explosion of violence that culminated in the genocide of an
enormous part of this population, who were run out of the country. It was also driven in
part by elements of the military; there were also parts of it that were top-down. But the role of
Facebook particularly, but also Twitter and also to a lesser extent YouTube, although YouTube did
not have the same penetration there, was so clear.
You know, I was there at the time and anybody you would talk to, they would bring the conversation back to Facebook.
It would just, it would be driving everything.
You would talk to extremists who are involved in it.
They would bring it back to Facebook.
You would talk to rights workers.
They would say Facebook is poisoning our society.
And you would talk to especially digital activists who would say, we thought this technology was going to free our society, entrenched democracy.
And instead it has turned us against one another and is literally helping to burn our country down.
Surely at some point through all of this, there were people trying to tug the sleeve
of executives at Facebook and say, you got trouble.
Oh yeah. You had these rights groups, one after another
Westerners who were working in the country, uh,
Myanmar nationals in the country who were trying to
tell Facebook you have a really serious problem.
And they were in fact demonstrating empirically, in
some research that they did, that really severe hate
speech, calls to genocidal violence, and really out-there conspiracy theories
and misinformation were going super viral on your platforms because your platforms are spreading
them viral. And the company, as best anyone could tell, did basically nothing with that information.
It just it did not seem to ever become a priority. Okay so let's look at some of the social psychology that tries to explain these
extraordinary phenomena. And you quote some researchers who say that bad things start to
happen when two familiar phenomena intersect. One is de-individuation and the other is status threat.
Can you explain those terms and talk about how they interact?
The colloquial term for de-individuation
is basically mob mentality.
Think of being in the stands at a sports match.
You start to sublimate your sense of your own
identity into that of the larger group.
You start to feel like you're part of this
collective whole.
And that could be a nice thing.
That can be a really powerful thing.
But what you also start to do, especially as a part of this
de-individuation, is you start to defer your sense of morality and moral action
and moral behavior to the group, such that if the group starts to feel a sense
of moving towards, let's say, collective violence, that is something that if you
would not normally go along with,
because you have sublimated yourself into the group, you become much, much likelier
to do that.
Think of any riot at a sports match.
What you're seeing is de-individuation.
And it's obvious to see how de-individuation can happen, and in fact is deliberately cultivated,
by social media platforms by activating a sense of identity that's something that does
really, really well on the platforms. The platforms have learned this, the automated systems have
learned this, that feeding you a sense that you're part of some group is something that
just really charges you up and really makes you want to spend a lot of time interacting
with that group. And that identity might be moms, it might be a partisan political identity, might be a racial or religious
identity, it might be local community, whatever; identity can be anything. But whatever it
is, the platforms have learned to feed that to you really aggressively, and to play that
up and to heighten that because it makes you spend more time online. This de-individuation
is dangerous when it combines with this other force you mentioned which is status threat. And
status threat is basically the thing driving the populist and especially the
white populist backlash and the far-right backlash worldwide, which
is a belief, regardless of how true or not true it is, that your in-group,
your social in-group, racial, religious, partisan,
whatever, is under some sort of threat from a scary out-group. And maybe that threat is
you believe that you're going to be outnumbered. Maybe you believe that you're going to be
dominated by that group. Maybe you believe that that group has some sort of opposing
values or opposing beliefs that they're against you. And this is something that is a very deep-seated human instinct, right?
Facebook did not invent the instinct of status threat.
But what it did learn is that because it's a very, very powerful social instinct, that
when you feel that sense of status threat, what you really want to do is you want to
cling onto and hold onto your sense of group, and you want to rally that group to some sort
of collective action in self-defense and collective attack against whatever you perceive to be the dangerous out-group.
That is something that is really, really good, if it's fed to you on social media,
at getting you to spend more time on the platform. So that is something that the platforms have learned
very, very effectively how to deliver to you. And that is something that you have
almost certainly experienced, even if you think of yourself as just a regular user who's, you know,
not a QAnon believer, not an anti-vaxxer, not an extremist. If you have been on the platforms,
and you've experienced really angry partisan content on there, something that says that the
other party, you know, labor, whatever,
that they're not just enacting bad policies, but as a group, they are coming to get you,
they're coming for our culture, they're doing something that's really dangerous,
and we have to sound the alarm. That is status threat, and that is de-individuation. And maybe
the effect is not as extreme as something like the collective violence in Myanmar, and in the vast majority of cases, it's not.
But what that effect is going to be is instead,
it's heightened social polarization of any form.
Even if it's only an effect of degrees on most users,
when you are multiplying that out by a scale of billions
of people, an effect of degrees
can have a really significant effect and consequences
for a society.
Let me set up a straw man for you to emphasize a point that I think is worth making.
Surely Max Fisher, you're describing to me
things that happen to right wingers.
You're not talking about stuff
that happens to everyone else.
So I see what you're getting at. Your point, which is a good one,
is that this does also happen on the left,
even though we
don't think of it, because right-wing sentiment or right-wing politics tend to be more concerned
with racial and demographic lines and tend to be more activated against demographic change. So those
become much more evident, but it does also happen on the left wing. And maybe it
just plays out in terms of say, partisan animus. And you see a lot of the same inclinations towards
conspiracy theories, towards a sense of social division, but it's on somewhat less obvious
grounds when it has a left-wing valence that might be associated with, say, class or partisanship.
But it is also a real question. And it's a tricky question, and it's a fuzzy one
about does this have more of a radicalizing effect on the right than it does on the left?
And there's not a clean answer to that. In Western countries, it seems to be having more of an effect
on the political right. It's possible that that is just coincidental and not because there's
something inherent with the right or with right-wing ideas and it's just coincidental
because in Western countries, political right is more focused on, like I said, immigration and
demographic change right now. There's also a theory and there's some data behind it,
but it's probably too early to say whether it's conclusive or not, that these systems do have an inherent pull
towards the political right because right-wing thought and right-wing politics in whatever
country, in Western countries and Myanmar and India does tend to be much more concerned with
preserving the status of the majority demographic group, which is why you see Facebook, Twitter, and YouTube,
for example, playing an enormous role in activating the political right in India,
which is much more Hindu nationalist. Obviously, that has nothing to do with Western politics.
It has everything to do with the Hindu majority feeling, not correctly, but feeling threatened
by the presence of a Muslim minority and you see a
real amplification of that sentiment and of anti-Muslim conspiracy theories on Indian social media.
Now, does that mean that the platforms, by their nature, by their focus on identity threat,
out-grouping, have an inherent tilt towards the right? It's possible. It's definitely possible.
It's a compelling theory, but I would
say that I think we're at a point where the research into that is ongoing and I think we'll
get a better sense of it in the near future probably. What made me ask my question is you
discussed two American social psychologists, sometime collaborators named Billy Brady and
Molly Crockett and each of them starts getting into
this because they discover that they like getting angry on social media. Molly Crockett is enjoying
being enraged at a story about mistreatment of border crossers along the US southern border.
And then she discovers it's a story from when Obama was president and she has to sort of check
her assumptions. Right, right. Yeah, no, it's a good point.
It is important to remember that these effects,
even if we might say in the aggregate,
they have more of an impact
in one political group or another, they affect everybody.
If you were on the platform, it is having this pull on you.
I found in reporting the book
that I noticed it having that effect on me. And I'm someone who
thinks of myself as a very careful social media user, very aware, very sensitive to the effects
of it. But I was so horrified by a lot of what I found in the research, of the universal effects
of it, that I stopped using it. And I noticed an immediate change. This is not just me,
this is something that research has found
shows up over and over again: people who are somewhat regular users of the platforms, as I was,
quit the platforms, and their affect really changes. They become less prone to outrage and moral
outrage not just when they're online but generally in their life. They become less prone to
polarization and to dividing the world between
in-groups and out-groups. And I guarantee if you took a break for a few weeks, you would start to
notice it too, regardless of whether you're on the right or the left. Well, you're preaching to the
choir here because I, after a false start in 2016, I quit Twitter in 2018 and it was really hard. If
you work in communications, of course you persuade yourself that you need to use that to
get your message out and to engage with people.
And then you actually describe one study where
people charged a fairly high sum of cash to get off
of social media.
And then they reported that they felt better, even
though they had been that reluctant to disengage in the
first place.
Yeah, that study was so fascinating. Like you said, these researchers went to a bunch of people and said, how much would we have to pay you to take,
I think it was like six weeks off or four weeks off social media. And this is a kind of study they
use to basically determine people's threshold for how badly they like something or
how much they like something, how much they are willing to sacrifice to hold on to it.
And it was, relative to what these studies usually find, a relatively high sum,
like 150 bucks, something like that, to get people to turn it off. What was wild about it,
as you say, is that as soon as people did that, especially the people who were more resistant
to turning off social media, those are the people who reported the greatest increase in happiness and life satisfaction
and who were the likeliest to say that they were not going to return to social media, which is
one of many very strong indicators we have that people don't use it because it makes them happy
or because it enriches their lives, even though sometimes it can do both of those things, that they use social media because it is addictive
and it's designed to be addictive. And we know now that it is chemically addictive,
it's physically addictive, it creates a reaction in your brain that is along the same pathways
as substance addiction. And I found the same thing. It's hard to turn it off. And when you find yourself reaching
for your phone, over and over again, that's not just boredom.
And it's not just a short attention span, it doesn't feel
like the same urge, maybe as reaching for a cigarette, because
we don't think of this as something you were addicted to.
But that is absolutely, on a chemical level, exactly what it
is: the pack of cigarettes in your pocket that
you're reaching for every time you go to pull up
one of those apps.
So what do we do about this?
Um, I mean, in the end of the book, you, you say
this might not be the sort of thing where the
answer is, boy, we got to be careful or somebody
should really tweak those algorithms.
Right.
You go a little further.
Yeah. So I talked to a lot of people who study this from outside the industry, people who are in the industry, whistleblowers. I'm still a reporter at the New York Times,
so I don't make policy recommendations myself. That wouldn't be an appropriate thing for me
to do. But the recommendation I did hear from a lot of people that I spoke to was that the
engagement-maximizing features on the platforms, and
we predominantly associate that with the algorithm, but it also means
likes, it also means the share button, the retweet button,
it means the up-next feature on YouTube, that these are things that can only be
harmful in the aggregate, and that they
have such a profound and often profoundly negative effect that
the only real way, some people argue, to get back to a
social media that brings us the good but is healthier, without a
lot of the bad, is to turn off those engagement-maximizing
features, and to bring us back to a version of social media
that we did actually used to once have, pre, like, 2008, 2009. The platforms
were much more neutral, did not have a lot of these features, and they were much less
lucrative, but they also, they weren't driving genocides. They weren't driving national- or
global-scale polarization or out-grouping or extremism. Now, how do you actually get the companies to do that?
I mean, I live in Washington, DC, and that's a very big conversation there. And the American
government is, for better or worse, probably the only body with any power to actually force
the companies to change. It is certainly trying to use a lot of sticks and carrots,
and especially carrots these days, to try to get change from the companies, and is finding that there's basically no stick big enough. But
I know because I've heard from a lot of people in Congress, especially after the book came out,
that there are some pretty active efforts underway to try to figure out what are the kind of
legislative answers to dealing with technology
that is increasingly understood in Washington
as inherently harmful to politics and society.
Well, that is a debate
that we're gonna have to follow closely.
And it's one that we'll understand better
thanks to your book.
The book is called The Chaos Machine,
the Inside Story of How Social Media Rewired Our Minds
and Our World.
The author is
Max Fisher. Max, thanks so much for taking the time to talk to me. Paul, thank you. It was great.
Thanks for listening to The Paul Wells Show. The Paul Wells Show is produced by Antica.
Our senior producer is Kevin Sexton. Our associate producer is Haley Choi.
Our executive producer is Lisa Gabriel. Stuart Cox is the president of Antica.
If you're looking for me on Twitter, don't bother.
I got out of there before it was trendy, but I do have a subscription newsletter where
you can find all my political writing at paulwells.substack.com.
If you're enjoying this show, please tell a friend.
We'll be back next Wednesday.