The Paul Wells Show - Author Max Fisher on the social media chaos machine
Episode Date: November 2, 2022. New York Times writer Max Fisher talks about his new book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Relying on international reporting, leaked corporate documents and social science, he lays out the case that the problem with social media is not just about amplifying the wrong messages; it's that social networks are designed to bring out the worst in everyone.
Transcript
What if Elon Musk isn't the problem?
What if there's something deep down in each of us that just wants to watch the world burn?
Maybe social networks are wired to bring that side of us out.
That would be scary, wouldn't it?
I certainly didn't think of social media as something that could have this effect to drive
real world violence.
And this was one of the first demonstrations, to me at least, that it could, that it was
doing it at scale, and that it was doing it with this machine-like consistency.
I'm Paul Wells.
Welcome to The Paul Wells Show.
This week, how social networks put society in danger.
What you're about to hear may come as a relief, a discussion about social media that doesn't mention Elon Musk. This is easy to explain. I interviewed this week's guest,
author and New York Times columnist Max Fisher,
in the Before Times last week,
before Elon Musk closed his deal to buy Twitter,
and before just about everybody on Twitter
who doesn't agree with Elon Musk started to freak out.
It seems so long ago.
Those were innocent times, weren't they?
But I'm glad this conversation isn't about Musk,
because Max Fisher's book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World, isn't about Musk either. Fisher's claim isn't that social media networks might fall into
the wrong hands, and it's not only that networks might amplify the wrong voices. Fisher's argument
is broader and bolder, that even when they're used
as designed, social media networks can bring out the worst in each of us. I wasn't paying too much
attention when I started reading this book. Seriously, at this stage of the game, who needs
a book to tell us social networks have a dark side? But I found the way he builds his argument
tremendously compelling. He's built a career at the Atlantic Monthly, the Washington Post,
Vox, and now at the Times, explaining big global phenomena through the lens of social science.
So he brings psychology to bear, as well as on-the-ground reporting in places like Germany
and Myanmar. What he found is worth your attention. Internal memos at Facebook from
analysts who say the platform exploits the human brain's attraction to divisiveness. The way likes and shares give your brain little chemical jolts
that turn your phone into the world's most ubiquitous slot machine. The strange theories
that pop up around the world about mysterious outsiders who want to do horrible things to
local children. Theories that will be familiar in North America as the basis of QAnon. And the case of Dirk Denkhaus,
a soft-spoken German firefighter who started out making jokes online about anti-immigrant
sentiment, and then started to believe it, and ended up trying to start a fire in a residence
for refugees. You'll notice that I spend a lot of my time here just teeing Max up so he can tell
his own story in his own words. I don't spend a lot of time
cross-examining him. It's a complex argument in a lot of ways, and I wanted to make sure he tells
it clearly. But I think you'll also find he builds a compelling and disturbing case.
After the break, I'll bring you my conversation with Max Fisher. Thanks for taking the time to join me and talk about this book.
Thank you so much for having me.
I have to say that this book was a bit of a hard sell for me,
because when someone says social media is bad for you,
it's almost at this point like saying the Pacific Ocean is moist.
It feels like something that I should probably already know. And I wonder whether part of the
challenge in writing it was the challenge of getting people to take the premise as news and
as something they had to learn about. It's a great point. And it's actually the thing that
initially kept me from writing about social
media at all. It's not usually my beat. I'm not a technology reporter. I'm an international
correspondent for the Times. So I cover wars, social change, political change. But I came around
to thinking around 2017, 2018, a time by which there was a kind of ingrained conventional wisdom that social media is not great, that it
makes us more polarized, that it makes us more extreme, but also how impactful can it really be,
is when I, traveling around the world, would start to see one instance after another of social media
having a much deeper effect than I thought possible at that point. And it's seeming to change
not just political views, but the way that people behaved and thought
on this massive scale. And I thought, this is something that actually seems pretty important,
and I should probably try to investigate and understand. And that became this years-long
project to try to measure something that had not been, I don't think, measured up to that point,
which is, what is the actual aggregate effect of social media platforms on users, on our minds, and on the world.
And I came away from that thinking, this is really a lot bigger than I thought it was,
and that its effects are much more substantial.
And also thinking that we now finally can empirically measure and can empirically, scientifically
show what those effects are and how it works.
And it felt very urgent and important to me to get that message out to people, even understanding that at this point, we kind of
think that we know the story. I thought I knew the story. And I'm telling you that it's a lot
bigger than I certainly thought it was. You're telling the story from a kind of a
fantastic platform, which is that your gig at the New York Times is you're The Interpreter. You are
someone who sets big stories into context.
Before we get into the argument of the book, how do you interpret that role and how do you see that
obligation? And I have to assume that to some extent you had to invent that beat.
The idea behind the beat and the column, which I've been doing for about six, seven years now,
is that the kind of big, thorny, analytical questions that we are confronted with when we read the news generally, and especially international news, questions like,
why do wars happen? Why is the far right on the rise? Why is democracy declining? Why is North
Korea the way that it is? Why is Russia the way that it is? That these are questions that can be answered and explored in a more fulsome, nuanced,
and often empirical and scientific way than was the case 10 or 20 years ago. There's been huge
advances in social science and political science, and I've been really lucky to have the
platform to use those as tools to try to understand and explore and demystify some of these really big
questions for people, which, as you say, is not something that initially brought me to
social media, but that started to feel like a good application of it.
In the book, you say that one of the points when
you decided you're going to spend a lot of time looking at the effect of social media
was when you were talking to a woman in Mexico, a researcher in Mexico named Hema Santamaria,
who was investigating a bunch of strange occurrences. Can you kind of tell me the
story of those occurrences and the lessons that she drew from that?
I started hearing in 2016, 2017, 2018 from people
like Hema Santamaria based all around the world, these eerily similar stories of a particular
rumor or a couple of rumors that would spread like wildfire on social media and end up provoking
or being linked to these very similar incidents of violence.
The incidents in Mexico that Hema, this Mexican researcher, had noticed popping up in a few
different totally disconnected places, it was always the almost identical rumor, which was
some version of there's some sort of outsiders, ethnic minorities, just people from outside the community,
who are coming through our town to, and this is going to sound crazy, to kidnap children and to
harvest their organs and harvest their blood. And it sounds nuts on first blush. And in fact,
of course it is. But what was happening, she found, and then we found working with another
reporter tracking these incidents around a few different countries, was that what would happen is that someone would post some version of this rumor, usually on Facebook, sometimes on WhatsApp or on Twitter.
And there's billions of posts every day on these platforms.
The systems that govern what you see on them can tell any story they want by picking out certain kinds of content.
But these systems, these algorithms would identify these rumors as something that was particularly engaging to other people.
And that's what the systems are designed to do. They're designed to present you with whatever kinds of content, whatever series, whatever sort of presentation and order that will get you to spend more time in the platform and get you engaged in the platform yourselves.
They discovered that taking these posts that would come from just some random small account somewhere in, for example, a rural Mexican town that would have a version of this claim, and pushing that post in front of lots of users, and pushing versions of it over and over again, such that if
you are on the platform, you see this over and over again, it starts to feel like something that is this siren ringing out from your community as a whole,
even though maybe only one or two people believe it. That is a particular kind of conspiracy that
for whatever reason, just hooked into people's brains, they would engage with it, spend a lot
of time with it, the algorithms would learn to push it to more and more people. And that over days
and weeks and months of the system cultivating over and over
and over this ludicrous claim, and bringing people into it and encouraging people to interact with it,
people would eventually come to think that the rest of their community believed it. And that
would make them want to believe it because they didn't see this as the choice of the Facebook algorithm; they would think, oh, my God, my community is so upset about this, there must be something to it. And then one day, out of nowhere, there would be, one of the examples in
Mexico, like just a father and son coming through town to pick up some supplies to build a fence,
I think was one of them. And this community that had been whipped up into this paranoia and fear
and hysteria by what they were seeing on social media, would go out and grab these people and kill them,
often quite brutally. And something that I was noticing around this time talking to people like
Hema in Mexico and lots of different countries is that this same phenomenon and often this same
rumor was popping up over and over again in one place after another. The system would identify
these small accounts that had just, for whatever reason, landed on some version of this and would push it out,
and it would have this incredible effect on people. And that, to me, was this very powerful
demonstration that, first of all, what these platforms are pushing out to us is dangerous.
And it's not just dangerous in the sense of like, oh, it's like reading a newspaper article
that is particularly salacious, particularly divisive. There's something about the engagement
based format of this, the fact that it pulls you in, the fact that it presents information to you
as coming from your community, even though it is in fact coming from the social media platform.
There's something about this that has an especially powerful pull on people and effect on them. And the systems have learned something really deep and dark about what is in human nature and how to cultivate it in this really consequential way.
I certainly didn't think of social media as something that could have this effect to drive
real world violence.
And this was one of the first demonstrations, to me at least, that it could,
that it was doing it at scale, and that it was doing it with this machine-like consistency. And what was especially striking about this one particular rumor, and it's possible that you're
already seeing where I'm going with this because maybe it sounds familiar to you,
is that this same rumor is the same thing that would eventually form the core of the QAnon
conspiracy theory.
So you say that what you heard in Mexico or what you heard from this researcher in Mexico
sounds crazy, except it sounds mutatis mutandis, exactly like QAnon.
People on the other side are coming to hurt our children.
Exactly.
They're coming to hurt our children, and they are coming to hurt them in this very specific
way where they're going to harvest their organs or harvest their blood. To be clear, this is just the start of what became a very long investigative
journey to understanding how the platforms had cultivated QAnon. This was not the one piece of
evidence for it, but it was a really strong suggestion that this wasn't something that just
emerged organically from, you know, wackos who are already on the internet, that there was something
specific about this that the platforms pulled out and had learned to cultivate in large numbers of
people, up to the point of radicalizing them into real world violence. That was the start of this
years-long investigation that I went on, not just to track this one particular rumor, but all of the ways that social media is cultivating things. And not just the QAnon, not just the anti-vaxxers, not just the
extremists, but in all users who are on the platforms, the ways that it has learned to pull
out and cultivate certain emotions in us, and especially certain behaviors, because it's really
what the platforms want to do. They want to train behaviors into you that will be useful for these
companies because they lead you to spend more time in the platform so they can sell
more ads against your time there and make their tens and hundreds of billions of dollars in
advertising revenue. So there's a lot there to unpack. Let's start with that. I mean,
these companies are not saying, let's make people believe that there are baby murderers coming for
their children. The primal impulse of these companies is the primal impulse of just about any company,
which is growth. And it's the idea that what drives growth online is engagement,
the amount of time that people spend on a given platform. And then you get into what drives
engagement. That's exactly right. And it's like you say, and it is important to say this,
that there is no one sitting in Silicon Valley saying, you know, what would be a great idea
is to cultivate this baby murdering conspiracy that will lead to violence in Mexico and Indonesia
and Guatemala and Malaysia and cultivate QAnon in the United States. What they're saying is let's design
automated systems that use this technology called machine learning. And what machine learning
basically is, is a type of program that is constantly fine tuning itself. And the way the
algorithms work on Facebook and YouTube and Twitter is that they essentially use every post that
appears on the platform, which is billions of posts, if not per day, then per week,
testing what wins engagement and what kind of engagement it wins and how it does it and what
sequence of posts win engagement and what doesn't. And it's running these tests constantly to learn
itself how to promote content that will keep you engaged. Now, if you're
in Silicon Valley, an engineer, working on these platforms, you believe that you are, you know,
helping to serve up content that people like to click on. So therefore, it must be good for them,
or at least it's value neutral, and it will make you some money. What they did not realize,
when they started to develop this technology is that it would sync up with and very quickly learn to identify and exploit
some of the gravest weaknesses and cognitive blind spots and deepest instincts in human nature
that we have learned through thousands of years of socialization, of social norms and social institutions to manage and control and at times suppress. And these systems have become incredibly adept at pulling these out of us at this industrial scale to keep us engaged.
Now, even if the people at the platforms didn't want to cultivate that, they now have all the
evidence to know that that's what their systems are doing.
And in fact, internal research at some of these companies has even identified this, and internal research presented to the executives has said, in these blaring five-siren headlines,
these alarm bells saying, this is what our systems are doing.
It's cultivating hatred, division, us versus them, tribalism, conspiracy theories, basically all the worst
stuff in human nature. And at every point, they were largely overruled by the executives whose overriding impulse, as you say, was to make money, and they continue to be very effective at
doing that. So you quote some researchers at Facebook who in an internal 2018 presentation said, quote, our algorithms exploit the human brain's attraction to divisiveness.
And you say you went and visited some folks, like some of the senior folks at Facebook, and they were really interested in talking about malign exploitation of the platform or bugs in the platform. But when you put it to
them that the platform itself is the problem, they seemed not to have heard you. It's a wild
experience. A lot of the people who work at these companies, especially people who are higher up,
a lot of them are really smart. And I know that's not going to sound surprising. They work at a
giant company. A lot of them are engineers. Of course, they're smart. But a lot of them are very civic-minded.
I was talking to a lot of people who work on, as you say, things like exploitation by
hostile governments or extremists on the platform.
A lot of people who came from the human rights world, who came from working in things like
the State Department or the Pentagon, have real backgrounds on this, are really passionate about it, and would be really smart and thoughtful
until we got to anything related to this premise
that the platforms themselves are not just a passive conduit for bad behavior or for harm,
but are cultivating it, encouraging it, inculcating it into users very assiduously and very effectively,
which is at the time I was first having these conversations with them in 2018.
This is something that was being just first established by outside researchers.
It wasn't until a couple of years later that we knew, thanks to Frances Haugen,
a Facebook employee who leaked a lot of these documents, that Facebook's own researchers were saying this.
But you would say this to them, and it would be like they didn't understand or they thought that's
the most ridiculous thing I've ever heard. How could it possibly be doing that? A lot of the
people who are higher up there were just not willing to grapple with the idea that they were
basically working at the cigarette company, that they were designing products that are,
their value, their commercial value comes from being addictive in a way that is harmful. And if you want to believe that you're saving the world, which is what a lot of the Silicon Valley
internal ideology says, it says we are out here elevating humanity to the next stage of human
evolution. You don't want to believe that at the end of the day,
you work for Marlboro.
We'll come back to my conversation with Max Fisher in a minute.
I want to take a moment to thank all of our partners, the University of Toronto's Munk School of Global Affairs and Public Policy,
the National Arts Centre,
our founding sponsor, TELUS,
our title sponsor, Compass Rose,
and our publishing partners, the Toronto Star and iPolitics. I want to get at what spending a lot of time on these platforms does to all of us,
rather than the pathological cases.
Except the pathological cases are so extraordinary
that let's detour through a couple of them on our way to what they do to you and me.
What happened to Dirk Denkhaus?
Oh, man. Dirk Denkhaus. Dirk was a firefighter in a small town in Germany. And when I went to
this small town in Germany in 2018, the thing that everybody was eager to tell me about Dirk is that he was a nice enough
guy. He was never that political, not necessarily the sharpest tool in the shed, but someone who
had seemed basically harmless. At one point, Dirk, like a lot of people his age, I think he was in
his early mid-20s at this point, started spending just an enormous amount of time
on Facebook and WhatsApp, and we think also on YouTube. And he did what we now colloquially
refer to as fall down the rabbit hole, which is that he started just following the content that
the platform was serving up to him and that he found most engaging and that the platform had
learned to
artificially amplify the reach of because it tended to pull in people like Dirk and get them
to spend a lot more time in the platform. And this was content that initially started as
the colloquial term is irony posting or edgelording, which is basically just things
that are a little bit more extreme just to get a rise out of you, just to be offensive, just to be kind of in your face.
And then as he spent more and more time with this, it went from ironic expressions of hate or
extremism or like a joking, you know, Hitler or Nazi meme to more earnest and much more sincere
and much more legitimately hateful. And after months and months on the platform, he, as has happened to a lot of people
who fall down these rabbit holes,
started to lose his sense of differentiation
between the jokes and the actual hate,
started to lose his sense of grip with,
frankly, the gap between reality
as he was experiencing it in a small town
and what he was experiencing online,
which is an overwhelming
wave of hatred towards refugees who were at that point coming into his small town in Germany in
very large numbers. This was happening to him around 2016, which was, of course, the big refugee
crisis in Europe, large numbers of Syrians and Afghans being resettled in Germany. And one day,
in what appeared to outsiders to be completely out of nowhere but to people who knew
Dirk was something that he had been driven towards over months on the platforms
he and a friend who had gone through a similar journey climbed up on top of this refugee
resettlement house basically in his town that had a bunch of refugee families in it. And thankfully, he failed, but he tried his best to burn down the house with all of these families within it. And this became this case that got a lot of attention
in this corner of Germany, because a lot of this was happening at this point. I talked to some
police inspectors who were there, members of the community who were there. And they said,
we started to notice around 2016, 2017, that all of a sudden this
would happen over and over.
The people who were basically fine, who were not political, would start spending a lot
of time on Facebook, start spending a lot of time on YouTube, and they would get pulled
into this extremism and hate until they got to the point of acting out.
And there was actually this really fascinating study that a lot of this reporting led me to that tried to measure the impact of time
on social media and the rate of vigilante violence against refugees. And they had very complicated metrics for getting to this, because how do you isolate the effect of Facebook?
What they basically looked at is towns where overall Facebook usage, but not internet usage, is higher, significantly
higher than the average. They found that there was something like 20 or 30 percent more attacks on
refugees. And what this led them to is this theory that when a community as a whole, not just an individual, is spending more time on social media, that the conspiracies and hatred that it
pushes out in the aggregate, even if it's making everyone just 10% more hateful, that you get
some subset of people, the Dirk Denkhauses, people who are kind of on the margins and on the fence who
are a little bit more susceptible, that they end up lashing out. But at the same time, someone like
Dirk Denkhaus is the tip of the iceberg for what they found to be much deeper social change of hatred towards refugees or towards any cultural outsiders that was driven by the platforms, because that is a kind of sentiment that we know from a lot of other research is very effective at getting people to spend more time online.
So once again, in a context where there were an awful lot of newcomers in towns in Germany after the Syrian refugee crisis, this guy who had no particular history of animus, he
wasn't a racist from way back.
He started to front online.
He started to kind of joke about this xenophobic sentiment, and he started to buy his own BS, essentially.
That's the story that you're telling.
Yeah, because he, I mean, first he was just passively consuming it.
There's quite a few ways that platforms train sentiments into you. And one of them is that he found that if other people he knew online, or if he were to post
online some version of xenophobic hate, something that is, you know, we hate refugees, or the
Syrians are destroying our culture, or Islam is against Germany, that that would win much
more engagement.
And it's easy to say, well, what does that really mean?
You just get some more
likes in a post. But the thing to understand is when you spend a lot of time on these platforms,
as the median user does, when you see that other people get what looks like social approval for
expressing a certain kind of sentiment, or especially if you start to get social approval
for expressing that sentiment, that your mind is so sensitive to that perception of, oh, this is
what my community wants of me. This is what my community considers to be right and wrong. This
is what my community especially fears or is rallying against, which is, you know, let's say the quote-unquote threat of Syrian immigration to, quote, our culture, that is something that starts to feel internally
truer to you. It becomes an internal sentiment that you yourself start to chase because it feels
so real to you. And that is how you get this training, that the Dirk Denkhauses of the world,
when they start to see that hatred basically goes viral and gets a lot of attention, then they start to see that in
their own interactions, and then they start to feel it themselves. But the reason that it is
getting so much interaction is not because their broader community is hateful. And this is something
that was really fascinating with Dirk's community. It's actually a very welcoming place to refugees.
It just feels like it is something that wins a lot of approval because the algorithms on these
platforms identify that sentiment as something that is
potentially engaging, and it juices up artificially how much reach it gets and how
much engagement it gets. So Dirk Denkhaus and all the people like him were really under the
impression that their community wanted them to burn down a house full of refugees. And it
absolutely did not, but it was something they were tricked into by these platforms. You interview one young woman in Germany who is involved in a lot of these
online conversations and you say, well, what happens? Do you get in fights over this sort
of stuff on Facebook? And she's kind of confused and she says, everybody thinks this way. Because in the chat groups that she's involved in, that's how it looks, is that everyone thinks this way. Right. Yeah, it was really, this was even a different town. This is a different
part of Germany because it kind of popped all over the place. Where to walk around the town itself,
it's a very welcoming place towards refugees. They had a big refugee center. They had all these
community events to welcome folks from West Africa and
from the Middle East and Central Asia. But then the people who were spending a lot of time online
were convinced that their community hated refugees, and in fact, hated them to the point
of wanting this vigilante community violence against them. And that is how effective
these platforms can be at
manipulating your emotions and sentiment, which is explicitly what they're designed to do. And
that's always what they've been designed to do. This is something that they used to not even hide
in Silicon Valley. They would talk very openly about this idea. They called it persuasion,
which is a kind of Orwellian term for training users to believe or think certain things that
will make them want to come back to the platform. It was only once we started to see the consequences of
this, which run up to and including genocide, that they started to say, oh, no, actually,
we're just a neutral vessel. You've mentioned genocide, which brings us to Myanmar,
where between 2012 and 2015, membership in Facebook grew 80-fold. And then what happened?
The Myanmar genocide was something that erupted for lots of reasons. I don't want to
pin it all on social media. It's like any major political event, no matter how prominent the link
to social media, it's not going to happen without lots of things happening. There's pre-existing tension between the country's
Buddhist majority and a Muslim minority that lived in the northwest of the country. And for
various reasons, that had been increasing somewhat in the years leading up to this sudden,
very deliberate explosion of social media use,
which is something that the Obama administration was really assiduous about bringing in Silicon
Valley companies and saying, we want to put everyone on Facebook. At one point, the state
newspaper said a person without a Facebook page is like a person without an identity. There's this
real move to say, let's get everyone on social media because that will be liberating for us as a society. But in fact, what happened,
and I was in Myanmar both before and during the genocide, so I could kind of see this
transformation, is that as people started to spend more time online, they were just saturated
with a very, very extreme version of hatred and tension that had already somewhat
existed in society and conspiracy theories about this Muslim minority that had already somewhat
existed, but that were so ramped up and were delivered so effectively, and that especially
so often had these kinds of calls to action, like Dirk Denkhaus felt that he experienced in his time on Facebook,
or that people in these villages in Mexico felt that they experienced, that it played an enormous
role, according even to the United Nations, in driving this bottom-up grassroots organic
explosion of violence that culminated in the genocide of an enormous part of this
population of the country that had run out of the country. And it was also driven in part by
elements of the military. There were also parts of it that were top-down. But the role of Facebook,
particularly, but also Twitter, and also to a lesser extent, YouTube, although YouTube did
not have the same penetration there, was so clear. I was there at the time, and anybody you would talk to, they would bring the
conversation back to Facebook. It would be driving everything. You would talk to extremists who are
involved in it, they would bring it back to Facebook. You would talk to rights workers,
they would say Facebook is poisoning our society. And you would talk to especially digital activists
who would say, we thought this technology was going to free our society, entrench democracy,
and instead it has turned us against one another and is literally helping to burn our country down.
Surely at some point through all of this, there were people trying to
tug the sleeve of executives at Facebook and say, you got trouble?
Oh, yeah. You had these rights groups, one after another, Westerners who were working in the country, Myanmar nationals in the country, who were trying to tell Facebook, you have a really serious problem. And were, in fact, demonstrating empirically, in some research that they did, that really severe hate speech, calls to genocidal violence, and really out-there conspiracy theories and misinformation
is going super viral on your platforms because your platforms are spreading it viral.
And the company, as best anyone could tell, did basically nothing with that information.
It just, it did not seem to ever become a priority.
Okay, so let's look at some of the social psychology that tries to explain these
extraordinary phenomena. And you quote some researchers who say that bad things start to happen when two familiar
phenomena intersect.
One is deindividuation and the other is status threat.
Can you explain those terms and talk about how they interact?
The colloquial term for deindividuation is basically mob mentality. Think of being in the
stands at a sports match. You start to sublimate your sense of your own identity into that of the
larger group. You start to feel like you're part of this collective whole. And that can be a nice
thing. That can be a really powerful thing. But what you also start to do, especially as a part of this deindividuation, is you start to defer your sense of morality and moral action and moral behavior to the group, such that if the group starts to feel a sense of moving towards, let's say, collective violence, even if that is something you would not normally go along with, because you have sublimated yourself into the group, you become much, much likelier to do that. Think of any riot at a sports match. What you're seeing is deindividuation.
And it's obvious to see how de-individuation can happen and in fact is deliberately cultivated
by social media platforms, by activating a sense of identity. That's something that does really,
really well on the platforms. The platforms have learned this. The automated systems have learned this,
that feeding you a sense of you're part of some group
is something that just really charges you up
and really makes you want to spend a lot of time
interacting with that group.
And that identity might be moms.
It might be a partisan political identity.
It might be a racial or a religious identity. It might be a racial or a religious identity.
It might be local, community, whatever identity.
It can be anything.
But whatever it is, the platforms have learned to feed that to you really aggressively and
to play that up and to heighten that because it makes you spend more time online.
This de-individuation is dangerous when it combines with this other force you mentioned,
which is status threat.
And status threat is
basically the thing driving the populist and especially the white populist backlash and the
far right backlash worldwide, which is, it is a belief, regardless of how true it is or not true
it is, that your in group, your social in group, racial, religious, partisan, whatever, is under some sort of threat from a scary outgroup.
And maybe that threat is you believe that you're going to be outnumbered.
Maybe you believe that you're going to be dominated by that group.
Maybe you believe that that group has some sort of opposing values or opposing beliefs and that they're against you.
And this is something that is a very deep-seated human instinct.
Facebook did not invent the instinct of status threat.
But what it did learn is that because it's a very, very powerful social instinct, that
when you feel that sense of status threat, what you really want to do is you want to
cling onto and hold on to your sense of group.
And you want to rally that group to some sort of collective action in self-defense and collective
attack against whatever you perceive to be the
dangerous outgroup. That is something that is really, really good, if it's fed to you on social media, at getting you to spend more time on the platform. So that is something that the platforms learned very, very effectively how to deliver to you. And that is something that you have almost certainly experienced, even if you think of yourself as just a regular user who's, you know, you're not a QAnon,
not an anti-vaxxer, you're not an extremist. If you have been on the platforms and you've
experienced really angry partisan content on there, something that says that the other party,
you know, labor, whatever, that they're not just enacting bad policies,
but as a group, they are coming to get you, they're coming for, you know, our culture,
they're doing something that's really dangerous. And, you know, we have to sound the alarm,
that is status threat. And that is de-individuation. And maybe the effect is not as extreme as something
in, you know, the collective violence in Myanmar, and in the vast majority of cases, it's not.
But what that effect is going to be instead is heightened social polarization of any form. Even if it's only an effect of degrees
on most users, when you are multiplying that out by a scale of billions of people,
an effect by degrees can have a really significant effect and consequences for a society.
Let me set up a straw man for you to emphasize a point that I think is
worth making. Surely, Max Fisher, you're describing to me things that happen to right-wingers.
You're not talking about stuff that happens to everyone else.
So I see what you mean. Your point, which is a good one, is that this does also happen on the
left, even though we don't think of it because right-wing sentiment or right-wing politics tend to be more concerned with racial and demographic lines and
tend to be more activated against demographic change. So those become much more evident,
but it does also happen on the left wing. And maybe it just plays out in terms of, say,
partisan animus.
And you see a lot of the same inclinations towards conspiracy theories, towards a sense
of social division, but it's on somewhat less obvious grounds when it has a left wing
valence that might be associated with, say, class or partisanship.
But it is also a real question.
And it's a tricky question.
And it's a fuzzy one about, does this have more of a radicalizing effect on the right than it does on the left? And there's not a clean answer
to that. In Western countries, it seems to be having more of an effect on the political right.
It's possible that that is just coincidental and not because there's something inherent
with the right or with right-wing ideas, and it's just coincidental because in Western countries, the political right is more focused on,
like I said, immigration and demographic change right now. There's also a theory,
and there's some data behind it, but it's probably too early to say whether it's conclusive or not,
that these systems do have an inherent pull towards the political right because right-wing thought and right-wing
politics in whatever country, in Western countries, in Myanmar, in India, does tend to be much more
concerned with preserving the status of the majority demographic group, which is why you see
Facebook, Twitter, and YouTube, for example, playing an enormous role in activating
the political right in India, which is much more Hindu nationalist. Obviously, that has nothing to
do with Western politics. It has everything to do with the Hindu majority feeling, not correctly,
but feeling threatened by the presence of a Muslim minority. And you see a real amplification
of that sentiment and of anti-Muslim
conspiracy theories on Indian social media. Now, does that mean that the platforms, by their nature,
by their focus on identity threat, outgrouping, have an inherent tilt towards the right? It's
possible. It's definitely possible. It's a compelling theory. But I would say that I think
we're at a point where the research into that is ongoing.
And I think we'll get a better sense of it in the near future, probably.
What made me ask my question is you discussed two American social psychologists, sometime collaborators, named Billy Brady and Molly Crockett.
And each of them starts getting into this because they discover that they like getting angry on social media. Molly Crockett is enjoying being enraged at a story about mistreatment of
border crossers along the U.S. southern border. And then she discovers it's a story from when
Obama was president, and she has to sort of check her assumptions.
Right, right. Yeah, no, it's a good point. It is important to remember that these effects,
even if we might say in the aggregate, they have more of an impact in one political group or
another, they affect everybody. If you are on the platform, it is having this pull on you. I
found in reporting the book that I noticed it having that effect on me. And I'm someone who
thinks of myself as a very careful
social media user, very aware, very sensitive to its effects. But I was so horrified by a lot of what I found in the research on its universal effects that I stopped using it.
And I noticed an immediate change. This is not just me. This is something that research has found
shows up over and over again. People who are somewhat regular users of the platforms, as I was, quit the platforms, their affect really
changes. They become less prone to outrage and moral outrage, not just when they're online,
but generally in their life, they become less prone to polarization and to dividing the world
between in groups and out groups. And I guarantee if you
took a break for a few weeks, you would start to notice it too, regardless of whether you're on the
right or the left. Well, you're preaching to the choir here because I, after a false start in 2016,
I quit Twitter in 2018. And it was really hard. If you work in communications, of course,
you persuade yourself that you need to use that to get your message out and to engage with people. And then you actually describe one study where people charged a fairly
high sum of cash to get off of social media. And then they reported that they felt better,
even though they had been that reluctant to engage in the first place.
Yeah, that study was so fascinating. Like you said, these researchers went to a bunch of people
and said, how much would we have to pay you to take, I think it was like six weeks off or four
weeks off social media. And this is a kind of study they use to basically determine people's
threshold for how badly they like something or how much they like something, how much they are
willing to sacrifice to hold on to it. And relative to what these studies usually find, it was a relatively high sum, like 150 bucks, something like that, to get people to turn it off. What was wild about it, as you say, is as soon as people did that, especially the people who were
more resistant to turning off social media, those are the people who reported the greatest increase
in happiness and life satisfaction
and who were the likeliest to say
that they were not going to return to social media,
which is one of many very strong indicators we have
that people don't use it because it makes them happy
or because it enriches their lives,
even though sometimes it can do both of those things,
that they use social
media because it is addictive, and it's designed to be addictive. And we know now that it is
chemically addictive, it's physically addictive, it creates a reaction in your brain that is along
the same pathways as substance addiction. And, you know, I found the same thing, it's hard to
turn it off. And when you find yourself reaching for your phone over and over again, that's not just boredom and it's not just a short attention span. It doesn't
feel like the same urge maybe as reaching for a cigarette because we don't think of this as
something you were addicted to, but that is absolutely on a chemical level, exactly what it
is, is the pack of cigarettes in your pocket that you're reaching for every time you go to pull up one of those apps. So what do we do about this? I mean,
in the end of the book, you say this might not be the sort of thing where the answer is, boy,
we got to be careful, or somebody should really tweak those algorithms. You go a little further.
Yeah. So I talked to a lot of people who study this from outside the industry, people who are
in the industry, whistleblowers.
I'm still a reporter at the New York Times, so I don't make policy recommendations myself.
That wouldn't be an appropriate thing for me to do.
But the recommendation I did hear from a lot of people that I spoke to was that the engagement
maximizing features on the platforms, and we predominantly associate that with
the algorithm, but it also means likes, it also means the share button, the retweet button,
means the up next feature on YouTube, that these are things that can only be harmful in the
aggregate, and that they have such a profound and often profoundly negative effect
that the only real way, this is what some people argue, to get back to a social media that brings
us the good but is healthier without a lot of the bad is to turn off those engagement maximizing
features and to bring us back to a version of social media that we did actually used to once
have, pre, like, 2008, 2009. The platforms were much, much more neutral, did not have a lot of these features, and they were much less lucrative, but they also weren't driving genocides, they weren't driving national or global-scale polarization or outgrouping or extremism. Now, how do you actually get the companies to do that? I mean, I live in Washington, D.C., and that's a very big conversation there. And the American government is, for better or worse, probably the only body with any power to get change from the companies, and is finding that there's basically no stick big enough. But I know, because I've heard from a lot of people
in Congress, especially after the book came out, that there are some pretty active efforts
underway to try to figure out what are the legislative answers to dealing with technology
that is increasingly understood in Washington as inherently harmful to politics and society.
Well, that is a debate that we're going to have to follow closely,
and it's one that we'll understand better thanks to your book.
The book is called The Chaos Machine,
The Inside Story of How Social Media Rewired Our Minds and Our World.
The author is Max Fisher.
Max, thanks so much for taking the time to talk to me.
Paul, thank you. It was great.
Thanks for listening to The Paul Wells Show. The Paul Wells Show is produced by Antica,
in partnership with the National Arts Centre and the University of Toronto's Munk School of Global Affairs and Public Policy. It's published by the Toronto Star and iPolitics.
Thanks to our founding sponsor, TELUS, and our title sponsor, Compass Rose.
Our senior producer is Kevin Sexton.
Our associate producer is Hayley Choi.
Our executive producer is Lisa Gabriel.
Stuart Cox is the president of Antica.
If you're looking for me on Twitter, don't bother.
I got out of there before it was trendy,
but I do have a subscription newsletter
where you can find all my political writing
at paulwells.substack.com. If you're enjoying this show, please tell a friend. We'll be back next
Wednesday.