The Rich Roll Podcast - Max Fisher: How Social Media Rewired Our Brains (+ Our World)
Episode Date: September 12, 2022
Max Fisher is a New York Times investigative reporter, Pulitzer Prize finalist, and author of the vitally important book 'The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World', a referendum on Big Tech and the social media platforms that have come to monopolize our attention, foment division, and fracture our world. Today's conversation covers Max's journey to reporting on social media and politics, the specific ways social media changes its users' morality, and how algorithms can make users more prone to violence. We also dive into cutting-edge research on how social media inculcates a super-exaggerated feeling of outrage and intolerance, making users more likely to believe misinformation, and the complicated role of free speech in it all. Finally, we discuss the implications of data harvesting human behavior, and why social media addiction is so terrifying. This is an admittedly scary but crucial conversation about how social media's reach and impact run far deeper than we have previously understood. I hope you learn as much as I did, and adjust your screen time accordingly. Peace + Plants, Rich
Transcript
The Rich Roll Podcast.
Social media is changing our minds.
Social media acts like a drug.
My guest today is Max Fisher.
Max is a writer for the New York Times, and he's
the author of a brand new, vitally important book called The Chaos Machine. 80% of Americans are
taking a drug 15 times a day without realizing it's a drug. It's a referendum on all of the
damage currently being wrought by social media. Social media has much more power to affect how you think and how
you feel than you might believe. I've become increasingly convinced that the impact of social
media and technology on our lives and the lives of our children is truly one of the great existential
issues of our time. Just spending time on it, you inevitably end up not just serving the platform
when you're online, but in your offline life, your emotional balance changes. I'm very excited
to share with you what I think is a very important conversation. We're going to get right into it in
a sec, but first. We're brought to you today by recovery.com.
I've been in recovery for a long time.
It's not hyperbolic to say that I owe everything good in my life to sobriety.
And it all began with treatment and experience that I had that quite literally saved my life.
And in the many years since, I've in turn helped many suffering addicts and their loved
ones find treatment. And with that, I know all too well just how confusing and how overwhelming
and how challenging it can be to find the right place and the right level of care,
especially because unfortunately, not all treatment resources adhere to ethical practices.
It's a real problem. A problem I'm now happy and proud to share has been solved by the
people at recovery.com who created an online support portal designed to guide, to support,
and empower you to find the ideal level of care tailored to your personal needs. They've partnered
with the best global behavioral health providers to cover the full spectrum of behavioral health disorders, including substance use disorders, depression, anxiety, eating disorders, gambling addictions, and more.
Navigating their site is simple. Search by insurance coverage, location, treatment type, you name it. Plus, you can read reviews from former patients to help you decide. Whether you're a busy exec, a parent of a struggling teen, or battling addiction yourself, I feel you.
I empathize with you.
I really do.
And they have treatment options for you.
Life in recovery is wonderful, and recovery.com is your partner in starting that journey.
When you or a loved one need help, go to recovery.com and take the first step towards recovery. To find the best treatment option for you
or a loved one, again, go to recovery.com.
All right. So without further ado, here we go. This is me and Max Fisher.
So excited to have you here today to talk about super important matters.
You've written this incredible book,
The Chaos Machine.
I think this is gonna be a huge book.
I think it talks about
some of the most important issues of our time.
I mean, the terrain of your reporting
and what your book canvases,
I've really become increasingly convinced,
really sits atop the great existential problems
that we face as a race.
It's the threat of our time.
And despite the fact that there is plenty being written
about this subject matter,
a lot that's being said about it,
I still think it's an issue or a set of issues
that are underappreciated, undervalued, underrated
in terms of their implications for the future of humanity.
So I applaud you for writing this book.
It really is like the most comprehensive accounting
of what's going on, like from this forensic perspective
of how the internet is warping our minds,
driving us apart from each other,
increasing radicalization and the like.
And so I guess the first question I have for you is,
it's an interesting subject matter for you to tackle
because you're not a tech reporter,
you're like a foreign affairs reporter,
international affairs, et cetera.
So talk a little bit about what brought you into this world
and why it captured you.
So to spend years writing, putting this book together.
It wasn't something that I initially,
I will be honest, took seriously.
I thought of social media as kind of this thing
that was off to the side.
I know that after 2016,
there was this kind of question out there about,
did social media have something to do
with the Trump phenomenon and with the election?
And I, maybe like a lot of people,
somewhat naively thought, well,
maybe it's kind of a neutral amplifier
for things that are already out there.
Maybe it exaggerates certain tendencies a little bit,
but it's just the internet, it's just a website.
How influential could it possibly be? And it started to change for me about a year after Trump's election when I was
in Myanmar to report on the genocide there, which was this really horrible and very sudden explosion
of violence between the country's ethnic majority and an ethnic minority that partly was led by the
state, but a lot of it was also ground up and spontaneous.
And being there on the ground,
I was far from the only person to notice
that social media seemed to be playing
some kind of a huge role in what was happening.
In every conversation you would have with someone,
whatever side they were on,
they would always bring it back to social media.
And even the United Nations, which still blows my mind,
one of their top officials later said that Facebook had played a determining role
in the genocide, not just in hosting hate speech, not just in being a platform that extremists have
manipulated, but in actually driving it by what their systems promoted and how effective they
were at inculcating so many people in that country into this point of
view. And that started to really get my mind going because it felt in a lot of ways, somewhat similar
to what had happened in the United States, this kind of like social upheaval that seemed to be
linked back to this technology. And because shortly after that, I started noticing that
pretty much everywhere I went as a reporter for the Times, you know, traveling around to different parts of the world, I would hear really similar stories to the Trump phenomenon or to Myanmar that always seemed to link back to social media.
And it would be smaller in scale.
It would be a village that had erupted into violence.
It would be a far right figure who had suddenly jumped from the fringes to completely dominating their local society
based on social media.
And that was when I started to think,
this is more than just a bunch of websites.
It's more than just a bunch of apps.
It's tapping into and it's changing something
really powerful.
And these big instances that we're seeing
are probably just the tip of the iceberg
because if it's something whose effects
are being cycled through all of us
at this very individual granular level,
then these changes are probably way deeper
than we appreciate.
And that became this four or five year quest
to try to understand as rigorously and empirically
as I could what that actual effect is.
Was Myanmar the first international flashpoint for this?
I mean, it sort of belies the idealistic promise
of social media as borne out by the Arab Spring, right?
It's like a mirror reflection of that, the opposite of it.
But there is this sense
or sort of a prevailing conventional wisdom
that this is predominantly
an American problem, right?
Like just last week, the other day,
Mark Zuckerberg on Joe Rogan's podcast,
he was asked about polarization and his response was that
there was something uniquely American about it.
And your book is this global adventure
where you basically put that lie to the test.
And it's very clear that this is, although it's, you know,
manifesting and metastasizing in the United States,
it's an international problem.
Like you're in Germany or in Austria, you're in Myanmar,
you're in Sri Lanka, like all of these places
where you're seeing incidents of animosity and violence
and polarization that can directly be tracked back
to the algorithmic power of these, you know,
mega social media platforms
and how that's driving behavior.
You really did read the book.
I did, yeah.
Most of it, I told you, I'm about halfway through it.
I know, well, it's very flattering to hear it,
that it resonated.
So to my great shame, there were actually instances
that I covered well before 2016,
that in retrospect were exactly what you're saying,
were these very similar patterns of polarization
and turning society against itself.
There was this case in 2013 in India,
so a hundred years ago, basically,
that looked a lot like the Trump phenomenon
and Myanmar combined,
where it was rumors that started
with a couple of very small accounts on Facebook
about Muslims and Hindus potentially threatening each other
in one very remote part of the country.
And the system, even that far back
when it was much more rudimentary
and much less powerful than it is now, identified that rumor as something that was going to really drive engagement and that was really going to excite users and not just glue them to the platform, but get them to share the rumor and to get more people clicking.
And this cycle that is now very familiar, of misinformation leading to hate speech, leading to incitement,
every step of it being promoted
very assiduously by the platform
led to this explosion of violence.
And I think tens of thousands of people
were pushed out of their homes. Had that happened now,
we would very readily identify
this as something driven by social media,
but it was a time, like you said,
just after the Arab Spring,
when we still thought of social media
as something that was going to bring freedom and revolution.
And the Obama administration actually stepped in
to tell the Indian government,
which had blocked a bunch of the websites,
that you need to unblock them
because we believe in freedom and freedom of speech,
which was a perfectly reasonable position 10 years ago,
but now we would maybe look at a little bit differently.
So maybe Mark Zuckerberg believes that polarization
is something that is just happening in America
and what could his platform possibly have to do with it?
But he has been presented,
including by Facebook's own researchers
with reams and reams of evidence
that its systems are not neutral
and that it drives this behavior.
Sure, I mean, the book opens with this whistleblower
that you named Jacob, who becomes kind of a harbinger
of what's to unfold in terms of the scale of the problem
and the degree of difficulty in tackling it.
So talk a little bit about that.
I mean, that seems to also be,
you rooted this guy out and he becomes kind of a bit
of a slight protagonist in terms of how you tell the story.
He actually found me and he like a lot of people
in the book who are the whistleblowers,
whether they're in Silicon Valley, outside of it,
the people who study the platforms,
he was someone who started as a true believer.
He lives in a country where,
I'm trying to think about how much I can say
about where he lives,
cause I don't want him to get identified, where it was very important to him that social media
was there to bring the freedom and democracy that people in Silicon Valley were promising.
And he really believed in that. His job was to click through posts on the platform to say,
this breaks the rules, this doesn't break the rules, that Facebook and the other platforms
employ thousands of these people to basically moderate and kind of manage the platforms. And he was really worried because he saw that the
hate and the lies and the misinformation seemed to be getting worse and worse all the time. And
he had a fuzzy sense in the way that a lot of us did, but not proof that the system itself
was driving it. And he also had this giant stack of documents
that were Facebook's internal rule books
for how to govern the platform.
And he passed those on to me because he said,
they're nonsense, they're gibberish.
A lot of them don't make sense, they don't fit together,
there are mistakes in them.
And it's not that the rules are nefarious.
When I wrote a story for the Times on them,
I think there were a lot of people looking for,
well, they're trying to control free speech
and trying to control politics.
I think they're not.
They're really just trying to tamp down PR crises
in a lot of cases.
Some of the documents say explicitly that that is the goal,
but the attention and care given to them
was at an alarmingly low level.
And so that was why he wanted me to broadcast them out
so that the people in the company who he thought
and really believed wanted to do the right thing
would come in and save the day and fix them.
Right, and so that was that article that you wrote in 2018
for the Times, right?
Where you kind of exposed all of this,
these manuals that are almost impossible to consume,
let alone memorize that are often contradictory
in terms of their rules about how to police this stuff.
And then you have these moderators
who are working in sort of boiler room situations,
having to make snap decisions in mere seconds
based on posts, like it's an impossible task.
This is not the way that you're gonna be able to combat
these problems.
Eight seconds per post was what he
had to do. And not only can you not combat the problems, but they're janitors trying to clean
up this mess that Facebook is making faster than they can possibly clean up. They're kind of like
air fresheners outside of a giant factory waste disposal dump. They could have more and more
moderators and better rule books,
but the problem was still
what was coming out of the platforms.
Right, so maybe steel man your case for the problem.
Like what is the problem?
Oh man, what is the problem?
I mean, in some level,
I think the most sympathetic case
that you can make for the platforms
is that on some level, the problem is human nature.
And an enormous amount of the book ended up being focused on something I did not think it was going to be focused on, which are the frailties in our nature, basically. And some of those are things
that evolved in us, some of them are ways that we constructed society to deal with our own
impulses and instincts. And the basic problem, I would say,
is that these social media platforms
are designed above all else to maximize engagement,
which means just whatever they have to do
to get you to spend more time scrolling,
tapping and sharing
so that they can sell more ads against you
and so that you will hook in other users
that they can sell ads to.
The ruthlessness of their systems,
the scale of their systems,
and the fact that they work by playing on
our most basic social impulses and our social needs
have made them incredibly powerful
at bringing out instincts that we have tried to suppress
in what you would call modernity,
and at bringing out things that we know are harmful for us,
but that are tough to resist when we have this platform
that is giving us a reality where it feels like
it's something that we have to do,
it feels like it's something that's necessary
or pleasurable to do.
Right, so we have these huge companies
powered by incredibly complex algorithms
that even the people who manage them
don't understand how they work.
And these are driving people towards content,
rabbit holes, et cetera, that are forming opinions
and often translating into behaviors in the real world.
And you analogize this to HAL in 2001,
where man has made a machine
that he doesn't quite understand, and the machine destroys man.
And we're in some version of that dystopic reality
that's unfolding in real time amongst us.
And I think there are, you know,
people like yourself
who are sounding the alarm bell,
but I think there also is this sense,
well, that's the internet.
These are, you know, sort of fringe people.
This isn't what most people think.
Maybe that's true.
Maybe that isn't,
but you can't deny the real world ramifications
that we're seeing getting played out through politics
and just the decline or the denigration
of public discourse and decorum across the world.
Right, and something that I really wanted to
emphasize in the book is that the kind of extremists,
the QAnon, stop the steal, the extreme edge cases,
that's a really important part of the story.
But I think in some ways, the more important part and the harder to confront part is the subtler, but still very consequential
ways that it affects those of us who might think we're separate from that, that we're not those
people-
We all think we are.
Right, exactly. We all think we're immune from it.
Right. We all think that we're immune to it. We're smart enough to understand it. We see it.
Or maybe it affects us a little bit in the
sense that we know some habits are not healthy for us, but we do it anyway. And I, even someone
who went in having made the decision to write a book about the harms of social media, so who could
possibly be more overconfident in their own knowledge of it? I was shocked to learn about
the ways that it was affecting me as someone who thinks they were one of the good ones
and just me specifically,
because it really, how do I put this?
It is something whose influence can be so invisible
and it's designed to be invisible.
You open up your Instagram or your Twitter
and you think you're seeing your community.
You think you're seeing your friends or your family,
or you think you're seeing the voices you follow,
the people you want to hear from.
And when you interact with them,
the feedback that you get in terms of what travels,
what gets likes, what doesn't get likes,
you think that that is feedback from real people.
But these are overwhelmingly the choices
of these automated systems that are deciding
what's going to get attention
and what's not gonna get attention,
because the amount of content that's on social media is huge
and the amount of content you can see is tiny.
So it can tell almost any story at once
by what it pulls out and what it presents to you
and how it presents it to you.
And it smuggles its choices through your community
in a way that feels neutral, but is not.
And in a way that has much more power
to affect how you think and how you feel
than you might believe
because we derive so much of our emotional state
and so much of our sense of how to behave
and even what we think we want internally
or how we feel internally
from the cues that we take from people around us
or that we think are from the people around us.
And that's a big piece in the book,
like our group identity and how we kind of gauge status
amongst our peers, et cetera.
And when you look at what's being served up to you,
even understanding that that choice is being made
by an algorithm and that algorithm is selecting
for engagement with the overriding goal
being the platform maximizing as much of your time as possible.
And also understanding that what drives engagement
often are things that are at odds with our wellbeing,
things like moral outrage, outgroup dynamics,
all of these sort of psychological,
you know, kind of buzzwords and keywords
that drive engagement, that are definitely understood
by these, you know, equations, correct,
and the people that design them.
But I think it's also important,
and your book makes, you know, a big point of this to understand that this is not
a reductive situation.
Like we can't just say, well,
we need to change the algorithms or we need to change,
like this is systemic.
It's baked into the very origin story
of Silicon Valley altogether.
And it was really fascinating as somebody who's, you know,
I used to live in that area and I'm familiar with, you know, many of the people and the places that you describe in the kind of formation of this industry altogether and how that makes solving this problem so difficult because it's bred into the DNA of how these companies were built from the outset.
So talk a little bit about that.
I think it's super interesting. So, I mean, like we were talking about beforehand, in some ways, it's a story of
capitalism. And I mean, that's when you trace it back, that's in some ways the original sin,
where these companies were built on this very specific financial model that turns out to
determine basically everything about how they work and what they do to you. And that financial model is called, I mean, of course, you know, this is called venture
capitalism. And the idea is that someone comes in, an investor, and they give a kid, because we're
talking about the internet era. It used to be they would give someone who designed a widget
a bunch of money and say, go build a factory and sell those widgets. And they're, you know,
semiconductors, something like that. So it might be $10 million and it takes 20 years
before it turns around and makes money. So you have to be really thoughtful about what's your
business plan. How are you going to bring in revenue? Who are the customers? That changed at the
start of the internet era, where the way that the model now worked for venture capitalists is you
wanted to find a kid with a website.
And the reason that you wanted to find a kid with a website is that you could give them very, very little money, a really, really small investment because it's basically free to start a website.
And if they got enough users, whether or not they made a dime in profit, and some of the biggest success stories in Silicon Valley never made any profit whatsoever like Netscape, you could then sell it to a bigger company or you could go public. And then you would
have what's called an exit where your initial investment for $10,000, now it's worth a million
dollars or your initial investment for $300,000 is now worth $30 million. And what this created
was this incentive where if you are an investor looking for the next big hit,
or if you were the kid who's trying to start the next big tech company,
what you want to get is as many users as possible, as quickly as possible.
And you do not care about what your business model is because that's someone else's problem
after Oracle buys you for half a billion dollars.
And that leads directly to the social media companies
because their whole thing is owning your attention.
That's the product that they're selling
because more attention means more users,
which means it's more attractive when you go for that exit.
But the problem is that attention is finite.
There are only so many people in the world
and we only have so many seconds per day.
So it created this arms race for how can, I mean, first it was MySpace or these other companies
that we've never even heard of now, and then Facebook and Twitter and YouTube, how can they
compete to get that next second, that next flick of an eyelid that they can sell ads against.
And as they start putting more and more money into this,
because now the entire economy of Silicon Valley
is fighting for your attention,
these systems get more and more sophisticated.
And as you said, even the engineers who are creating them,
because these are automated, AI-driven systems,
they don't even know how they're working
or the choices that they're making.
But because they are hiring really, really good people
because they could pay whatever they want,
they become incredibly powerful and effective
at hooking people in.
Yeah, but even beneath that,
like going another layer deeper
into the origin story here,
you know, I wanna get at
like kind of the philosophy and the psychology
of these people which tracks back all the way
to the defense contractors
who fled the East coast,
where they could kind of innovate in a way
that wasn't possible given the constraints
of the way business was being done on the Eastern seaboard,
which created a free-spirited, kind of iconoclastic mentality
that with the advent of the semiconductor chip
and everything that followed bred a sort of sensibility
of us against them and we're rewriting the rule book.
We're creating a revolution of ideas.
We are free of these traditional constraints.
Fast forwarding to that famous Apple ad
where the hammer is being thrown through the screen and that Orwellian kind of thing,
like it's, fuck tradition.
And we're creating a new way of being in the world
that is liberated in the truest sense of that definition
from these traditional ways of being, I guess.
I don't know if I'm saying that correctly.
No, no, that's exactly right.
I misunderstood what you were asking about, but yes.
No, I mean, not only is what you said important,
but I think like that core philosophy
makes solving the problem so difficult because this,
whether you call it libertarianism
or being an iconoclast,
there is a certain hubris that's kind of stacked into that, that we know better than you, right?
I guess that's the top-level way of saying it.
It's a we know better than you that originates in,
like you were saying,
the very first companies in Silicon Valley,
which was basically like,
I think it was like peach packing plants,
like not very long ago, like 60 years ago and it was this total
backwater that a couple companies moved out to because they were run by people who were
for lack of a better word, assholes or cranks, and they kind of couldn't make it back east. This was
Shockley Semiconductor. And this sounds like this old name, this must be, you know, 100
years ago, completely removed. But the thing about the Valley is that it is so small
that just a couple companies that set the culture
in the 50s and 60s, Shockley Semiconductor
and Hewlett Packard,
which are both these companies run by these iconoclasts
where they said, we're gonna empower the engineer
because the suits back East
who were trying to tell us what to do,
you know, fuck them.
We know better because we know how to do math
and we're good at math and science.
And that's what really matters.
These people set the culture that still rules today
because the people who worked at Shockley and at HP
went on to go found the computer companies, basically.
And the people who founded the computer companies
acted as these venture capitalists.
So these are the people who are picking the winners
of the next generation,
designing the companies of the next generation.
The engineers become the funders.
Exactly.
The engineers,
and it's very closed.
And it's funny
because it's this enormous
center of wealth,
but you're dealing with
like a hundred boldface names
that have selected
and guided all the major winners
from that first
semiconductor generation
up through the computers,
up through the internet era,
and then up through social media. And they have been passing on these ideas. And part of it
also comes from the venture capital model, which says, I'm not going to fund a traditional company.
I'm going to go find an engineer who makes a really good widget, and I'm just going to give
him a ton of money, and he's going to know what to do best. And as an engineer, you internalize
that and you say, well, I'm the best at making widgets. That means that I should also be in charge of building the company, designing the products. And what happens is the
products, which start as just technical stuff, when you're making semiconductors, if you have a
big ego, that's fine. What's the danger in that? The products become the infrastructure that run
the world, basically, with computers first, and then the internet, and then with social media,
it becomes the infrastructure that sets our social mores and it sets our political norms. And when you have
that engineer mentality, which still holds that says, if I, as an engineer think that this is the
right way to run things, then who are these other people to tell me? And, what was this line
from Mark Zuckerberg? I think society is an engineering problem.
Exactly, right.
Society is a problem that engineering can solve.
Right, right.
And he's said before that I bet there's a math equation
for how we balance the,
or the balance of relationships in our life
and what we care about.
And it's a little scary when you have 23-year-old,
24-year-old Mark Zuckerberg, around the Arab Spring,
saying we are going to set out
to fundamentally rewire society from the ground up
because we're the smartest ones
and we're the people to do it.
Right, so you have that sort of engineering mentality
and at some point that intersects
with social psychology, right?
And so much of this can be tracked back to BJ Fogg's class.
What's the lab called that he-
Persuasion lab.
Yeah, so all of these students,
Tristan Harris being among them,
Kevin Systrom, founder of Instagram,
like a lot of people took his class
that went on to found these companies.
And he was really steeped in this social science
of how to drive engagement
and how the human mind operates
to kind of create products and tools
that will basically addict people.
It's so crazy to read the transcripts of conferences
from like 10 years ago,
which is kind of the peak of this era
of like the persuasion lab and this guy Nir Eyal
who wrote this book, I think it was called Persuasion,
that was basically explaining how to implement these ideas,
how open a lot of these consultants in the Valley were
and a lot of the CEOs in the Valley were about,
we wanna design products
that are going to be deliberately addictive
and not just addictive in the sense that it's like a game
and it'll be fun to use, but chemically addictive.
And that we are going to exploit dopamine
and that we're going to model our products
on casinos and on slot machines.
And that's why your phone,
it looks like a slot machine very deliberately
where it has these little buttons and these widgets
and it makes the little like vibration haptic feedback
when you use it.
And that is designed, like Pavlov's dog, to train you to associate certain feedback with
using your phone. And there's one case from one of these lectures that was, you know, they would
never say this now because they know how bad it sounds. But 10 years ago in the Valley, it was
very common. There was this consultant who was teaching companies how to do persuasion, which means getting
your customers to change their behavior.
It sounds a little scarier when you define it.
This example that he used was that Facebook creates a sensation when you go onto it.
He uses this hypothetical user, Barbara.
Barbara goes onto Facebook because she wants to connect with her friends and family.
And Facebook creates a simulation of that
that actually feels like it is really exaggerated in scale.
And it feels like a much more extreme version
of that connection.
Because when you connect with someone in the real world,
you have, you know, when you chat with people
in the real world, you're dealing with maybe five people
at once and the feedback that you'll get
will be kind of implicit.
When you go onto social media,
if you post something, you might get positive feedback from 20 people or a hundred people or
a thousand people. So it chases this desire for human connection, then serves you an exaggerated
version of it. And it serves it to you at this incredible convenience so that when you want that
feeling, you'll pick up your phone instead of going to find a person. But the thing about this,
that's really pernicious
is that it doesn't actually deliver
that feeling of connection.
And this consultant was very open about this.
He said that what it gives you is a dopamine boost
that you get that initially makes you think
that you were satisfying the social urge
to connect with other people.
But in fact, because it does not fulfill it,
you will continue to be lonely and crave that. So you would go back to Facebook more and more. And then over time,
Barbara, this woman who we invented, will learn to pick up her phone and to really chase this
feeling of connection through her phone that she will never get by design so that she'll keep
picking it up more and more, which I think resonated with me when I heard that.
Yeah. I mean, that is dark.
It is. It's like so dark. I mean, you're essentially setting up a digital environment
that is equivalent to chasing the dragon. Like you're trying to get high. You're never going to
quite get that high, but it's luring you back time and time again. I really think that if there's one thing
I want people to take away from this book,
it's not about the political ramifications.
It's not about regulation with tech companies.
All that stuff is really important.
But I would just really like for people to take away
that social media acts like a drug
and to think of it like a drug.
And I think when you start to see
that this is something that doesn't just
deliberately, by design, addict you like a drug, but that it also changes your
behavior and it changes the way that your mind works akin to a drug. It makes it much easier
to understand because it's a drug that hides itself because you don't realize that that's
what it's doing. You think you're just going on and you're talking to your friends, but you're
actually not. You're interfacing with this technology.
And it's also a very pernicious drug
because whereas say alcohol will play on
or distort your mood, your sense of balance,
it's mostly a hormonal physical response.
Social media plays through social impulses
that we are not used to thinking of as affected by drugs,
but it is something that absolutely changes that.
And it is a drug that each of us takes,
I think the average is like 15 times a day,
every single day.
I think American life makes a little bit more sense
if you think that 80% of Americans
are taking a drug 15 times a day
without realizing it's a drug.
Things start to kind of click into place. Yeah.
And I think that if you see it that way
as a consumer trying to navigate life that is, you know,
dominated by social media,
it becomes a little easier to understand
what's the difference between what's coming from me
and things that I actually want or emotions or urges
that I'm actually feeling and what's being
instilled in me by this drug that's delivered to me by the largest companies in human history.
Sure. And it goes back to that point you made earlier about how we all think that we're
independent minded and we're not prone to being influenced in that way. Like I'm strong enough,
I'm of sound mind and body, I can resist this, or I'm not gonna get lured down some crazy rabbit hole
because I'm sentient and independent in thought
and all of the like.
But as movies like The Social Dilemma
did such a good job of showing,
we are powerless against the amount of science
and technology that's gone into the addictive mechanism here.
So there were two things that you said,
there's the addiction piece,
which I think we've explored.
I've had plenty of conversations on this podcast
with people like Johann Hari about that very thing.
I'm more interested in this behavior piece
that you alluded to, like it is changing our minds.
It's literally changing how we think.
And of course, when you change someone's mind,
it's going to change their behavior.
And so there's tons of interviews that you've done
in this book and people that you've spoken to
about this very thing.
So maybe like help us understand that
and flesh that out a little bit.
Yeah, let me give you two examples.
One that is kind of small
and one that's a little bit bigger.
There is this very specific type of sentiment
called a moral emotional sentiment.
And that's a complicated way of saying
any emotion or sentiment that touches
on both an emotional feeling
and also a moral component, meaning something
that implicates what we think of
as social mores or right and wrong.
So if I said that you were a really kind person
or I said that you were a liar, those are moral emotional sentiments. Another way to think of it is just
outrage. Outrage is the most powerful moral emotional sentiment. And outrage is different
from anger because it means I think that you have transgressed a wrong against the community.
And I think that you've transgressed a wrong against the social norms that hold us together.
There was this fascinating experiment that I write about in the book where these researchers tried all sorts of different words on a fake Twitter platform
to see what were the words that were going to increase engagement. And basically every kind
of sentiment was much closer to neutral than they thought. Angry words didn't seem to have
much of an effect. Left-leaning or right-leaning words, sad words, happy words, even words like car wreck, like exciting words didn't have much of a
difference. But every moral, emotional word had a 20% increase in the reach of any tweet that
contained it, which is huge. That's nuts. Although if you have spent time on Twitter, that's something
that you very quickly learn is that that is the kind of sentiment that travels.
But the piece of this experiment that really stuck with me because honestly it hit a little
close to home is that they would take these research subjects and they would try to poll
them and basically gauge their sense of internal outrage as a person, how prone to outrage
were they at people
who they thought of as their outgroup?
You know, if they're liberal,
people they thought of as conservatives,
and vice versa, you know,
that can be a division by race, by religion,
and people who had a very low propensity to outrage.
If the researchers nudged them to, say, you know,
send a couple tweets with outraged words in them,
what would happen is that subject would send those tweets and they would get a lot of engagement because that's what the platform does. It
identifies those as engaging and pushes them out to a lot of people and gives you a social reward
in terms of lots of likes and retweets. The subject would internalize that and first start to
send more tweets like that just to chase the high. But then soon enough, that urge would start to
come from within. The research subject would send outrage tweets, even if they didn't get a reward,
because their nature had fundamentally been changed. And what they found in the polls and
studies they would take with these individual people is the outrage they felt internally,
even when they were offline, had increased. So this training effect of the platforms in terms of what works is so powerful that just spending time on it, you inevitably
end up not just serving the platform when you're online, but in your offline life,
your emotional balance changes. It's really terrifying when you think about that.
Yeah.
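As a rough illustration of the feedback loop Max describes, here is a toy Python simulation. The roughly 20% reach boost for moral-emotional content comes from the study mentioned above; the base reach, learning rate, and everything else are invented assumptions.

```python
import random

random.seed(0)

BASE_REACH = 100      # impressions a neutral post gets (invented)
OUTRAGE_BOOST = 1.20  # ~20% more reach for moral-emotional content (per the study)
LEARNING_RATE = 0.05  # how strongly extra engagement reinforces the habit (invented)

def reach(outraged: bool) -> float:
    """Simulated platform: outraged posts travel further."""
    return BASE_REACH * (OUTRAGE_BOOST if outraged else 1.0)

p_outrage = 0.10  # the user's starting propensity to post outrage
for day in range(200):
    if random.random() < p_outrage:
        # The extra likes and retweets reinforce the behavior that earned them.
        extra_reward = reach(True) / BASE_REACH - 1.0
        p_outrage = min(1.0, p_outrage + LEARNING_RATE * extra_reward)

print(f"propensity to post outrage after 200 days: {p_outrage:.2f}")
```

Even a small per-post boost compounds: the simulated user ends up noticeably more outrage-prone than they started, which mirrors the internalization the researchers measured in subjects even when they were offline.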
The fact that something performative
could become so internalized
that it changes your sensibility and your identity
and how you think and how you behave
is essentially what you're saying, right?
I can't remember, what was the phrase that was used?
It was something really benign,
like 'the cat chases the dog' or something like that.
And then you'd say, 'the cat chases the angry dog,'
or you would start inserting like those
sort of moral outrage keywords.
The quick brown fox.
And then gauge, you know, engagement
or how that tweet would traffic
and it would go way up, right?
But the fact that somebody could,
from an addictive point of view,
or from a external validation point of view,
start tweeting outrageous stuff for attention,
then over time, start to believe those sentiments.
And then obviously, if you believe those things,
you're gonna behave differently.
That's what I try to tell people. When I talk to people
who spend a lot of time online, you can always tell,
and I used to be one of these people,
that they're a little bit divided between thinking like, well, I know it's not good for me, but I
like being on it, or I think that there's a lot that's good about it. The thing that I would tell
them is, look, even if you discount the effect that your tweets have on other people, even if
you discount the effect that having these kind of emotional high valence tweets will have on the public square
and broader discourse, you are training yourself.
You are turning yourself into the rat hitting the lever,
and you don't realize it, but you are changing the way
that you think and feel in order to serve these companies
that already have plenty of money.
Yeah, it's wild.
We were talking earlier today actually kind of about some
of the comments that we get.
Like when the show reached a certain level of scale,
like you can't escape negative criticism,
like it's just part of the deal, right?
But I've noticed the tone and tenor of the criticism
getting darker and you know, it's words like, you know,
outrageous or I'm so disappointed in you,
like you didn't talk about this and you talked about that.
Like, I'm sorry, are you really that disappointed?
Like, is this that meaningful?
Like the kind of level of negative acrimony
is at a pitch that feels different, new and unique,
but it makes sense, right?
The thing that really stayed with me or struck me
was the story about Tay,
the Microsoft AI trained Twitter bot and how it,
because like this, we're getting into this,
the conversation about radicalization,
like this is how it begins, right?
So tell that story, because I think it really
illustrates and underscores like how this can happen
and how quickly it can happen. This is
a computer example, but then we can talk about how this happens in human beings.
So this was a chatbot that Microsoft designed where it was a robot run Twitter account that
would talk to people. And there've been a lot of variations on this, but what was different about
this one is because it was on this platform that is the largest open forum in
human history. It was absorbing a huge amount of data in the form of people interacting with it
and in seeing what engagements did well and what didn't. And within, I don't remember what it was,
it was like a few days, like very, very quickly, this bot that was just meant to have friendly
conversations with people had turned into an actual neo-Nazi.
Right, it was literally tweeting out insane Nazi aphorisms.
Yes, yeah.
And some of that was like-
And they had to pull the plug on it.
Like right away.
Yeah, and some of that was trolls basically
who knew how to kind of push the buttons
to get it to say crazy things.
Was that coming from 4chan then?
Like, let's see if we can radicalize this bot
and how quickly we can.
I'm sure some of that.
And that was the kind of defense of, like,
look, some bad actors distorted this thing,
but okay, that's what your algorithm is doing.
The algorithms on these platforms
are just a version of this bot,
except instead of talking to us,
it's determining what are the things that will win engagement
and then subtly nudging us towards that.
So it was this kind of look behind the curtain,
so to speak, at what these systems are absorbing
and what's actually out there on the platforms
and what direction they point us in.
It's pretty ugly.
Yeah.
Well, in this archeology of radicalization,
you kind of create this timeline of how this came to be.
And then today how pernicious it is.
And I can't help but think back to what Kevin Roose did
with Rabbit Hole, and I had him on the podcast
talking about that.
It feels quaint.
That was during COVID so we did that virtually,
but I was riveted by that podcast series.
I told him, you should be doing this every year.
Like these stories never end
and it's becoming more and more intense obviously.
But that feels like very quaint now
compared to like the way that you describe it.
And it feels like this funnel,
it's really a funnel situation that, you know,
begins in places like 4chan and then 8chan,
filters to Reddit, then Reddit filters onto Facebook.
And maybe it bifurcates at that point
between Facebook and YouTube,
and then obviously spills into the real world.
But a real early flashpoint in the understanding
of this radicalization process begins with Gamergate.
Right, so Gamergate is something
that many of us have probably forgotten about.
And if we do remember, we probably remember it
as a kind of fringe internet weirdo thing.
But I think it's important to think about
both because it seeds internet culture as we know it,
and especially the kind of alt-right troll,
Pepe the Frog internet culture.
And also because it becomes the model
for so many patterns that are to follow after that.
And basically how it starts is that gamers
are one of the first big early adopter groups on social networks,
which is why this happens first with gamers.
It's why Gamergate comes before the alt-right.
It comes before anti-vaxxers, these other things.
And a lot of them are young white guys who maybe spend a lot of time at home,
maybe feel a little bit lonely,
which is why they play video games or not.
I shouldn't say it's why they play video games.
In some cases, it might be why it is more attractive
as an identity than other forms of identity.
And something that the systems figured out really early on.
And I talked to a guy at YouTube
who worked inside the company on the algorithms
and identified this and tried to, you know,
set off a siren basically to say,
this is really dangerous.
Something that the systems picked up
is that you could show them gaming videos
and maybe they watch for 10 minutes.
And if you showed them something
that spoke to their sense of identity
and presented it as under siege,
they would watch for much, much longer.
So anything that said gamers under attack, and that happened to dovetail very effectively with
the sense of, I don't know if confusion is the right word, ennui maybe, that a lot of young men
feel about just looking for their place in the world, which is perfectly normal, of course.
But again, what YouTube identified is that if you are a 13-year-old
white guy, something that will really powerfully speak to you is a sense that your identity is
under threat as a 13-year-old white guy. So the villain that these algorithms came up with
was feminists and so-called social justice warriors. And it constructed partly on YouTube,
partly on 4chan happening somewhat more
organically, but by basically the same mechanisms, this conspiracy that sounds ridiculous when you
say it out loud, which is that video game developers and video game journalists were in
league to suppress the white male spirit by engineering masculinity out of video games and imbuing them with feminist and LGBT characters and ideas
which is crazy, right?
It's nuts.
But these systems are really effective
at gradually nudging people into ideas.
And in fact, that's their favorite way.
That's a key point.
Right. Yeah.
And it's their favorite way to nudge people in
because if they just show you an extremist video,
maybe you watch for 20 minutes.
But if they show you 10 videos
that gradually bring you up to the idea,
then maybe you're watching for four hours
and that's all they want.
They just want you watching more, that's it.
So the intent isn't to gradually radicalize you.
The intent is just to keep you on the platform
for as long as possible.
And these algorithms had discovered
that the best way to do that
is to nudge you in a certain direction.
And as that tap kind of slowly broadens,
that's gonna keep people engaged in the best way.
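A deliberately oversimplified Python sketch of that dynamic. The "intensity" scale, the watch-time curve, and the habituation rule are all invented assumptions; the only idea taken from the conversation is that a greedy watch-time maximizer, with no ideological goal at all, ends up nudging the viewer one step at a time.

```python
def expected_watch_minutes(video_intensity: float, user_level: float) -> float:
    """Toy model: viewers watch longest when a video is slightly more
    intense than what they are already used to."""
    return max(0.0, 10.0 - abs(video_intensity - (user_level + 1.0)))

def recommend(user_level: float, catalog: list[float]) -> float:
    """Greedy engagement maximizer: pick the video with the highest
    predicted watch time. Radicalization is nowhere in the objective."""
    return max(catalog, key=lambda v: expected_watch_minutes(v, user_level))

catalog = [float(i) for i in range(11)]  # intensity 0 (mild) .. 10 (extreme)
user_level = 0.0
for step in range(6):
    pick = recommend(user_level, catalog)
    print(f"step {step}: recommended intensity {pick}")
    user_level = pick  # the viewer habituates to whatever they just watched
```

No single recommendation looks extreme, and nothing in the code mentions ideology; the endpoint is just where maximizing the next watch session happens to lead.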
Right, people who research extremism,
terrorism recruitment,
call that the crisis solution construct.
And the idea is that if you are a recruiter for,
let's say ISIS, you go to someone
and you say, I can see that you were in crisis because you're depressed because you don't have
a job because you're not sure of your place in the world. That crisis is not actually an
individual one. It's one that afflicts our entire community of, if it's ISIS, you know,
you would say Muslims, if it's Gamergate, you would say young white male gamers.
It's a collective fight, but don't worry
because I have the solution to your crisis,
which is that we are going to come together
and collectively fight the enemy,
whether it's Christians, feminists,
who, I mean, it doesn't really matter who it is.
It's just an in-group out-group thing,
the aggrieved group, and then the solution
to solving the aggrievement.
Exactly, right.
And terrorist recruiters love this
because it is something that radicalizes people into action
by turning this idea of this all consuming struggle
into their entire identity.
And algorithms love it because that community
becomes something that can be super engaging
because it makes you wanna spend more and more and more time
on the platform, engaging with the ideas,
fighting this
battle. In the case of Gamergate, the way that it was fought was just a campaign of horrific
harassment against pretty much every woman involved in the video game industry or video
game journalism. I interviewed this guy who was a kid who had a hearing problem, so he had a stutter.
So he spent a lot of time online because that was the place where he was comfortable
connecting to other people,
but he was on 4chan and Reddit when this was happening.
And he just got swept up,
not because he innately hates women
or because he's innately a monster,
but because it provided the most supercharged sense
of community and togetherness
that these platforms could possibly deliver.
Purpose.
Purpose, yeah, exactly.
Yeah.
So Gamergate happens and everything you need to know
about what's happening now is kind of packed
into that story.
But at the time it was sort of considered
a fringe internet thing.
Like this isn't the real world.
Like, yeah, that's interesting that that happened,
but you know, we're all living over here
and that's more of a curiosity than anything else, right?
And it takes a long time, despite many Cassandras
like sounding the alarm bell saying,
we should pay attention to this
because this is only escalating
and there are gonna be bigger
and larger ramifications to this.
It wasn't really given proper consideration at the time.
I didn't, I didn't take it seriously.
And then, you know, even things like Pizzagate and the inception of QAnon and all of that,
like it's just weird, right?
Like, well, normal people aren't, you know,
thinking that this pizza parlor is, you know,
really secretly a place where human trafficking,
it's just like, you'd have to be kind of crazy
to believe that.
So it's easy to dismiss,
but really it is the roots of any other kind of
radicalization funnel that we're seeing,
like the principles remain the same.
Right, and the platforms are really good,
and this is not something that people at the companies
are doing consciously, at identifying those conspiracies
or crazy ideas that are going to be the most
engaging, whether it's Gamergate, whether it's Pizzagate, and pushing them out to as many people
as possible and creating a larger community around them. One of the examples that I think about a lot
from YouTube, which is actually quite similar to Gamergate, but it's a version that I think
certainly felt much more familiar to me,
is if you, for a long time,
if you search depression on YouTube,
you get a Jordan Peterson video,
and he is someone who will reframe your depression
as part of this larger struggle.
And he doesn't say, therefore you should join the alt-right.
That's not, you know, to be fair to him,
he says that it's, you know, social justice warriors
or feminism that are suppressing the
masculine spirit. But YouTube's system has learned how to make this connection that if you were
depressed or if you're interested in learning about depression, this is a video that you watch.
It's very long because it has this crisis solution construct. And once it shows you that,
it can then show you a video that is slightly more alt-right. And then one that goes further
and further right from that. And it will link them all together
and create this community,
this kind of downward funnel of extremism
that even the people who are making the videos
aren't conscious of and are not deliberately engendering.
Right, they're just independently making their thing,
but it's one puzzle piece
in this larger algorithmic picture
that leads people in a certain one way direction
for the most part, right?
And when I think about this,
whether it's ISIS or incels,
the inception point for almost all of these problems
can be pinpointed to the disenfranchised young male
amidst a culture of,
you know, widening income disparity and lack of opportunity
and perhaps a dearth of strong, healthy male mentors
or guidance in their lives who become susceptible
to influence, particularly strong male,
sometimes authoritarian influence, or just strong males who are like, get your shit together,
like make your bed Jordan Peterson style or whatever it is,
who like kind of need that voice in their life, right?
And there are healthy versions of that.
And there are unhealthy versions of that.
And maybe one that's, you know,
slightly bending in a certain direction
is gonna tilt that algorithmic arc
into an unhealthy destination ultimately.
But what is your response to the argument
that this is a mental health problem
as much as anything else?
Like we have to solve the depression situation
and from an economic perspective,
we have to create more opportunity for these young males
so that they don't end up so vulnerable
to these information silos
and get radicalized in that way.
Obviously, it's a huge problem.
It isn't one thing or the other,
but this is what's fueling so many of the problems
that we're seeing kind of metastasize in the real world.
I think that that is absolutely true.
I think that it is absolutely not something that you can pin entirely on social media. And the vulnerability that people feel because of a loss of control, a loss of autonomy in society, I feel like that is something that I see over and over in the people who get pulled furthest down this rabbit hole. And that sense of needing a way to feel like you have some agency in the world is not a need that social media created. Social media is very good at exploiting it,
but you're right that as long as that is there, there's going to be some mechanism,
whether it's a social media algorithm or something else that finds a way to exploit that,
to turn it towards some other nefarious end. So I think
there are a lot of vulnerabilities that social media exploits, but does not create. And in some
ways, the answer is to think about those vulnerabilities and not just the platforms.
Yeah. I mean, that has to get addressed in tandem with the other problem that we're talking about. But it's like, it's amazing.
You have vast communities of somewhat wayward people
who are extremely online
and are looking for connection and belonging, right?
And a lot of these communities,
whether they're on 4chan or Reddit or Facebook groups
or what have you, are serving that purpose.
They're making people connected to other humans
through some sense of shared agreement
or a sense of identity, defining who they are
in opposition to maybe a more traditional in-group.
Yeah, yes.
And something that you hear a lot
when you talk to people who work in de-radicalization,
whether it's online or not online, is that often the answer, even though it sounds pat, it sounds simplistic, is just to help someone find a new sense of community and connection.
And a lot of the stories end with, oh, he was in an extremist group, but then he got a girlfriend.
And it sounds like it couldn't possibly be that simple or it couldn't possibly
be that superficial. And it does, when it's weighed up against a great political cause,
you say, well, how could you de-radicalize someone from believing in literal race genocide
by getting a partner, getting a romantic partner? But the need that is driving it is ultimately
rarely political and it is usually personal in that sense of connection.
Someone finding a church or finding another sense of community.
I mean, the thing that I did to kind of unplug myself
from social media and to, you know,
I don't think I was radicalized,
but we're all micro radicalized by social media.
The thing that I did to de-micro-radicalize myself was just find group chats and just set up a like Slack of friend groups that I've used to replace social media, and just spend more time with people in person, which sounds so pat. It sounds like it couldn't possibly be the answer,
but I think it really is.
Yeah, so you changed your media.
Exactly.
Diet habits.
Right.
So what changes did you make?
So the answer that you usually will hear people say,
I was just listening to an interview with someone
who was talking about this and she was saying,
well, spend some time on social media
and then spend some time reading traditional news outlets
and try to determine which one
is giving you better information.
And that made me laugh because we know the answer
and also that's not how people work.
Yeah. You know, that's not how people work.
It's not why people are spending more time on social media.
I think the replacement that I made and that I would really encourage people to make
is to identify the thing that social media is giving you that you're finding addictive: the feeling of validation, the feeling of connection, the feeling of having a community whose support maybe you're getting or not getting, but you were definitely chasing.
And if you can replace that with a group of runners
or a group that you go cycling with,
or what I did and just had a big like group chat
that I check in with 30 times a day
and instead of checking Facebook and Twitter 30 times a day,
it will supplant the need that is driving you
to the social platforms.
And I think that will take care of a lot
of the downstream
effects of how it's changing your politics,
how it's changing your emotional valence.
Yeah, so there's that piece, that feeling of connection
and the external validation that comes with social media use
but sort of related but separate is the issue
of validating your information feed, right?
There's misinformation, there's disinformation,
there's propaganda, there's traditional news sources.
And more and more people are consuming their news online.
They're not going offline to actually get a physical newspaper, and less and less are they watching cable news.
It's more about like what shows up in their feed, right?
And with these new, you know, quote unquote news sources, some of them legit and some of ill repute, you have to become a lot more savvy and sophisticated
to discern the difference between what is vetted
and legitimate versus something that is not, right?
And I don't see that problem getting any easier to solve.
I mean, I think, especially young users
are more sophisticated in certain ways
of they're more internet literate,
but I see this problem becoming more problematic,
especially as technology advances
and we're gonna see deep fakes
and all this other kind of stuff.
It's gonna get super crazy; what we have right now is gonna feel quaint compared to where it's headed.
And I don't see a lot of discussion
about how we're gonna manage that
or what we're doing right now to kind of vet news sources.
Right, so this is, I think, one of my most kind of categorical takes on social media and how to deal with it: you are, full stop, not going to be accurately informed on social media.
And it's not just, are you following the right accounts? Are you careful about which news sources you read?
Because you could do that, but the way that things are framed on social media is still going to be consistently misleading.
And it's more fundamental than that too.
The way that you consume information
completely flips and changes
when it's in a social context
than when you're just reading an article.
There's this really fascinating study
that I write about in the book
where a bunch of researchers
got some Republicans, conservative voters,
and they showed them a, what was it? It was a fake
headline about Central American refugees showing up at the border. It said something outrageous,
like all the refugees are actually criminals affiliated with ISIS. It's something clearly false.
And if they would ask people, even very serious, committed Republicans, whether or not they
considered the headline to be true,
just the headline, do you think it's true? The overwhelming majority would say, no, that's not
true. And then they would ask them, do you want to share that on Facebook? And they would say,
no, I don't want to share it on Facebook because it's not true. But if they showed another set of
Republicans, conservatives, the same headline, but they dressed it up as a Facebook post.
And if the first question they asked was, do you want to share that on Facebook? The overwhelming majority would say,
yes. And then if they asked, is that true? They would say, yes, that's true because I just shared
it on Facebook. So when we are in a social context, we're not judging for accuracy. We might
think we are, we might be trying to, we might have set up the right outlets to follow, but that's not what we're looking for. We're looking for identity.
We're looking for validation.
We're looking for things that we think other people
in our community wanna see.
So I think my advice for individuals,
which doesn't solve the problem systemically, I know,
is just don't read headlines or news in a social context.
Just don't do it.
Good luck with that.
I know.
That's like, you're trying to climb the waterfall
with that on some level.
And I think an added kind of ripple or nuance
to what you just shared is it's not just the posts
that you see on Facebook, it's who shared the post, right?
How they shared it.
Who is that person?
Do you identify with them?
And then this terrain that you get into in the book about,
what are they called?
Super shares?
Super posters.
Super posters, right?
Who have like this air of legitimacy, like,
oh, I wanna align with that person, I believe.
Like they're credible in other areas.
So if they're sharing this,
I don't need to read the article, I can read the headline.
And because I feel aligned with them,
or I wanna be aligned with that person,
I will just reflexively reshare it.
Right, can I tell you about the school bullying thing
that kind of led me to that?
Cause this unlocked so much for me
and I feel like helped me really understand
how social media can change your sense of morality
and right or wrong.
So I was working on a story that was related to this research
showing that areas with higher Facebook usage saw more racist attacks on refugees, basically.
And I was trying to figure out, could that be true?
And could social media actually change your sense of right and wrong, which is something that I think a lot of us suspected.
But how can you actually demonstrate that?
And I went to this woman named Betsy Levy Paluck, who's a researcher, I think, at Princeton.
And she's got a genius grant and
she's a very smart lady. And I said, how would you try to go about figuring this out? And she
had worked a lot in like Rwanda and with génocidaires, which I thought she would want to talk
to me about. But she said, actually, the biggest comparable case here is school bullying. This
really interesting study that she did where she went to a bunch of high schools and she had this
theory that people determine whether or not they think
bullying is okay to do, not based on any internal moral compass or not based on what their teachers
tell them, but by what they think their peers collectively believe. But you can't put out a
poll if you're a kid to know what all of your peers believe. So she said that she thinks the
shorthand people were using was trying to spot the few people in their community
who seem to be the most influential and the most visible,
judge what they think about bullying,
and then they would mirror their internal beliefs,
their sense of morality, right and wrong off of that.
And she tested this by identifying the like 20 kids
out of like every thousand, a really small number who seemed to be the most visible and who had been, I guess you would say supportive of bullying.
Maybe they were bullying people on Instagram, online, and got them to just stand up and say to their community in some context, I actually think bullying is wrong and I don't think that you should do it. And just doing this, just having the influential kind of norm setters
in the community change what they were conveying out
to the rest of the community,
completely flipped everyone in the school.
And not only was there much less bullying after that,
but the surveys that she would take of the students found
that they internally had come to think
that bullying was wrong.
So this is this idea of social referents, that we derive our norms from a few influential people that we use to infer the whole community. And on social media, the way that those social referents get picked out is the platforms designate a few people who they believe,
their systems believe are going to be very engaging to us. These are super posters.
And it puts those people in front of us over and over again. I bet if you opened up your Twitter
feed, the algorithm would show a few people on the top over and over again. 100%.
Because those are the people who you engage with, right? I mean, the same thing happens on my feed.
And it's a few people who the system has just learned. I look at their tweets a little bit
longer. Maybe I'm likelier to like it or reply to it.
These are people who I pay attention to.
And it's going to surface posts from those people and those individuals who deliver the type of content that is most engaging, which means the most outraged, which means the most kind of us versus them, the most tribal.
And these are also people who have been studied a little bit in the last two years
and kind of their psychological temperament.
And it's a little concerning.
They're basically bullies themselves.
They have a few tendencies.
Grandiose narcissism is one of them,
which is associated with insecurity.
And that's why these people use social media a lot
because they really need the validation.
Something called negative social potency,
which means someone who derives pleasure
from seeing other people in pain.
This is also something that plays really well
on social media
because if you're someone who goes on
and yells at like,
hey, look at this idiot,
look at what this person said,
that's something that will drive a lot of engagement.
So these are the people who offline are not influential.
And I went and interviewed a few of these people
who were elevated by the platforms
and they're not basement dwellers.
They're just like regular people
who happen to be really hyperactive
and a little bit mean on social media,
but they set the norms for these entire communities.
Yeah, and that's something that is true
on the right and the left.
Absolutely.
If you're looking at the political sphere,
there is similar like strategy or tactics that are at play.
Like there's schadenfreude at like pointing out
like the misstep of the out-group, you know,
the party that they're making fun of, et cetera.
Sometimes it's mean-spirited, sometimes it's comedy.
It's a ferocity of volume, like it's lots of posts, right?
And it's interesting, no matter what silo you go into,
it's those are the people that rise to the top
and that the algorithm is gonna elevate.
And they do, they set the tone, like,
oh, this is what's acceptable in this discourse
to be part of this in-group
and to signal your membership in good standing
by resharing it or doing your version of that post.
Right, and these are people who in the offline world,
if they're especially noisy or kind of obnoxious
would be maybe shunned
or maybe would not be that influential,
but it is behavior that serves the algorithms rather than the social norms. So it's what gets elevated. And it's something that happens not just in political
spheres, but one of the big cases that I talk about in the book is this happening in basically
mom blog groups and where this leads over time, where you have these forums of young parents who
are sharing information.
And this was, like gamers, one of the first really big communities to migrate online,
because you have a lot of Gen X moms who are, you know,
trying to figure out this parenting thing.
It seems hard.
And the people who would rise in that community were the people who were conspiratorial,
who had the kind of sharpest criticism of doctors or of government policy around childcare.
And where this ultimately ended up leading over time
was towards anti-vaxxers,
which grew out of these mom Facebook groups,
mom YouTube channels,
because it was the endpoint
of all of that kind of negative social potency
and this tilt towards conspiracy and conspiracism
and extremism that would get promoted on the pages. Right, and Renée DiResta being kind of the tip of the spear in rooting
that out and helping people to really understand how that transpired. Yeah. I was very happy to
talk to her a lot in the book. And she is someone else who started as a real true believer. I mean,
she was an investor in Silicon Valley. She was going to the conferences. She was involved with some of these companies
and she started to see this kind of interesting parallel.
On the one hand, she was online
and she would see what the hell is going on.
These Facebook groups all of a sudden
have crazy anti-vaccine sentiment
that is a complete minority offline,
but is totally dominant online,
which she later figured out
was because it just performs so well.
And at the same time,
she's in Silicon Valley and she's going to these investment conferences and she's saying, well,
why do these companies that have no revenue model, that have no way to bring in any money,
but just say, oh, I'm going to get a billion users and then sell ads against it. Why are they
getting all of the investment money? So it was funny to talk to her, to see these kind of two things converging in one person and where it ended up. Right. And of course, similarly, this gets played out in all kinds of other issues.
You talk about, you know, the anti-immigration situation in Germany or Austria.
Germany, yeah.
Yeah, where, you know, there's this sense if you're online that there is a tremendous amount of acrimony,
you know, towards these communities of people
that doesn't really represent accurately
the true sentiment of the greater population,
but ultimately ends up driving a lot of really bad behavior.
Yeah, it was wild going to these towns
and this is a story I did for the Times at the time
and it's a chapter in the book
where everything in the town would seem great.
Like people were really friendly towards them. There would be big refugee populations
because this is like 2015, 2016,
when a lot of refugees had just come from Syria,
Iraq and Afghanistan.
So there's a lot of resettlement happening in small towns.
And if you went into the city streets,
people were really nice to them.
They would be helping them to get groceries.
You would go to these community centers that were really well-funded. But then if you would go online or
talk to people who would go online, it was just completely flipped. And it was just this racist
hatred and vitriol. And I thought what was so striking about that is it was a reminder of,
despite what Mark Zuckerberg might say, the platforms are not actually neutral mirrors of what is already in the community.
Or maybe it picks out something that is there,
but it's kind of small and it's on the fringes
and it blasts it out and widens it out
because online, a social norm that says refugees are great and we accept them is not going to win that much engagement.
But something that says you have something to fear
in your community.
These new neighbors are really scary
and you should be really upset about them.
It's going to surface every time.
Sure, and that's not just opinion
because it was being fueled by fake news stories
about rapes and crimes and all that kind of stuff
that actually wasn't factually correct.
Right, yes, it was a lot of misinformation
that would go viral because it would kind of hook in
to the sense of fear and the sense of identity threat.
And it would lead to violence.
There were a few cases that I talk about in the book
of people who were fine
and then would start spending a lot of time online,
maybe weren't even political
and would start to drift towards these ideas
because they felt online like it's the norm
and it's accepted and this is how everybody feels.
And they're doing, they're serving the greater good
because they're solving this problem
on behalf of everybody,
which goes to that group identity piece.
And, you know, suddenly having a purpose, you know,
some meaning in your life to redress
what they legitimately feel to be a harm.
Right, it's actualizing, it gives you an identity,
something to hold on to.
And if you talk to people who kind of fall down that rabbit hole,
they're often legitimately surprised
that their real actual real life community
is not gonna greet them as champions and heroes
for what they're doing.
And when you become so invested in that,
it becomes very difficult to disabuse people of it,
no matter what facts you put in front of them.
Like you see this with QAnon,
there's still people holding onto this idea, right?
Yeah, right.
With QAnon, it's sad because it's,
I mean, for a lot of reasons, obviously,
but a lot of them really are just lonely.
Well, that was the big piece there,
was the community piece and the gamification of all of it
that made it this mass group activity
where they could get together
and try to solve these puzzles.
Right, right.
It gives you a purpose.
It helps you make sense of the world
and it helps you understand
if something bad had happened to you.
Maybe you've lost your home, you've been displaced,
you're alienated from a family member.
It's all part of this larger
struggle where you can find all of these other people to help you with it. But it started with,
this is like the one, I tried not to hold back any reporting for the book, but there was one thing
that I just didn't get a chance to put into an article that I put into the book, which is that
the origin of QAnon is, of course, the Pizzagate conspiracy theory, which, and again, it sounds so crazy
when you say it out loud.
It's the idea that democratic elites
are harvesting the blood
and the organs of children.
It's so stupid.
Adrenochrome.
Adrenochrome, right.
Yeah, exactly.
And that was something that had,
that rumor had actually spread spontaneously
in about a half dozen different
countries before it ever came to the United States as Pizzagate. And, just coincidentally,
I was reporting on social media extremism in Indonesia, which is this giant Southeast Asian
country where social media is a very powerful force. And this rumor that was like identical
to Pizzagate had spread there before it came to the US.
It had started out on this like very obscure account where someone just, I'm sure just made
up this rumor. They're like, oh, the elites are coming to steal our kids and our babies' blood and organs. And the Facebook algorithm just identified it as something that was going to
be tenaciously powerful and started spreading it and went viral on the platform.
So other people picked it up and started sharing it.
And within a couple of weeks of this,
seven different villages in different parts of the country,
all rose up in spontaneous mob violence
and lynched some guy who just happened
to be traveling through all at the same time,
which I had never heard of anything like that happening.
And there's no link or shared DNA
with the Pizzagate origin story.
This just happened completely independently.
That is kind of what's crazy about it is it's this,
it popped up there, it popped up in Mexico,
popped up in Guatemala.
I think it was a similar version in Nigeria.
So there's something about this specific conspiracy that just-
Locks into people.
Locks into people. Yeah. And these algorithms, which are really smart and because they have
the largest data sets in human history of what will tap into our minds and impulses and what
won't, had just figured out that this is something that is gonna spread, and spread it one time after another until it became QAnon.
Well, there was also a gamification of the Pizzagate thing
because it involved this dump of emails, right?
That then ended up on 4chan
and all these 4chan sleuths are going through them
and kind of coming up with coded messages
about what means what and anagrams and all that kind of stuff that led to this grander theory
that gets played out when that guy actually shows up
at Comet Ping Pong, what's it called?
Comet Ping Pong Pizza with his AR-15
and what went down, went down.
He bursts into what he thinks is gonna be the door to the basement
to discover that it's a closet after all.
What always blows my mind about the way that story ends
is he shoots into the closet,
which he thinks is the Pizzagate child-harvesting dungeon, turns out it's a computer room.
And so he just puts down his gun and surrenders, that's it.
He just, he knows that it has led nowhere,
but the conspiracy is still out there and still spreading,
even though he is kind of realized
that there's nothing actually to it.
Right.
And the platforms, of course, could not care less.
Right.
All right, so we have a sense of the problem, right?
In your reporting on this and all the work
that you've done and the people that you've spoken to,
what is the thing that most shocked you
or surprised you about the extent of this problem?
The thing that most shocked me,
I think, was honestly learning that using the platforms, it was what we talked about, influences your own sense
of right and wrong, even when you're not on them.
And it was this sense that it's not just the Pizzagate guy,
it's not just the QAnon people,
that it's all of us who are being distorted by it.
And when I started to learn about that,
I really scaled back my social media usage.
I turned off a lot of the features
and I really felt the change.
I felt the change in my mood.
I felt the change in how I think about politics
and the news that happens.
I don't read it through social media.
I just read it through regular old websites or podcasts.
And I think that maybe that just shook me the most
because it implicated me. You know, probably objectively, I should be most upset about the
fact that Facebook played a major role in a genocide that expelled a huge population of
Myanmar from the country. But it's hard not to come out of this thinking that I was affected
too without realizing it. And that if that is true for billions of people,
including like 80% of Americans,
then the effect must be pretty profound.
Yeah, this mass unprecedented experiment
that is being performed on all of us in real time.
Right. Without precedent.
I mean, it's almost,
I'm gonna sound like that guy from Dr. Strangelove.
It's almost this like
pulp B sci-fi movie where they're putting drugs
in the water and nobody realizes it
and we're all taking it without realizing it.
I mean, that's more true than not.
Right.
So when we try to understand
what the possible solution to this is,
it very quickly becomes incredibly complex and tricky. And it seems to
my mind that this battle is being fought on several fronts. On the one hand, we have this
argument about free speech, right? Free speech being a core fundamental premise of all of these
platforms that you track all the way back to the inception of Silicon Valley
and the sensibility of these founders and these engineers.
So we have the free speech piece.
We have this piece about how everything can be solved
through engineering, which brings up the algorithm.
We just need a better algorithm to solve this.
And then the third piece to my mind is the reluctance of the people who are running these
companies to at least publicly acknowledge the problem or the extent of the problem and to get
into action in terms of solving it. Right. It's tough. It's really tough when you think about
the corporate incentives, because at some point,
it's like asking cigarette companies to say that.
They're antithetical to each other, yeah.
Right, right.
And it just, the way that our system and our economy works
is you're asking these giant companies,
or we're asking these giant companies
to disavow the thing that makes the money,
to say that not only are cigarettes addictive,
but you shouldn't smoke them,
or not only is pumping oil bad for the environment,
but in fact, we should probably radically downgrade
the amount of energy that we use.
This is a question of,
I think it's more helpful to think about
where you want to end up
and then try to figure out how you get there
than the other way around.
Because when you start with how do we get there,
it's easy to say, we'll just tweak the algorithm.
That was the solution after the 2016 election
is they said, well, we'll get better engineers to come in
and have even more sophisticated algorithms,
which of course just made the problem fundamentally worse.
And I would ask people who study this,
including a lot of people who are still in Silicon Valley
and are true believers, what's the place
that we should try to aim for to end up?
It would always be some version of turn it off,
not turn off the entire thing
because social media does provide a lot of good
that we don't wanna give up,
but turn off the engagement maximizing features,
turn off algorithms, turn off the,
even Jack Dorsey, the former head of Twitter,
he for a while was saying,
maybe having a little like counter
at the bottom of a tweet
is incredibly destructive to our society and politics,
and maybe we should turn that off.
And there was-
And later, I mean, he was kind of late to come around,
but in the final stages of his tenure
and the twilight of his reign,
he had some real epiphanies
that ultimately were not made real.
Right, right.
And there was a period where he was pushing against the
financial incentives of his own company and all the other companies were growing by leaps and
bounds and his wasn't because he was kind of constraining its growth because he thought that
it wasn't healthy. There is a version of social media that like any platform of any kind can have
some negative effects to it. But the pre-2006, pre-newsfeed, pre-likes social media,
it didn't have a lot of these harms.
And it was something that contained a lot of the good
that we hope for and like and appreciate from social media
without these distorting effects
and without this kind of addictiveness
and the changes to our behavior
because they didn't have engagement maximizing features. So it's perfectly possible to have that. Then it just becomes a question of
how do you actually bring that about? Yeah. I mean, on the incentives piece,
to my mind, the genie's out of the bottle and the incentives are driven by engagement and the
purpose of the engagement is ad revenue, right. So it's an ad supported model.
And this is predominantly the reason that these platforms
are built the way that they are built.
So one solution without eradicating capitalism altogether,
which is, if we have a revolution and we become,
our government gets overthrown,
that would change things of course,
but assuming that that's not happening, well, who knows?
I mean, maybe we are on the tip of Civil War Revolution.
I don't know.
That'd be two episodes of Abolish Capitalism too.
But one thing that could accomplish this
would be to sidestep the ad model
and go to a subscription model.
Like if Twitter became subscription only
and everybody had to pay two bucks or nine bucks
or whatever it is,
that would have a revolutionary impact on it.
I mean, Sam Harris had said that Jack Dorsey would have won the Nobel Peace Prize if, when he stepped down, he had just turned off the switch, right?
Like that's probably not gonna happen,
but I'm not the first to suggest that a subscription model
would create a healthier ecosystem.
Yeah, I think that's right.
And I myself have switched from,
in the media companies that I've worked for,
from one that was primarily advertising driven
to one that was subscription driven.
And it's not the same scale.
And also there is a lot of human input on media
in the way that there's not with social media.
But I felt the difference in incentives for sure.
When you're just trying to say,
I wanna provide a product
that people will find meaningful enough
to hit subscribe every month,
rather than how can I hook as many people
for as long as possible.
And it doesn't mean that you have to eradicate
the algorithm because when you look at,
like look at Netflix.
Netflix is totally algorithm driven.
But I don't feel like I'm being, you know, manipulated.
Like I feel like when I watch certain things,
then that algorithm does a pretty good job
of showing me what I might like to watch next.
And the reason for that algorithm
is to drive up my engagement.
Like obviously they wanna keep me on Netflix
for as long as possible,
but it doesn't have the same pull.
Like I feel like I'm being, maybe I'm being manipulated and I don't know, I'm sure I am, right?
But it doesn't feel pernicious
in the way that the other platforms do
because it is a subscription model
and it's not trying to, you know,
I don't feel like it's,
I don't know what I'm trying to say.
Like, is it trying to radicalize me?
And I don't know, you know, like, I don't know,
but it just, it feels a little bit more benign.
No, I agree. I agree.
I mean, it's, I think that it's a difference in,
if they were really just trying to maximize your watch time,
they would be showing you soft core porn
and car wrecks all the time.
Because that's the thing that would keep you watching
for long periods of time, but they're not,
they're trying to show you the gray man,
which is also not great, but for different reasons.
Oh, is that that?
That's that Ryan Gosling movie that was terrible.
Yeah, I had such high hopes for him.
He'll come back.
Yeah, that was bad.
Yeah, he's got another Damien Chazelle movie,
it will be okay.
But Spotify is the same way, where ultimately,
what they want is for you to be happy with the experience. And it's a very powerful, very sophisticated algorithm
that is trying to make you happy
with having a subscription to Spotify,
which they can achieve through things other than trying
to make you addicted and listen
to many, many hours of content.
Let's talk about the free speech piece.
Sure.
So with this, I mean, obviously this is the debate
that's raging across all of these platforms.
Like what are the parameters of free speech?
How much free speech should be allowed?
And with the maturation of these platforms,
we've seen a grappling with this issue in real time
and some legitimate movement from free speech absolutism
to recognizing that that's not gonna work (look at 4chan, look at 8chan, see what happens when you completely shirk any responsibility for managing that) to a real conundrum around what we should accept,
what we shouldn't and the differences between the platforms
and how that's managed. Right.
So you alluded to this, but the free speech absolutism has this kind of founding ideal in Silicon Valley that if you completely remove the gatekeepers, you completely remove the rules, which you should do because ideas should be completely free, then the best ideas will naturally surface. And this is actually something that comes very specifically out of engineering, where it's just whoever makes the best widget, they should be in
charge of making the widget, which makes perfect sense when you're making a semiconductor. But it
has never worked that way because what it always favors is, first of all, raw majoritarianism,
which we have learned can be a very destructive force, especially on these early social platforms like 4chan and Reddit,
but also that the best idea is not going to be the most engaging idea.
And so this is why we've seen over and over
that pure free speech leads to very intense
and extreme version of majoritarianism.
There's still a question.
Sorry to interrupt, but like just a thought,
if that premise was true,
then in the Reddit ecosystem, for example,
where you're upvoting the best comments,
that it would follow that the most insightful,
most intelligent comment would always be the most upvoted.
Right, and go on Reddit and I defy you to find that.
And what's ironic is that Reddit
has actually gotten much better
and the quality of the discussions
and the content on there has improved drastically
because they imposed gatekeepers.
And because they have people who work very hard
at moderating the conversations,
which is what we do in the real world,
is we have a sense of norms that we enforce formally
and informally to try to encourage what we think
is gonna be a constructive conversation.
So even Reddit, the ur-free-speech platform, has come around to this.
But the story that you tell around the CEO,
what was her name?
Pao.
Ellen Pao.
Pao, yeah.
Like her story was pretty illustrative
as being kind of at the forefront of that debate
and how she kind of got churned out.
Right, yeah, she was someone who earlier than most of us,
certainly earlier than I did,
saw where that raw majoritarianism was going. And she saw it partly because she's a woman of color in an industry that at that point,
especially, and still is overwhelmingly white and male. She also saw it because she'd been an
investor at Kleiner Perkins, this huge tech investment firm, and she had brought this discrimination lawsuit against them that had made her more
thoughtful about the ways that the Valley is not welcoming to people who are outside of the Valley's
majority. And she did try sometimes haltingly, sometimes very bravely to change the way
fundamentally that the platform and the technology worked to privilege healthy conversation
over lots and lots of engagement.
Like we talked about super posters.
She actually identified the super posters on Reddit
who were posting, yeah, toxic stuff.
And she got rid of them.
It was like, I forget the number, it was very small.
It was like a couple thousand users out of millions.
And it was this incredible change
where there's a study that some academics conducted
and they found that the amount of hate speech
overall on the site dropped substantially.
Like 80%, I think.
Was it 80%?
Something like that.
It was a crazy statistic.
It's a really high number.
Which just goes to show the power that these platforms have
to tilt an entire community one way or another,
even if we don't. But she got excoriated for it and ultimately fired. She did. She was lambasted as destroying free speech. She was a big target of the movement that at that point was kind of
burgeoning, Gamergate. She was harassed pretty severely and in ways that were very sexist and racist on the platform. And for that and other reasons she was quickly pushed out
of Reddit.
So that was an early experiment
that kind of went sideways, right?
But kind of a canary in the coal mine, right?
So here we are, and we still have the Zuckerbergs
out in front saying,
listen, we're not in the business of policing speech.
We may find this speech to be deplorable,
but we're not, these platforms are not about moderating that
or quality control on speech.
And yet they've been forced to reckon with this
because of the violence that we've seen play out,
which puts us in this really weird, murky place
where moderation being an imperfect science,
whether it's human driven or AI driven,
is going to have scattershot results
that are gonna ultimately make nobody happy.
Everybody's mad, right?
I'm being shadow banned, I'm being downvoted,
I post, nobody sees it.
It's a conspiracy of the left
to suppress conservative voices, vice versa.
All of this we're seeing, you know, being discussed,
you know, ad nauseum online.
Nobody seems to have a good grasp
on how to truly solve the problem.
Meanwhile, we're kind of spinning out of control
without any real solution in our sights.
Right, and where you draw the parameters of speech,
acceptable speech, unacceptable speech,
is in some ways an unsolvable problem.
And it's one that we've been continuously litigating
in this society and every society
for as long as we've had social norms in a society.
It's also, I think in some ways,
it's a little bit of a,
it's a red herring that the Silicon Valley companies
actually really like to draw attention to
because it's sympathetic and because when you isolate it,
it's like, well, should this be allowed
or should this be allowed?
Is this post okay?
Is that post okay?
The problem is not which posts are allowed and not allowed.
And this is something Renee DiResta has really emphasized.
It's what gets promoted and it's what gets amplified.
And if you had platforms that did not so reliably and consistently amplify the
things that were so harmful and so distorting, the free speech question just becomes a lot lower
stakes. The potential damage of making the wrong decision one way or another becomes a lot less
dangerous because if you leave something up, if you leave up a piece of hate speech that's getting
engagement from three people,
then it's not good for those three people who saw it,
but it's ultimately not going to be that damaging.
You may be crying fire in a theater,
but there's nobody in the theater to hear you.
That's a great way to put it actually.
I like that.
Yeah.
It's such an interesting thing.
Like this idea of the obligations of the platform.
I mean, free speech has to do
with government regulation of speech.
Not social regulation.
Social regulation defined how?
So if I were to use a racial slur on your podcast,
which I'm not gonna do, that's not illegal,
but I would face,
and you would probably face social consequences for that.
Oh, sure, okay, I understand what you mean.
Right, and so that's this idea of social sanction,
the way that we informally police one another's behavior
and social norms in ways that are not necessarily illegal.
You know, someone cuts you in line for the bus
and you kind of tut-tut them, that's social sanctions. Or if
someone, one of your friends joins a political group that you think is damaging for society,
and you say, well, I'm not going to be your friend, or I'm going to distance myself from
you to punish you for that, that's social sanctions. So it's this form of informal
regulation that we all participate in. And it's messy and it's imperfect and it gets stuff wrong.
But because it's something
that we evolved in over millions of years, we're reasonably good at it. And that is something that
you see on platforms that don't have these distorting algorithms, that don't have these
engagement-maximizing features, is that they don't naturally reset to a naturally healthy discourse.
But like this kind of more recent Reddit, after they've done a lot of work on this,
if you can think thoughtfully
about how you have responsible gatekeepers on the platform,
how you can have your social reference,
your super posters be people who are setting positive norms
instead of negative norms,
then the question of what the moderators have to remove,
that also becomes a lot lower stakes
because it's the community that is doing
a lot of the enforcement in terms of
if someone says something terrible on a platform
that's not Facebook, that is more legitimately neutral,
then the other people on the platform will say,
well, you know, you shouldn't say that
because that's a hateful thing to say.
Right, right.
I mean, a related matter to that
is the impact
that that socialization of speech can have
on somebody who transgresses it, right?
So this is the Justine Sacco example from So You've Been Publicly Shamed,
like the people who misstep sometimes with intent,
sometimes without intent,
who then suffer a disproportionate amount of, you know,
alienation and outgroup dissonance, you know,
or even outward hate as a result.
And this is something that is created by a social dynamic.
It's not driven by the guide posts
of the platforms themselves,
but ultimately is having a real pernicious effect culturally.
And I think a chilling effect on, you know,
people feeling like they should even say anything.
Like, what is, why should I?
Like, if I'm gonna suffer this kind of consequence,
who wants to go through that?
Like, I mean, nobody does, it's terrible.
And that deep sense of being cast out of the tribe
is a very primal, painful thing
that you talk about in the book.
Like, you experience it as physical pain.
Right, because we evolved in an environment
where you literally cannot survive
without the approval and support of your community.
We evolved in tribes of like 100, 150 people, where this was something that you really needed to maintain.
And if you lost it, if you upset the rest of your tribe, it could often be fatal. So that is why
if you're online and you're getting yelled at by 200 people, it feels really, really bad because your brain has evolved to experience that
as a mortal threat, akin to being stabbed with a spear,
akin to losing access to food.
But because of the way that social media works,
if you do something transgressive in real life,
maybe eight people who see it will say,
"'You shouldn't do that,'
and will kind of modulate their voice
based on what they think is appropriate
based on your transgression.
If you do it online, it might be 2000 people
or it might be 20,000 people who get really mad at you.
Probably not 20,000, but it'll be a really large number.
Or you're caught on video and then it goes viral
to millions of people.
Right, yeah, right.
Which sometimes it's a real transgression
and sometimes it is not a real transgression
because these things move so quickly on social media
and the incentives for each individual user
are to participate because participating
in a pile on someone, it feels really good.
It is physically pleasurable to do.
And it is something that you feel an urge to do
because you know it will win you the approval
of your community to say,
I'm one of the good ones too.
I share this sense of morality
and I share this sense of moral outrage
against this transgressor.
And sometimes that can be productive.
I mean, I talk about the Central Park incident
from summer 2020.
The bird watching guy and the dog walker.
Yeah, the dog walker, this woman, I'm sure people remember it, she had her dog off-leash.
There was a black guy who was a bird watcher
who said, please leash your dog.
And she called the police
and pretended to be in mortal danger,
which at that particular moment was something
that she knew carried the threat of the police coming
and killing this guy basically for the sin
of asking her to leash her dog.
And that is something that pre-social media
would not have gotten punished and therefore is activity
that would have been allowed to perpetuate
that's dangerous and bad.
And social media allowed for the social sanctioning of that
in a way that would not have been possible before
at a huge scale that sent a message,
not just to this woman,
but to other people who might be seeing it
and who might be one day tempted to do something similar
that you shouldn't do this
because you will pay a social cost for it.
So that's good.
Right, weaponizing your privilege to sic the police
on somebody who actually didn't do you any harm or wrong.
What I didn't know about that,
I mean, obviously I remember that when that happened.
What I didn't realize was that the gentleman,
the birdwatcher, they were both called Cooper.
He didn't even wanna make a thing about it.
It was his sister who shared the video.
Right, yeah, she shared it.
And it was funny reading interviews with him about it after the fact where he would say,
what she did, this woman, Amy Cooper was bad
and she was trying to unleash certain dark forces.
But he said, I don't know that her life
needed to be destroyed over this
because the punishment that got meted out
was just determined by the raw group dynamics
of the platforms.
Well, it was also timing because it happened to coincide
with the kind of outset of Black Lives Matter
and all of that.
So it was a flashpoint.
It was like a perfect storm of events
that met the criteria for what people were interested
in talking about.
Right, interested in talking about,
had legitimate real anger that they wanted to express. It just happened at this scale where she gave up her dog at one point, the shelter that she had adopted it from. It's unclear exactly how, like recalled the dog from her.
was a striking example to me because I think, I think we're all vaguely aware,
but it's tough to articulate sometimes
that even when this online social sanction
is towards something that is deserving
and might be used towards ends that are helpful
for society that it quickly, quickly can run out of control
because the scale and the incentives of the platform
just tilt towards these extremes.
And because it's something that happens a lot to people
who don't deserve it,
but they were caught in a photo or in a video in a way
that it looked like something bad was happening
or they were maliciously portrayed in a certain way
as having done something.
Like the Sandy Hook parents in the Alex Jones example.
And then you tell the story of Cecil the lion,
which was another big one, right?
Or the dentist who, you know, I mean, that was a whole quagmire shit show.
It was both, yeah. That, to me, felt like, looking back, I think that was the tipping point. In my mind, that was the last moment when it felt like just the internet. And if people don't remember, if you were online at the time, it was a really big thing. I think it
was early 2016, late 2015. An American guy, a dentist from, I think, St. Louis was in Zimbabwe.
He was a big game hunter and he shot a lion under circumstances that were in a legal gray area, but in an area near a park where it is legal to shoot lions.
Right, and the lion had been lured out of the perimeter of the park to where it was legal.
Exactly.
He was also a beloved lion who had a name
and there were a lot of people.
And listen, a big game hunter is the least sympathetic person
you're gonna come across.
Yeah, right, right.
It's hard to say.
And why are they always dentists?
Why are dentists always the big game hunter guys?
Dentists are more,
it's also always the dentist down at Mar-a-Lago.
I think this is kind of the secret like swing vote
in American life.
Next time you're at the dentist, have a talk with them
and say, you know, are you doing okay?
Right, so yeah, so this guy, this dentist shoots a lion.
This goes crazy viral on the internet.
He has to go into hiding basically.
Yes, he has to, as the kind of outrage cycles
grow and grow and grow first on Reddit
and then on Twitter and then on YouTube.
I think I probably retweeted
or reshared something about that at the time.
Because I mean, it's when you're one-
It is outrageous anyway.
Right. No, I agree, it is outrageous.
And when you're one individual,
what could be the harm of calling out something
that is outrageous and saying that that's what it is?
It's a kind of collective action problem
because for any one individual person,
I mean, except for the people who took it
to physical threats, I would argue,
it makes perfect sense and is maybe even the moral thing
to do, but when it gets amplified through these platforms
and at this insane scale of millions of people,
it becomes distorting, I think, out of scale
with what he actually deserved,
because he and his family have to flee their home,
his dentistry practice has to shut down,
which means the people he employs are out of work. He was in hiding for weeks because there were people who were spray painting
his home. There were people who were threatening to kill him. It was this like national hysteria,
like Jimmy Kimmel cried on air over the lion, which it's not to say that it's not sad,
but it felt to me, and especially feels looking back like this moment when the moral outrage
of the internet and the moral outrage incentives of the internet, because how many of these people
really cared about Cecil the lion before this happened? Like zero, right? It's something that
can be so infectious and it can spread to this insane scale that have these consequences that
are, first of all, beyond what anyone is planning. Because there's no central arbiter saying,
you know, this dentist deserves this punishment,
this punishment, but not that punishment.
It's just happening through collective action,
but also because we're all complicit in it.
So no one wants to say,
maybe we shouldn't threaten to kill this dentist,
which I think is partly why that incident
has been kind of forgotten.
I mean, the New York Times ran like 10 articles about it.
Yeah, and for every story like Cecil or the birdwatcher,
there's a million other ones, right?
So here we have this problem.
This problem's not going away.
The weaponization of speech, the policing of speech,
the mob unleashed, the doxing.
You know, there's clear cases where we can all,
maybe not all of us, but like reasonable minds would say,
this is probably not good.
Too far, too far.
Yeah, but where do we draw that line?
And once you get into drawing those lines,
it becomes infinitely problematic.
So this is the problem that all of these founders
are grappling with.
This is the problem that we're grappling with.
This is society.
I don't see it getting any easier to solve.
Like, where do you come down in terms of your thoughts
on the best way to manage and rectify
what is wildly spinning out of control?
I think that it's,
I think it starts with asking a slightly different question
or kind of reframing the question in our minds.
The kind of the version of the question of,
what's the amount of outrage that we allow
or what are the posts that we allow or don't allow?
It's a little bit like thinking,
where do we put all of this toxic sludge
that's coming out of this factory that's in our backyard? Should we put it in drums or should we put it in, you know, plastic containers
or should we dump them in the bay? Right. Yeah. Or go dump it in the bay. And another way to think
about it is, should the factory be producing this much toxic sludge? Is there a way that we can
maybe just dial things back in the factory so that it is not polluting our homes?
And I think that those questions get a lot easier to solve when you think of it less in terms of how do we manage the outputs of the system and more in terms of how do we get the system to stop pumping out the sludge?
And that's partly a high-level regulatory question, which there's more movement on than you might think.
There's a real chance that some of these platforms
might get kicked out of big parts of the EU.
And that's something that could really happen.
Wow, what gives you that sense?
The regulators there have imposed some pretty strict fines.
The social attitudes towards the platform
are a little bit different, especially in Germany,
which is a big deciding vote in a lot of EU matters. There is really a sense that this is
like this foreign thing that has come into our country is polluting it. So it feels easier to
kind of say we should just get rid of it. But I think if it does happen, the way it would happen
would be imposing rules that are impossible for the platforms to
follow without fundamentally changing their nature. And you've already started to see the
start of this where there's these German laws that Facebook is legally culpable for certain
kinds of speech if they don't preemptively remove it, which is not possible under their current
model where the posts go out and they review it after the fact. And then I think the choice will be this kind of game of chicken where the EU will say,
well, you can stay, but you have to completely re-engineer your platforms to meet the standards
that we set. And then these companies will say, well, Europe is a shrinking market for us anyway,
which is true because adoption rates are going down and population rates in Europe are pretty
static or declining and our future is in the developing world.
So we'll see you later.
Right, but that would also set up
a sort of cold war domino thing, right?
Like if Europe falls, what's next?
They've set a precedent.
My mind goes towards what happens when every country
sets up different laws and regulations and rules around how these entities
can operate in their territory,
which becomes impossible
because these are global entities, right?
So how are they gonna sort of partition
how they're managing it based upon geographical boundaries?
Right, and the companies have started to get
pretty aggressive about fighting back.
Do you remember this thing in Australia?
I think it was about a year ago.
The Australian government
was setting these new rules.
Oh God, what was it?
Oh, it was Facebook had to,
Facebook and Google both had to sign agreements
with major Australian media companies
to basically say, we Facebook or we Google
will give you a portion of ad revenue
because we are selling ads against your content.
Right, which was, there are differing views
on whether or not that was a good law.
It was promoted very heavily by News Corp,
which is a major media presence there.
But Facebook's response was different. Google said, okay, we'll do it.
And they cut deals with media companies
to pay them a bunch of money
for the rights to host their content.
Facebook said, okay, we won't host any news.
And they just flipped the switch,
which blew my mind for a couple of reasons.
And one was that I remember very clearly
three years before that,
in the middle of the genocide in Myanmar,
when the United Nations was screaming their heads off,
saying, Facebook, you are actively driving
this ongoing genocide,
and Facebook refused to turn off the platform.
They were getting asked by journalists,
just please, would you consider switching it off?
And they said, no, we won't do it.
But when someone targeted their ad revenue in Australia,
all of a sudden it's lights off.
And it had a real impact on the country
because these platforms have been so
successful at dominating how people consume news and get information. All of these domestic abuse
groups that were based on Facebook, like weather groups, couldn't get information out. And there
were other websites that people could obviously access to get the news. But of course, once news
publishers were blocked in Australia, what filled the void for people on there
who wanted to learn what was going on in the world
but rumor and misinformation?
Sure.
And so it was, I think, a telling incident:
there are going to be more fights like this
between governments and the major companies,
which we don't experience here as much because it's unthinkable.
How could we shut down our own, you know,
three or four largest companies in the country?
But in other countries, it's a much more live question.
It was interesting how Facebook got its hold
in some of these developing countries, right?
Where you describe the distribution of low-cost mobile devices
that came pre-programmed with a rudimentary Facebook app
already built into them, and deals that Facebook struck
with cell providers to basically give them free data plans
for a period of time.
And so that basically created a situation
where their only way of communicating with each other
and sourcing their information and news
was through Facebook.
Right, which they presented at the time
as this great gift to the world that we're going to go out
and pay for cell phone internet data.
It's like the drug dealer giving you the freebie.
Exactly, right.
And if you look at the way that they would talk
about it internally, it was all about owning the market. And this is what they did in
Myanmar. People have cell phones for the first time. People have internet for the first time.
We're going to make sure that they 100% access the internet through Facebook, so that when this
market matures and becomes worth something, they're already so inured to it. Right. And
they're completely addicted and we control it. And one of the many things that means
is that local publishers, let's say a newspaper
or a media company, can't possibly compete,
because it's free.
How can you compete with free?
And how can you compete when it's also on your phone
and it's so shiny?
And the displacement that happened was just devastating
to a lot of reporters, a lot of media companies,
a lot of publishers in these countries.
And if you talk to people in these countries,
they think that Facebook is the internet,
that little F button and what it pulls up,
they think that's what the internet is.
So there is really not as much of a standard
or a practice of, well, you have your Facebook,
but you also go load up the Myanmar Times website,
or you load up these other eight ways, or you load up your Gmail account to go email with
people. It's all through these one or two companies that control all the information and
discussion, which is kind of nuts. It's really about what sacrifices we're willing to make for
the sake of convenience, right? So when those phones come preloaded with Facebook, you can
probably go out and get other apps.
You're free to do what you want with that.
But when it's so convenient,
that's gonna create habit formation around that.
So, and this is playing out for all of us.
Like I had an experience just the other day.
I went for the first time into an Amazon Fresh.
Have you had this experience?
I've been into them once. Yeah.
So, I mean, for people that don't know, like, okay,
pull out your Amazon app on your phone
and hit the in-store locator thing.
It pulls up a QR code and you walk into the place
and you scan the QR code and then you just go
and put whatever you want in the bag and then you leave.
And then you look up at this,
there's a scaffolding,
just like we have up here in the studio.
And there's so many cameras that are looking at you.
There is nothing.
I mean, if you, you know, pulled
a hair out of your nose, it would notice it, right?
Like, and I had this strange sense of like,
I am being so hyper watched right now.
And you consented to it.
You chose that. I chose it.
This is the point that I'm trying to make.
So I went in, I was like,
I wanna see what this is like.
I went in and then I did it.
And it is weird to like, okay, I can leave now.
Like I think there's that Saturday Night Live skit
where Kenan, what's his name?
Kenan Thompson.
Yeah, he's like there and it's like,
he thinks he's gonna get arrested
because he's black.
He's like, no, you can leave.
I was like, what is happening?
And I thought that is incredible.
It's so unbelievably convenient,
but am I comfortable with that?
I don't know and I keep thinking about it.
Like this is something that sits right
at that friction point of convenience
and the sacrifice of privacy. Of course, Amazon is
collecting the information about what I bought and is then going to use that data to serve up
whatever it's going to serve up to me. Yeah, you're a pawn in this greater thing, right?
Right. And I'm generally skeptical of arguments that breaking up the
companies will improve things, because it does nothing to change
the underlying business model of maximizing engagement to sell ads. But one version
of the argument for breaking up the companies that I have heard that does make, I think,
some logical sense is that their enormous market power, just the billions and billions of dollars
they have and the fact that they control all of these different aspects
of how we relate to the outside world
and engage with the world around us
makes it very easy for them
to introduce something like this,
to say, well, now there's a store everywhere
and it's in every community
and there's all these cameras
and you normally would never say yes to it,
but we're gonna make it so easy
and maybe we're gonna give you
a little bit of a discount for going there.
So eventually you're going to acclimate to it.
And that is something, I mean, it's like Facebook
going into Myanmar for free and saying,
we're gonna lose money for 20 years
so that we can own it for the next hundred.
It is interesting.
I mean, it seems like such an easy lift to say
that WhatsApp and Instagram should be divorced from Facebook, right?
And to me, the fact that the Justice Department,
or the FTC, or the antitrust division
of the Justice Department, can't make that case,
or has not yet made that case,
feels feckless from my perspective.
And the antitrust actions that they have pursued
seem like weak versions of what the real problem is.
You mean in terms of changing
the underlying mechanics of the system?
Yeah, in terms, right, yeah, yeah.
There is this sense where it's a blunt instrument
for what they're trying to do.
It's a hammer when maybe you need a scalpel.
Some of the cases, I'm not a lawyer,
so it's tough for me to say how compelling they are
or not. You would know that better than I would.
Some of them, at least, people tell me,
are proceeding
and might be a little bit more promising
than we think.
There's some argument that they work as a deterrent,
that it's just, you better stay on our good side
or else we're gonna bring down this enforcement action.
But that's kind of a scary approach.
I mean, come on, look at how they continue
with wild abandon to build
and grow and consolidate.
It did look, around 2017,
like they were really gonna try to make peace with Washington.
And actually, as I say that, they really did.
They were enormously successful at cultivating a lot of people in power in Washington,
even as Trump and a lot of senior Republicans were railing constantly against the social media companies.
But what they were not prepared for was the flip to a Democratic administration. Although
it's tough for me to say, there's a lot of talk out of this White House and Justice Department that now that we're in power,
we're gonna use our four to eight years to do the thing
that these companies managed
to buy a reprieve from in 2016
when they kind of got Trump on their side.
I don't know if it's actually gonna happen or not.
I honestly couldn't say.
It doesn't feel like it.
If it is gonna happen, it's happening quietly,
which may mean it's not happening.
Yeah, so as we kind of start to round this out,
where do you, if things continue on this trajectory,
the way that they've been playing out,
like what is the, where is this headed?
Like, can you paint the picture
of the near and distant future?
So, it's hard to say, partly
because the platforms change a lot.
So we don't have a great sense of TikTok's influence
on politics and society.
Well, that's a whole other podcast,
especially with Chinese ownership, et cetera.
And the black box nature of it all.
Right, it's so black box.
And what it surfaces and doesn't
is so different from other platforms
that we don't have a great sense
for what the implications of that
are for users individually or collectively.
And there could always be another platform
that has some entirely different effect.
But I will say that I spent a lot of time
for this book in Brazil,
which is a country that is very similar
to the United States in so many ways,
in terms of its politics, its social divisions,
its racial divisions, its economic makeup,
and felt to me like it is a few years ahead
of the United States in terms of social media's influence,
which is just as big there as it is here.
And I profiled several people
who were YouTube influencers, basically,
who are now in high positions of government.
Bolsonaro, the president,
he kind of got a start as like a YouTube guy.
He was just like this fringe person
with very little constituency in the country,
but a huge influence on YouTube.
And I think the hints of the future
that I got from Brazil are that,
I think the relationship between politics
and social media is going to flip.
And I think that we are going to see less
of social media as a place
where some political influence plays out
and then feeds into politics
from kind of the bottom up
in the way that we saw in 2016. And I think that we are going to see more politics driven actively by and mirroring
social media in the way that in Brazil, you don't have lawmakers who are trying to appease Facebook.
You have people from Facebook who are now in government and the constituencies that they're
serving are the social media algorithms because that's what put them in power in the first place.
That's interesting.
It's hard to imagine that here though, right now.
You think so?
I don't know.
I don't think we're that far from it.
I don't think that we're far from,
I mean, in some ways with the like stop the steal candidates
and the Republican party who are rising up
and some of the QAnon people,
like these are already people
who came from social media into politics.
And I think that that is something that is,
we're gonna see more capture of that
in the Republican Party.
I don't know exactly what it would look like
in the Democratic Party.
I just can't say,
but it would not shock me
if we saw a version of that soon enough.
And what does that forebode from your perspective?
I mean, when you think about
the incentives of political leaders,
what are they trying to do?
They're trying to get reelected, right?
Or they're trying to go from the minority party
to the majority party.
And those incentives have been changing
in American politics for a long time
because of the collapse and weakening of the party system,
because of the
way that politics are increasingly nationalized, where you look at your lawmaker as a representative
of the president's party now, instead of as someone who you might have a more personal,
local relationship with. But I think we could be facing a much larger flip where
it's not just that politics are nationalized. If you're running for a House
seat in Congress, you're not just thinking, how do I fit into the national political conversation,
you're thinking, how do I fit into the social media conversation? And that means,
how do I get the right number of people on Facebook or Twitter or YouTube who are going to be
into what I'm doing? And that's just, it's a different market.
It's a different constituency and it's a different set of interests and incentives
than you have in traditional retail politics.
Sure, you have to understand the vicissitudes
of the algorithm and play that game
to literally game it in your favor,
which means it's gonna push both sides
towards the more radical fringes
of their respective parties and views.
And that's gonna then push the constituents
in that direction as well until we're so far apart
that it doesn't feel like the nation can cohere.
I mean, this is the dystopian
endpoint of this whole thing. And it's like insane to even
contemplate that that could be a reality, but it does feel more and more plausible.
Yeah. I mean, if that is going to be slowed down, that's a change that I think
can't just come from whatever we do with these technology companies.
Like you were talking about,
one of the symptoms of extremism
is people feeling left behind in society.
This, similarly, is a symptom of our political system's gatekeepers
really weakening.
And speaking of whole other podcasts,
like strengthening the party system
and strengthening political gatekeepers
is definitely a whole other podcast,
but that feels to me like the only way to kind of slow
or even start to reverse this trend.
Yeah, do you feel like the gatekeepers on Capitol Hill
are becoming more educated and savvy
about the perniciousness and ills of social media?
I mean, famously there were these hearings,
but sort of aside from that, there was this sense like,
these guys don't actually really know what's going on.
So I remember the hearings where some of the questions
were like, what's a Facebook page?
How do I post on YouTube?
Sir, we sell ads.
Right.
There's that whole thing.
There has actually been,
especially in the last like two years
and especially after January 6th,
I think a really impressive education on-
That's encouraging.
It really is.
And even the member of the House
who represents Silicon Valley,
which I was kind of blown away by,
has really come around
to a much more sophisticated understanding
of what the roots of the problems are
and what pressure points on the companies to push on.
I did a story about YouTube's algorithm
plucking out home movies of little kids
in various states of undress
that looked innocent on their own,
but then stringing them together
in a way that was meant to sexualize them
and to present them as soft core pornography, basically.
And after that came out,
there were letters from a couple of
senators that I was kind of blown away by how on point they were, where they were demanding Google
and YouTube change where in the product design process safety and social impact people would
be involved, which is something that I wouldn't have even known to ask about. But I ran
that by someone and they were like,
wow, they must have talked to someone
who left YouTube or left Google.
But I think they're starting to see
that they need to understand how the companies work
and how the companies make decisions
and then find little pressure points like this
that they can try to bring attention to,
to try to steer them in better directions.
And how receptive are the powers that be
at the YouTubes, Facebooks, and Googles?
Oh, zero.
Right.
Zero receptive.
Who's the worst?
It depends on how you measure it,
but the answer is YouTube.
I mean, just everybody who I have talked to,
if I would ask,
which is the one that is most pernicious
in its social impact
and which is the company that is most hostile
towards social pressures or towards political pressures?
No matter who I asked, their answer was always YouTube.
Sorry.
Yeah, it's unfortunate.
It doesn't mean that, of course, that everything,
I mean, I watch YouTube.
It doesn't mean everything on it is bad.
And it doesn't mean everybody who works there is bad.
But just the way that the company happens to be structured, the way that its systems
happen to work. And maybe because their destructiveness is less apparent,
it has gotten less attention than Facebook's. That's the argument some people make, that
they've been allowed to skate a little bit in a way that other platforms haven't.
They're segregated out and they're this kind of like
moneymaker for Google,
whereas Facebook is the main marquee thing for Meta.
It's like the ATM for Google, right?
In many ways.
What's interesting is as somebody who's on YouTube
and we're filming this for YouTube right now, right?
It's a constant conversation that we have here
about how we position our content on YouTube.
And of course I know that if we come up with a crazy
clickbait title and some wacky thumbnail,
we'll get more views and I just refuse to do it.
And so we suffer the consequences.
Like we don't get the views that I think
that our content deserves by and large
because we're opting out of playing that game.
But I need to sleep at night too, right?
And so, well, maybe we can,
so it's a constant thing of like, can we say that?
Like, is that okay with you?
Or is that, you know?
And it's like, it's not even a conversation I wanna have.
Like I just wanna put, you know,
episode number, blah, blah, blah, Max Fisher.
And what's tough is that you can wrestle with it
and torture yourself and make the right choice.
And someone else is gonna make the wrong one
because anybody can put anything on YouTube.
Someone will go the 10 extra steps, be more extreme,
be more provocative.
And the algorithm will find them even if it's, you know,
someone who is using a $70 camera and microphone.
Yeah, but call me naive, Max,
I still believe,
you're calling me naive before I've even shared my thought,
that if you play the long game over the long haul,
like ultimately, you know, water finds its own level.
How do you mean?
Meaning that if I, like, from my perspective,
if I keep just trying to put out the best conversations
that I can in the nuanced form that they, you know,
often are, that ultimately it will find its audience.
Sure, that's true, yeah.
But it won't happen quickly.
And it definitely won't happen as quickly.
There's no shortcutting that.
Yeah. And there's a trust
and a faith that comes with that,
that exists outside of any algorithm.
Right, and I mean, you are still finding an audience
on YouTube, thanks partly to the algorithm.
I mean, it's still helping.
And occasionally, you know,
we've had a couple
that have hit the algorithm lottery and gone crazy.
And then of course you try to reverse engineer it.
You're like, I don't know why that happened.
Yeah, I talked to some political activists who used YouTube,
and they were, in their view, trying to use it for good.
But they wanted to say, well,
we need to reach a lot of people because,
this is in Brazil, we're trying to change politics
in a healthier direction.
And one of them was telling me that after like six months,
they looked up and realized that they had endorsed
all these views that they had hated beforehand
because it was just so easy to follow those incentives.
But I mean, the thing is none of these platforms are all bad
and promotion is not inherently bad.
So you can still have a really good podcast
that reaches a really great audience on YouTube
and does the promotion on YouTube
basically for free, which is pretty cool.
Yeah, a hundred percent. I mean, I think it means that you have to shoulder responsibility for
how you use it, understanding that the deck is stacked against you because it's
so powerful and addictive. But to the extent that you can manage to restrict yourself
or to come up with rules that work for you,
it can be a positive.
My whole career is based on the internet.
It's on YouTube and Instagram and all these places.
And as a result of Twitter and Instagram,
I've been able to connect with an unbelievable number
of extraordinary people that have enriched my life.
So it isn't a black or white thing,
but I think it is just more important than ever
for people to realize how powerful it is.
And as your book so beautifully illustrates,
it is changing our behavior, whether we like it or not,
no matter how resistant we are to that idea,
because it's an assault on our agency
and sentience and all of that. This is happening.
And so please, you know, reevaluate your relationship
with these platforms and maybe you can kind of end this.
You already alluded to how you've made
some behavioral modifications, but leave us with,
you know, some poignant thoughts on just how powerful it is
and maybe some other ways in which we can modulate
or think about our information diet
and the extent to which these platforms are useful or not.
I think that you hit the nail on the head.
I think the dilemma that you described
is the dilemma, in microcosm.
It's like food addiction, you still have to eat, right?
Right, yes.
So we're not asking you to be a Luddite.
I'm not gonna be a Luddite.
No, I don't, I don't, you know,
I sometimes will hear people say,
throw away your smartphone
and you can't throw away your smartphone.
That's these- You can be like Johann
and go to Fire Island for a summer,
but you know, ultimately he's back on the phone.
Yeah, right.
We live in a world that is dominated by these platforms
and maybe one day that will change.
But as an individual living in that world,
we have to make peace with these systems
that we are gonna have to use
to understand the world around us, to relate to people.
And we're all facing a microcosm of the same dilemma
that you described, which is how to be moral on them,
knowing that the platforms make it more appealing
and enticing and rewarding than it's ever been
to do things that we would otherwise consider immoral.
And they make it harder and harder
and more of a disincentive to try to be responsible
with even just the individual words that we might use
in dashing off a tweet.
Because each of these things might feel like you're just at a stoplight
and you have a few seconds and you're just going to dash off a comment,
but they all play into this broader system.
They're all training the algorithm.
They're all training you.
They're training your peers, other users.
And maybe the best thing that you can do,
I mean, you should try to modulate your usage.
You should try to be really thoughtful about
how often do I really have to be on it? I post like 5% as much as I used to. But probably the
best thing you can do is just to be really aware about when you go to tweet something or post
something or to record something on YouTube, being thoughtful about, is this because I think this is
a good idea and what I want to do, or is it because this giant corporation
has nudged me towards doing that
because they wanna make a little bit more money off of me
and try to addict other users
to make more money off of them?
And what do I want my input into this giant system
that is shaping everything in the entire world to be?
And conversely, applying that same rubric
to your consumption.
Like, do I really need to watch this video
or click on this link or whatever?
Like I think just being much more mindful
and responsive versus reactive.
And there are little tricks you can do.
You can set up times where your phone
will shut off certain apps.
You can turn your phone onto grayscale,
which is actually incredibly helpful
at making it less addictive.
It doesn't take away too much from it.
But ultimately, I think it's just about knowing.
And it's just about trying to find that will
to decide what the right thing is gonna be
and then to do it,
even if it might feel really good
to look at some more tweets.
It's tough, man.
My eldest stepson, he's 27 now.
He and his brother are among the most analog people
that I know, which speaks to, you know,
how that generation is different. It's interesting.
Like some young people are more analog
than a lot of people that are my peers, right?
Yeah.
So he went and got the light phone.
You know the light phone?
No.
So it's a cell phone that comes
with just a rudimentary stack of apps.
Like you can listen to podcasts on it.
That's cool.
But it's a black and white, like LCD screen.
Oh, I think I have seen this.
And you can do texts, but they make it all very hard, right?
Like, and it's a tiny little thing.
And he's like, this is what I'm doing now.
Like, forget the iPhone.
And he lasted a couple months,
but it just became impossible for him to be in the world
and communicate effectively with people.
And he ultimately ended up going back
to these things,
because this is how powerful they are.
I bet that you know exactly where your phone is right now.
Yeah, I think so.
Yeah, I bet everybody does. I know where my phone is.
They've done these studies.
It is an appendage.
It might as well be embedded inside my brain.
You know what is a really fascinating thing to do?
Sit down with someone if they have their phone out
and take their phone off the table and watch them.
It's like you took their arm off.
Just watch how they react
to seeing their phone taken away from them
because it does feel like an extension of ourselves.
That's not something that we acknowledge to ourselves
because it's not fun to admit that.
But if you become aware of it,
I think it makes it a little easier
to be kind of thoughtful.
It's like, okay, that phone is not actually me.
It's not actually my community.
It's not my social network.
It's not my friends and family. It's just this little beeping device that has some cool stuff on it occasionally.
And it'll seem so rudimentary in the singularity to which we're headed, right? We'll look back.
Can you believe they carried these things around? I know. And when you put it that way,
it's amazing how vast and rapid the changes have been, given that smartphones, the kind of modern smartphones that we have now, they're so new.
So new.
That was what was crazy about going to Myanmar.
It's hard to remember pre-iPhone, and yet it was kind of yesterday.
It was kind of yesterday.
And you can go to countries like Myanmar, which I went to right before they opened up to the outside world, and
there were no smartphones. You couldn't get a SIM card. And then I went three years later and
everybody had a smartphone, and it's the same people, but the way that they think, the way that
they relate, the way that they talk had completely changed over three years, due in large part, and
this is something that people there would tell you and told me, to this little piece of technology that had dropped into their hands.
Yeah. So the iPhone, was that 2006, 2007 when that debuted?
That sounds right. I got mine in 08.
Yeah. But it was still a number of years before the app economy was really up on its feet. So
it wasn't necessarily the iPhone per se, because that was just a BlackBerry
with a touchscreen until the app store really kicked into gear and all these third-party apps
started to become mobile first. That didn't really kick in until
2010, right? So we're talking about 12 years. And everything has completely changed.
Completely changed. I don't know what it's gonna be like in another 20
or we'll all be farmers in 20 years.
Yeah, is it gonna be Mad Max Fisher
or are we gonna go back to some kind of agrarian,
bucolic society?
Or we're gonna find out.
I don't know.
You're the soothsayer, you're the truth teller.
I don't know, maybe.
That's the next part.
Well, we'll have you back to explain that.
Like I said, at the outset,
this is a really important book,
"'The Chaos Machine."
As of the recording of this, it hasn't come out yet.
I'm sure it's gonna be huge.
You got a beautiful, amazing review
in the New York Times today, right?
I saw that. It just came out.
I knew it was coming out this morning.
I did not sleep at all. It's pretty cool.
I was so nervous, but I was happy.
I was very relieved.
No, it was very good.
It was very good.
And this is gonna put you right in the center
of this important conversation
that so many of us are having.
And like I said, it is, you know,
if not the most important,
one of the most important conversations
that we need to be having, this affects all of us.
And this book really does an incredible job
of not only unearthing truths,
but really taking the reader
through the most comprehensive kind of traverse
of all the issues that come into play here.
So, you know, well done.
It's, I mean, I can't imagine the amount of work
that went into putting this book together.
It's incredible.
You've written a book.
Yeah, but not a book like this.
This is a different thing.
This took you all over the world
and hundreds of interviews with all kinds of people.
And it's a book about international diplomacy.
It's about technology.
It's about history, sociology, psychology, neuroscience.
Like everything comes into play here.
Well, I'm excited to see it out in the world.
And thank you for shining a spotlight on it.
And thank you for having me on.
I really appreciate it.
Yeah, absolutely, man.
It's a pleasure talking to you.
So if people wanna connect with you,
do you want them to connect with you on social media?
Listen, I'm as addicted as everyone else.
Well, you got a book coming out.
So I'm sure you're online now in a way
that probably you're not generally.
I'm already thinking through how much outrage
can I get into my book promotion tweets
to get the most engagement.
Right, are you AB testing?
Yeah, that's right.
I've got a room full of monkeys typing tweets
to see what's getting the most.
In Russia.
That's right.
Yeah, Vladimir Putin is helping me out.
Excellent, excellent.
Okay, so if people wanna connect with you,
where do you wanna send them?
Twitter, max underscore Fisher is probably the best place.
Or if you wanna email me, max.fisher at nytimes.com.
Right, and they can read your writing in the New York Times.
nytimes.com.
Yeah, cool.
All right, man.
Thanks, man.
Yeah, thank you.
Peace.
That's it for today. Thank you for listening. I truly hope you enjoyed the conversation.
To learn more about today's guest, including links and resources related to everything discussed
today, visit the episode page at richroll.com, where you can find the entire podcast archive,
as well as podcast merch, my books, Finding Ultra, Voicing
Change, and The Plant Power Way, as well as the Plant Power Meal Planner at meals.richroll.com.
If you'd like to support the podcast, the easiest and most impactful thing you can do
is to subscribe to the show on Apple Podcasts, on Spotify, and on YouTube,
and leave a review and or comment.
Supporting the sponsors who support the show
is also important and appreciated.
And sharing the show or your favorite episode
with friends or on social media
is of course awesome and very helpful.
And finally, for podcast updates,
special offers on books, the meal planner,
and other subjects,
please subscribe to our newsletter, which you can find on the footer of any page at richroll.com.
Today's show was produced and engineered by Jason Camiolo, with additional audio engineering by
Cale Curtis. The video edition of the podcast was created by Blake Curtis with assistance by our creative director,
Dan Drake. Portraits by Davy Greenberg and Grayson Wilder. Graphic and social media assets, courtesy of Jessica Miranda, Daniel Solis, Dan Drake, and AJ Akpodiete. Thank you, Georgia Whaley,
for copywriting and website management. And of course, our theme music was created by Tyler Pyatt, Trapper Pyatt, and Harry Mathis.
Appreciate the love, love the support.
See you back here soon.
Peace.
Plants.
Namaste. Thank you.