Tech Won't Save Us - How YouTube Normalizes Right-Wing Extremism w/ Becca Lewis
Episode Date: January 21, 2021

Paris Marx is joined by Becca Lewis to discuss YouTube's history of incentivizing extreme content, how the storming of the US Capitol shows the power of media spectacle, and why we should see social media platforms as media companies.

Becca Lewis is a PhD candidate in Communication at Stanford University. She's also written for a number of publications, including NBC News, Vice News, and New York Magazine. Follow Becca on Twitter at @beccalew.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.

Find out more about Harbinger Media Network at harbingermedianetwork.com.

Also mentioned in this episode:

Read Becca's report for Data & Society, "Alternative Influence: Broadcasting the Reactionary Right on YouTube." You can also read her articles on YouTube radicalization, the final report on the Christchurch shooting, and why Trump's Twitter ban was an editorial decision.

Jacob Hamburger explains why the "intellectual dark web" and its claims about political correctness are nothing new.

Alex Nichols explains how New Atheism was a precursor to the IDW and alt-right influencers.

The video of Ben Affleck pushing back against Sam Harris' Islamophobia on Bill Maher's show, which was supposedly Dave Rubin's "classical liberal" awakening.

Zeynep Tufekci describes how YouTube's recommendation algorithm recommends increasingly extreme videos.

Twitter workers demanded Trump be banned before Jack Dorsey announced the decision.

People who inspire how Becca thinks about platforms: Robyn Caplan at Data & Society and Tarleton Gillespie at Microsoft Research.

Support the show
Transcript
One doesn't have to go all the way down to like the darkest parts of 8chan to get into a space where people are willing and eager to commit these acts of violence.
Like that's right there out in the open.
Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx, and this week my guest
is Becca Lewis. Becca is a PhD candidate in communication at Stanford University,
and she's written for a number of publications, including NBC News, Vice News, and New York
Magazine. In 2018, she wrote a report for Data & Society called Alternative Influence: Broadcasting the Reactionary Right on YouTube, where she looked at the ways that right-wing YouTubers build their audiences,
spread their messages, and introduce people to even more extreme ideas, even without the need
for recommendation algorithms or other technologies. These are the incentives that are built into the platform itself.
With the Trump presidency coming to an end and the role that social media has played during his presidency, and especially with the recent events with his supporters storming the US Capitol
building, I thought this would be a great time to chat with Becca and get her insights on the
impact that social media and YouTube in particular has had on
growing these movements during the Trump presidency, but even well before that.
This is a fascinating conversation, and I think you're really going to like it.
Tech Won't Save Us is part of the Harbinger Media Network, and you can find more information
about that at harbingermedianetwork.com.
If you like this conversation, make sure to leave a five-star review on Apple Podcasts,
and make sure to share it so other people will find it and hopefully listen in.
If you appreciate the work that I put into making this show every week, you can join supporters like
Isaac, Stephanie Hardman, and Kevin Mahoney by going to patreon.com/techwontsaveus
and becoming a supporter. Thanks so much and enjoy the conversation. Becca, welcome to Tech Won't Save Us.
Thanks so much for having me.
I'm excited to speak with you. Your research on YouTube and how that influences radicalization,
I think is super important, like especially right now, unfortunately. So obviously,
I want to dig into that with you. And before we get into your recent findings and recent events,
I wanted us to kind of get some insight on how we got to this time, I guess, or how we arrived here,
you know? And so I did want to start because, you know, I feel like social media and its role in
radicalization has become a really important narrative during the Trump campaign, right?
But obviously, it has been having this effect long before Trump came to power, right?
And it had to start somewhere for it to build to that point anyway.
So when you look back before, say, 2016, 2015, what do we see happening on YouTube and other social media platforms, if you're aware of
that, that kind of builds up to this point where we have a Trump figure benefiting from this
radicalization that has occurred over the past number of years through YouTube and these platforms?
It's such a good question. And I think, for an academic who, like, loves to talk about history stuff, I could go back, like, way before the start of the Internet even, but I'll try to limit myself.
Even from the early days of the Internet, we have this stereotype, I guess, of white supremacists as unsophisticated, but the movement was taking advantage of mass media technologies in really sophisticated ways. You know, Stormfront, which for a long time was kind of the biggest white supremacist forum online, they were really early, ahead of the curve, in terms of recruiting people and getting this big presence where they
could start really networking across lines. I tend to, as you mentioned, look at YouTube specifically. And so I think the role of social media platforms is huge, and the signs were there with YouTube quite early on as well. I think that perhaps the more recent focus is partially because of, you know, Trump getting elected, of course, and so everyone turned their focus to information online. But also because I think that there's an outsized focus on the role of the YouTube algorithm, the recommendation algorithm specifically, which does play a really important role in radicalizing people. For those who aren't familiar, the way it works is when you're done watching a video, it either immediately autoplays a next selection or presents a series of options for what to view next. And the theory is that the YouTube algorithm continually serves up kind of more and more extreme content. And the recommendation algorithm is relatively young in terms of YouTube's history. But actually,
even before that, there's been kind of a strain of, I guess you could call it like reactionary
celebrity culture on YouTube. And that's the piece of it that actually I'm more interested in,
and that I focus on more in my research. And so early examples of this, I think probably the two biggest: there are people who coalesced around Ron Paul as a candidate early on. And even though Ron Paul wasn't deeply active on YouTube himself, then kind of the next iteration was people who did start being active there, I think, kind of towards the end of George Bush's presidency, when I think a lot of the way that liberalism at the time was expressing itself became, more than anything else, this culture of
atheists who were worshipping these, well, worshipping is a very pejorative term, but we all worship celebrities to certain degrees, I should say that up front. But it was a movement of these few celebrity atheist men, so in particular Richard Dawkins, Christopher Hitchens, Sam Harris, and I always forget the name of the fourth one. There's four of them.
I always forget his name too.
The four horsemen of atheism. And so it really was a movement built around them. And when some women
in this movement started voicing kind of concerns about the ways they had been treated at conferences
and kind of the way that women's voices were marginalized within this group, then it very
quickly shifted into kind of an openly reactionary space, with, at times, you know, Richard Dawkins leading the
charge on that. He was, you know, kind of openly criticizing feminism and women who brought their
concerns to the fore. So I think of that in some ways as the first real precursor to all of this
stuff on YouTube. And then right along with it was Gamergate, which actually also to a large extent
unfolded on YouTube. I think that's fascinating. And I'm happy you brought up the new atheists as
well. Because I feel like when we talk about, you know, what's going on now, and the people who are
participating in the intellectual dark web, the alt-right, like these communities, I feel like
the link is often drawn back to Gamergate, rightfully,
but I feel like it's not drawn back further, right? And the New Atheists were the first one
that I became aware of as well, right? And that I noticed. And when I was reading through your
research, what you were describing were things that I remembered seeing all the way back then,
right? With the reaction videos and the way that
people were taking advantage of the things that Sam Harris and Richard Dawkins were saying in
particular, to then kind of build on it, pull in larger communities around those kind of ideas or
thoughts or whatever and become YouTube, like mini celebrities or whatever, by kind of echoing the
same sorts of things that they were saying and collaborating with others. And this whole kind
of community formed around these really reactionary kind of atheist views that were kind of focused on
Islamophobia and were often very supportive of the US State Department and the Iraq War and
things like that, that a more
progressive atheism would be against or something, right? And so I think one of the
interesting kind of points that I see that kind of links them together is, you know, in your report,
you talked about how Dave Rubin's kind of, like, enlightening moment, or at least as he describes it, was seeing Sam Harris go back and forth with Ben Affleck on Bill Maher, right, about this topic of Islam and Islamophobia and
things like that. But then there are so many people who come from new atheism that then show
up in the intellectual dark web, Sam Harris, Richard Dawkins, Ayaan Hirsi Ali, all of these
people, right? So I wonder what you make of those kind of links
going all that way back. And I guess these similar ways of building community that existed then,
that still are really important to what drives YouTube and these really far right communities
now, and how that has helped them grow and kind of increase their influence over time.
Yeah, I'm glad you mentioned the connections to the intellectual dark web, because I think
perhaps more than anything else, the direct link is there. I mean, the intellectual dark web is
a direct ancestor, if you will, of the new atheist movement with, yeah, many of the same people
involved. And, you know, for anyone who hasn't heard of the intellectual dark web, because I
know that can be a bit hit or miss, it's kind of this hodgepodge of media
personalities and academics and ex-academics. And their big claim is that they come from across the
ideological spectrum, but that they're all very upset with how closed off our conversation has
become because of a certain strain of progressive leftist who will shut
down conversation by calling things racist or sexist. And Dave Rubin, who's a talk show host
on YouTube, it is interesting watching his early content. So he had worked at the Young Turks and
then struck out on his own. And that's exactly right. He's really good at self mythologizing.
And he used this as kind of an origin story. He talked about this time that Sam Harris,
the venerable atheist, was on Bill Maher's show. And so was the actor Ben Affleck. And
Sam Harris said, we need to be able to criticize Islam in a free society. And Ben Affleck said that that
was racist and bigoted for him to say that. And Dave says this is an incident where, you know,
someone shut down speech, and actually the tenets of a liberal society are that we should be having these conversations. And then he basically used that as a justification for, you know, the
first several months of his show, he had on guests that essentially were just
criticizing Islam. And it went from, you know, people who were identified as like ex-Muslims,
all the way to people like Tommy Robinson, who is kind of a well-known extremist in England,
who's been to jail and, you know, has been affiliated with various like
far right movements there. And, you know, all kind of under this guise of, oh, we're just having kind
of these open conversations. So anyways, that's all a way of saying that, yes, there are direct
lines. But there's also more, which I think you're also getting at. There's lines to be drawn, or similarities, in terms of, like, the way that the phenomena unfolded. And so there's two aspects that I really focus on in terms of, like,
YouTube politics. One is this power of YouTube celebrities. And you know, academics like to use
the term micro celebrities, sometimes influencers, YouTubers themselves often call
themselves creators. So there's any number of words you could use to describe it. But essentially,
the phenomenon is that, you know, you build up kind of a niche audience on YouTube. And I mean,
sometimes niche, sometimes quite large. A lot of the people that I've looked at had anywhere from
hundreds of thousands of viewers at a time to millions
of viewers at a time. But because it's on YouTube, there's an intimacy there that isn't there kind of
on cable news, for example, or in the byline of a New York Times article. And so people talk
directly to their audiences, they foster a sense of connection with them.
And the audiences do legitimately have a way to speak back in the comments.
And so they feel that they're being heard.
And for a long time, starting as early as the New Atheists, I think that there's been this sense that people are coalescing around a shared set of ideas, when actually what's happening is they're coalescing around a shared set of celebrities. Not to say that the celebrities don't have ideas, but there's
a lot more meta conversation about ideas than there are actual ideas. And you know, if you look
up scholarly writing that addresses New Atheism, basically they say there are no actually new intellectual concepts that have been introduced.
It's really just a popularized version of atheism,
which is fine, but there's a tension there.
So the celebrity piece is the first piece.
And then social networking is the other one.
So it can be very advantageous
for individual creators on YouTube
to collaborate with other creators,
because you get exposed to bigger audiences, right? If you go on someone else's show,
and they have a bigger platform than you do, that's great exposure for you. And also, it helps
kind of as a branding mechanism, right? Like we are all these creators who operate in this space,
and we kind of have this shared set of beliefs. And at the same time, viewers who can leave comments and go to whatever subreddit page
is about this, they can actually take part and become part of a community as well around it.
And so really, it's the combination of celebrity and social networking that I've tended to look at
and see as kind of some of the most important factors. Yeah. Now, I think that's a fascinating piece of it, right? Because going
back to what you said about the algorithm, and how there's a lot of focus on the recommendations
algorithm on YouTube being a source of radicalization. What you explained in your work
is that, obviously, that certainly is a source of it, right? But there is a much bigger thing at play here
where these YouTubers are collaborating with one another
to increase their reputation, to share audiences.
There are all these things built into the platform
to incentivize this behavior,
but then that also kind of leaves the door open to,
I don't know, say you watch one creator
and they're not so radical, they're a libertarian or whatever.
But then they talk to a white nationalist and then you start watching the white nationalist
videos and then they collaborate with a fascist or something.
And then you go to those videos.
And there's not a lot of pushback on these ideas in some cases.
So, you know, these people come across as seeming
really intelligent if you don't have kind of the knowledge to back it up. So how do you see that
kind of playing out? How do you see these kind of incentives in the platform on YouTube, kind of
encouraging radicalization beyond just algorithms pushing videos on people? Yeah, I think that's a
really good way of summarizing it that there are
all of these incentives there and they can be resisted. But a lot of people do find themselves
leaning into them. Building an audience on YouTube is not an easy task. You know, a lot of people try
it and are not successful. On top of that, it's really, really difficult to know why YouTube is making certain decisions that it makes. And so
a lot of creators really are kind of operating somewhat in the dark and trying to make whatever
decisions they can based on the cues that they get. So what happens is, for example, if there
are already audiences out there, which there have been, that are hungry for kind of far right content
that, you know, dunks on different feminists or all of these different things. Creators often will
see that coming through in their comments. And then they'll act on that and say, you know,
the audience has been asking for a video criticizing this feminist or addressing the
issue of Black Lives Matter or any of these things.
And so they lean into that because that's what their audience is asking for.
Not to say that they don't believe those things themselves. But I think one of the biggest issues with YouTube is that it actually can become impossible to tell kind of who is genuinely
espousing these ideologies from a place of pure belief and who is doing it because
it's profitable, right? The profit incentive is there. And so you have people that, you know,
have a reputation for being like, quote unquote, grifters in the space who are just doing it for
money or fame. You have people who seem to be, like, genuine adherents to these ideologies.
And then I think there's everything in between, right? There are people who may kind of convince themselves over time that they believe in what
they're saying, or who may genuinely get radicalized over time. So another thing I've
observed is different influencers radicalizing each other. And this is something that, you know,
you see on Dave Rubin quite a bit as well. I would also argue that he's leaned quite a bit into what his audience demands are,
but he'll have on prominent conservative or far right figures.
And he is kind of the pinnacle of this: he, you know, claims to be giving interviews, but really doesn't push back on anyone at all, right?
It's about the friendliest interview anyone can hope to get. And people will start espousing kind of openly racist, often disinformation or
conspiracy theories. And, you know, he kind of then incorporates that later on into his talking
points. And so you can see the way that these things snowball. And it's a boon for him to have some of these controversial figures on because
controversy generates viewership, right? Whether you're hate watching or watching out of love,
that money's still going into his pocket. Absolutely. And with the way that these ideas are
promoted, pushed, I don't know exactly how radicalization works, how it takes place, but
I guess you watch more and more of this and come to believe the things that these creators are
saying if you haven't already believed them already and are just seeking it out because
it's something that you're interested in. But obviously, in the past couple months,
there has been a final report, I think, on the events in Christchurch, where this man went into a mosque and shot 50-odd people. But that was something that was streamed online at the time it was happening.
And in this report, it actually said that YouTube was the source of this person's radicalization.
So when you hear things like that, what does that tell you about the
material impacts that YouTube is actually having? And is YouTube doing anything to address these
really systemic problems from its platform? Such a great question. Yeah, that was so
infuriating and heartbreaking to see that report because not only have kind of researchers in this space been
talking about the threats posed by YouTube in terms of Islamophobia and bigotry, but, you know,
even more importantly, Muslim communities have been talking about this forever and, you know,
have not been taken seriously or listened to. I think that his radicalization showed a couple of things. We don't have the full data on like
every single video he watched. But what that report did publish was that he actually contributed
money, he sent donations to multiple outlets and people that, you know, are YouTubers who I've
looked at, and you know, who are well known within that far right space.
And so clearly what that shows is that he had developed some sort of investment in these people whose content he was watching.
Right. There seems to have been that parasocial relationship there that gets developed where a viewer comes to feel very devoted to the people he's watching. In particular, a couple of the figures are like infamous at this
point for being kind of radicalization vectors. So Stefan Molyneux is one of the main ones who
YouTube has taken off of its platform at this point. But he, for the longest time was incredibly
strategic about presenting racist views only through the lens of what's
known as scientific racism, presenting racist ideas as if they are scientific fact. And that presentation was able to get him to stick around for many, many years on YouTube.
In terms of what YouTube is and isn't doing to address it, you know, it's, it's tough, because
they have in the past year or so taken a bit more of an aggressive stance on, you know,
kicking off some of the worst offenders in terms of, in particular, kind of antisemitic content, openly white supremacist content, and so on. But there's a whole lot they haven't done. Let me put it that way. And one thing they haven't done actually is they really haven't
grappled with the Islamophobia on the platform. And I think part of the reason behind that is
because Islamophobia is so normalized in our political culture right now, that a lot of quote unquote, mainstream YouTube creators who
have mainstream, again, quote unquote, legitimacy are promoting Islamophobia. So I'm thinking of
people like Ben Shapiro, and Steven Crowder, and Dennis Prager of Prager University. These are
people that get millions and millions of views on YouTube, and who often have like cozy relationships with Republican lawmakers and stuff. These are not
kind of people on the fringes of YouTube. And I think that YouTube is too scared to or, you know,
has no incentive to grapple with that issue, right? Because they are terrified of seeming political. And to grapple with an
account like Ben Shapiro's is to appear as if you're being more political than just to leave
him alone, which is very silly. But you know, inaction strikes people as less political than
action. So I think that that's one element of it. But then on top of that, I think that,
you know, de-platforming or taking down certain creators, it's very good and effective. But someone like Alex Jones has been banned now
from basically all of the mainstream social media platforms. And what has come out is that it has
significantly cut his viewership. He made a big stink at the time of, oh, this is only going to
make me stronger and more powerful than you can possibly imagine. And that's not what happened,
right? So de-platforming can be an effective way of
cutting down the influence of individual extremists. But what it doesn't do is get at
those kind of baked-in incentives, right? So it can very quickly become a bit of a game of whack-a-mole
in terms of, you know, okay, we'll ban these few channels, and then other channels become more
popular in their place, espousing
similar ideas, and then they ban those and so on and so forth. As long as people kind of still have
that incentive there. And frankly, as long as kind of we're continuing to live through this
political moment dominated by some of these ideas, you know, it's going to continue to be a problem.
Yeah, no, I think you're absolutely right. And, you know,
I would just say to follow up on your points on like the Christchurch shooting and what came out on that report, you know, there was a similar, not so deadly, but a similar shooting in Quebec as
well. And in that case, for the person who carried out that attack, Ben Shapiro was the person whose content he was interacting with the most, right? Obviously,
there was a long line of people, you know, whose videos he was watching and things like that.
But Ben Shapiro was the main one. And Ben Shapiro still promotes really Islamophobic stuff.
And when the police were interviewing him later after the attack, you know, what he was saying was that he thought that these people were going to kill his family, and that's why he went and did this. And it was used as this way of showing the degree to which these videos and just this, like, rampant Islamophobia can warp people's minds and what they really think is happening, and then these just nice Muslims going to pray, just the same as, I don't know, anyone's family would go to church, are seen as this threat to the community by these people who have consumed so much of
this content. And I think one of the pieces of that report that stood out to me and that we've
seen time and time again, is that the New Zealand government, and again, it's not just the New
Zealand government's problem, was paying attention to, or looking for, Muslim radicals or people inspired by ISIS or whatever, right? And not paying attention at all to people who were white nationalists or whatever. And it seems like that's come up time and time again as being a problem that is just ignored.
Absolutely. It's so infuriating. It's still the case in the United States as well that when you talk about radicalization and whatnot, the dominant narrative
is still around Islamic extremism and so on and so forth. And I think it's partly because it's
easier for Americans to wrap their head around because Muslims are already kind of treated as the other
and all of these different things. And because there are a lot of Muslim immigrants, you know,
they're also able to frame it as, like, oh, you know, in terms of xenophobia and all these things.
When you start looking at the huge number of people getting radicalized into things like QAnon
and whatnot, it's much tougher, first of all, because it's predominantly white men who do have the privilege of being white men and being treated
seriously. Also, because so much of it is happening within the confines of kind of mainstream politics,
right? So I think I struggle with this a lot. And I know a lot of other researchers in the space
struggle with this, that, you know, we talk about radicalization, and I think it is important to talk about it, but it's tricky to talk about it in a way that doesn't start making it seem like people are moving towards the fringes of society, when a lot of what's happening is simultaneously that these ideas are getting more and more mainstream,
or they've been latent, and now it's becoming, you know, more and more openly stated and all
these things. So to your point about, you know, the shooter who was watching Ben Shapiro,
I think that's a perfect example, right, that one doesn't have to go all the way down to like,
you know, the darkest parts of 8chan or whatever, to get into a space where,
you know, people are willing and eager to commit these acts of violence. Like that's
right there out in the open. And it's on a lot of cable news too, right? Tucker Carlson,
in particular, is someone who a lot of the YouTube creators I look at, they're huge fans of his. And they try specifically to
kind of feed ideas to him for coverage. They'll sometimes tag him in their YouTube videos and so
on. Who knows whether he's actually, you know, getting talking points directly from them or from
elsewhere. But a lot of the talking points are
exactly the same. And he has an even bigger audience than a lot of these YouTubers. And
again, kind of that seeming credibility that comes with having kind of a primetime cable news slot.
Not to mention, of course, all of the people that have been elected to Congress who kind of either flirt with some of this stuff or have outright embraced, you know, QAnon and some other conspiracy theories.
No, absolutely. And I think that sets us up really well to discuss the events of, I don't know,
a week or two ago, where this group of QAnon Trump supporters stormed the Capitol building
in the United States. And I think one of the
really notable aspects of that is how many were streaming live or taking photos to post on
social media and how this was really an event. Obviously, they were storming a seat of power.
But for a lot of these people, it seemed like an important piece of that was to use it in
some way to kind of bolster their online presence.
And obviously, that was not all of it.
There were people who were there who did want to attack Congress people or senators, potentially
kill them.
I'm not putting that to the side.
But there were also a lot of people who seemed motivated by online communities or growing their followings
or their legitimacy or whatever, right? So, you know, that's a general observation. But I wonder
when you observed what happened there, what you were thinking about and what stood out to you
about, you know, that whole event, I guess. Yeah, I mean, I agree completely with your point that
it was in many ways, like, first and foremost, a media spectacle. And, you know, when you think about it, why wouldn't it be? Because the way that Trump himself exerts his power and influence is through the use of media spectacle. And that, in fact, can be incredibly powerful in and of itself. So I think that, as you said, like, not to take away from the immediate threats of danger
that could have taken place and that did to a certain extent take place. But also, I think that,
you know, there are some people that I've heard talking about this as like a bungled attempt,
and you know, that actually, they didn't know what to do with themselves when they went in.
I think there's certainly validity to that. But I also would say that I think that we shouldn't minimize the role
that media spectacle plays in all of this, right? That now it has become the complete focus of
conversation for everyone. It has, you know, for certain people, it has kind of increased their
fame. But even more than that, I think it has put these images into our minds that show them kind of, you know, literally like desecrating
what is supposed to be considered this, you know, symbolic place. And even though that's kind of
like a soft power to be exerted, what Trump has shown is that actually it's possible to build
kind of very real and concrete power through that. And so I think like, to me,
it's part of a continuation of a trend of media spectacles that like potentially will continue.
And we have no reason to think that that sort of phenomenon would stop.
Absolutely. I think another thing that stood out to me was just as we were just saying about,
you know, the way in which many governments pay attention to or seek out the
kind of radical Islamic terrorists and ignore white supremacy, fascism, things like that,
that are very real and in our midst. And, you know, in the United States, reports have shown
that many more people have died at the hands of white nationalists and far-right terrorists than,
you know, Islamic radicals or whatever. But it also kind of really backed up that point, right? Because
law enforcement the day of and the days after said that they knew nothing about this. And the way
that the people who stormed the Capitol felt that they had impunity, that they could stream all this
stuff, that they could take images and that there would be no consequences, right? And we saw it even in the way that many police officers reacted, and in some cases were even taking photos with them and possibly assisting them.
And I know that they're still looking into the degree to which police or even congresspeople
might have assisted with what happened that day.
But I wonder if that's another aspect of this as well,
in the way that they were able to show that off, it kind of shows their power in a way,
because they could do this and at least immediately not face consequences. Some of
them are getting arrested now. But that does seem to be, even though they didn't take over
the government, that does still seem to be a way to show power and force.
Absolutely. Yeah. And I think, you know, like some of the people were chanting as they went in,
like, this is our house, this is our house, right? And that really is a gut punch in the sense of like, well, historically, yes, it has been a white supremacist branch of government, right? And so
there is kind of this thing that I think makes,
it's uncomfortable to talk about, but it is the relationship between kind of the structural
racism within the United States and the United States government, and then kind of the overt,
explicit racism of the group storming the Capitol, and how actually there's kind of
an uneasy symbiosis there. And so, you know, that I think
that not only manifested in terms of kind of the police presence or lack thereof, but also in terms
of tracking kind of the media narratives that emerged in conservative spaces after the fact.
So I was keeping track of like, Fox News and some of the other, you know, Newsmax
and One America News, these, you know, hyper partisan cable networks, as well as some YouTubers
who are considered a bit more mainstream conservative than some others. But overall,
the trend was, you know, they denounced the people that went in, they kind of, you know,
said they didn't support any of the violence, they didn't support that small group. But at the same time, they did a lot of the rhetorical heavy
lifting for this group by kind of indulging in a range of different narratives, one of which was
trying to equate it to the Black Lives Matter protests that happened throughout the summer,
which involves like a whole slew of false equivalencies, right? But to me, that's like a very clear case of like a larger media structure
carrying water for kind of this more overt white supremacist group.
Absolutely. But I wanted to ask you, like, obviously, you described part of their response
there. But another part of it was to talk about free speech and how they're being silenced yet
again, right?
And, you know, obviously, in the days after that, Twitter closed something like 60,000
or 70,000 QAnon accounts.
And it seemed like Facebook finally took action.
YouTube seemed to belatedly like delete Trump's account a little bit after everyone else.
But I wonder what you make of this argument that
closing these accounts is like a violation of free speech or is censorship or something like that,
because for so long, these platforms have done so little, it seems, to kind of take any action
against these kind of people or discussions that ultimately do lead to violence.
We see with Charlottesville, obviously the storming of the Capitol, what happened in Christchurch in
Quebec, so many examples of this, right? So I guess what do you make of that argument that
closing these accounts is some kind of violation or censorship or something like that? Do you think
that's a legitimate argument? Yes and no. A lot of my thinking on this is basically deferring to scholars who focus
specifically on platform governance. So in particular, I'm thinking of, like, Robyn Caplan at Data & Society and Tarleton Gillespie at Microsoft Research and a few others. But what they talk about is kind of the way that social media platforms are and always have been media companies. And so when you think about it through that lens, it's less about a form of censorship and more about an editorial decision, right? A news network doesn't have to give the president airtime just because he feels like saying something, right? Often they do if
there's, you know, going to be a presidential statement, but they also have the option not to
do that. That's what Twitter, and Facebook and some of the others, are exerting their right
to as well. We don't think about them as media companies as much because they've spent a really
long time telling us that they're not media companies, that they're something completely different. And I think that
society and you know, all of us to a certain extent, kind of drank that Kool-Aid for a while.
And we're hoping that it would play out to be kind of this more democratic thing where platforms
aren't kind of holding this power. But I think that that dream is fairly gone.
So I think there's that element of it. At the same time, they're media companies unlike any other media companies that we've known or experienced before, right? So I think part of
the problem is that we don't actually have like, a solid, stable set of vocabulary to discuss what
these things actually mean yet, because we've never observed anything fully like
it. And so I think it's possible to say like, no, this wasn't an act of censorship. But yes,
it is troubling how much power these companies have over public discourse right now.
I would absolutely agree with that, though, you know, like, I think it's completely fair to say
that this isn't censorship, but at the same time, be worried that these incredibly powerful companies have a lot of power and we don't even completely understand the way that that power works.
And so I think that's a really big issue.
In your work as well, you talk about the sexism and the racism that is in Silicon Valley itself, right? That is within these companies themselves.
How do you think that has affected the way that they have responded to these things over the course of a number of years now? Obviously, they have taken action in the past couple weeks,
but for a long time, they kept saying that they were a neutral platform, that they weren't going
to do anything because they had to let free speech play out or whatever. What do you make of that?
Yeah, totally.
I mean, I think white supremacy in kind of the broadest sense, it manifests in Silicon
Valley in a lot of ways.
One of the main kind of most baked in naturalized ways is the kind of deeply libertarian approach
that went into building both the internet more broadly and social media platforms explicitly. So there are a
lot of different ways to think about free speech. And I'm no free speech scholar. So I won't claim
expertise. But my understanding is that, you know, the libertarian view of free speech is the one
that says there should be absolutely no constraints on what any individual says, right? Whereas there are other approaches that focus on, okay, how can we actually foster
the most healthy conversation, the most speech for most in the group, right?
Silicon Valley historically chose the first option. And what that doesn't account for is
all of the voices of marginalized people who kind of automatically get silenced through bullying or
harassment or all of these different things. So I think you have that element of it that going into
this, there were actually like deeply held libertarian beliefs that shaped the way that
these companies approach these issues. Of course, conveniently, those libertarian beliefs like
dovetail with their business interests quite nicely. Because, first of all, you know, it can help keep government off their backs. And they've had a very
cushy setup legally for a while now. Also, it means that they don't have to invest as much
in expensive content moderation systems and all of that. So there's that kind of baked in piece.
And the reason I put that under the umbrella of white supremacy is because what we've found is, in those systems, often what ends up happening is the voices of the people who are already powerful offline are the ones that get kind of amplified and reproduced online, with some notable exceptions, of course. But then you also have individuals within Silicon Valley who somewhat adhere to these ideas and believe in a lot of the same ideas that are being espoused on the platforms.
And so it permeates the culture in a lot of ways.
And I do think that makes it kind of an especially big challenge.
You know, not only those individuals, but also just kind of, like, the lack of, I hate the word diversity, because I know it does flatten a lot of stuff, but, you know, the lack of a diverse workforce, and the lack of ability for a widespread workers' movement to be built. And actually, one kind of exciting development is that we're starting to see that shift a little bit. So in fact, when people talk about someone like Jack Dorsey having
the ability to remove Trump at his own whim, yes, that's true to a certain extent, but also it
erases the fact that when he did remove Trump, it was after a huge amount of Twitter employees
signed a letter kind of demanding that he be taken down. And so I think all of those factors play a role as well.
As you say, I think we can't forget the history, right? And the kind of logics that were built into
these infrastructures, both digital and physical, from the very beginning, because those things do
kind of inform what comes out of it. But I do want to return to the point that you
made about them being media companies, right? Because for the past decade or so, they have been
pushing it into our heads that they are not media companies. They shouldn't be treated like media
companies. We've been using the language that they created and gave us to discuss them, right?
And so if we kind of change how we see them and
recognize that they are media companies, maybe not in the same way that CNN is a media company,
but are still media companies in a way, how does that force us to rethink how they operate and the
responsibilities that they have to, I guess, the broader society and ensuring that
their platform isn't hurting the society in general, I guess? Yeah, such a good question.
And I think, sadly, I don't have the perfect answer to solve everything. But I think part
of that is a broader issue, which is that we've lost the thread more broadly around what the roles and responsibilities of media as a whole are.
And so I think, you know, you still get it to a certain extent with kind of print publications, but cable news and just TV news more generally, in many ways, you know, there still are lots of journalistic norms within that space. But also, it has become
entertainment as much as news. And that's almost a cliche to say at this point. But really, what
that goes back to is deregulation of the media that happened in the 1980s and the 1990s. So
again, going back to history. But you know, what happened is historically, the media and the government have kind of been deeply entwined in their relationship. So not only in terms of the government regulating the media to make sure that, you know, there's kind of public interest at stake and all of these things. But also, actually, the government has like, for a long time been a subsidizer of media from the very beginning.
And so media, as we know it, wouldn't exist without the government.
And I think that's really what has gotten lost kind of in the past couple of decades.
And perhaps in some ways, the conversation around social media platforms is kind of reviving that conversation in a new light.
So maybe that actually is a silver lining that we can start to think about these things again. I don't see ourselves being in a political situation soon
where there would be mass support for, you know, re-regulating the media. Nor do I pretend to be a policy expert, you know; I don't know what that would look like or anything. But just looking historically, we can see kind of how that trend of deregulation,
I think, also kind of got us away from that idea that the media do have this responsibility to
kind of inform the public and aren't just in it for profits.
No, I absolutely agree with that. And I've looked at deregulation in a slightly different way,
focusing less on news and more on like film, television, what's happening there, or what has happened there. And so I completely agree with you.
And to end, I think I'd like to get your perspective on what you think happens next,
where you think we go from here, because we've talked about the history and how we kind of
arrived at this place. We talked about how these companies have been reluctant to do anything about these platforms. And now, in the past few weeks, they seem to be ready to take some action, maybe inspired by the fact that the Democrats are about to retake power and they need to do something if they want to try to avoid regulation. But when we discuss action being taken on these major social media platforms, Facebook, YouTube, Twitter,
things like that, your research shows us that it's not just about algorithms. We can't just say it's
the algorithm that is radicalizing people, that the issue is much deeper than that. But at the
same time, turning those things off just isn't going to end radicalization. Because when people
leave these platforms, they go to WhatsApp or Telegram or
these places that, you know, do not have the algorithmic amplification, but also aren't so
easy to track and to see the way that these things are spreading. Right. So, you know, I know that's
a lot of pieces, but I wonder if, you know, there are any aspects of it that you think are going to
be important moving forward or that you'll be watching? I will try to shy away from making any actual predictions because I'm very bad at that. But
I do think that there are a couple of trends that I'll point to. One, absolutely, is I think that
it will be interesting to see the shift that will happen now that Democrats will be in power.
And I actually think that the Trump ban also was as
much a signal towards that as towards anything else. Of course, the storming of the Capitol and
Trump's role in that was worse than anything that had come before in terms of, you know,
his abuse of the platform. But also, they had kind of four years to make this choice or even longer
than that during his campaign and did it kind of a couple of weeks before he'll presumably be out of power.
So anyways, I think that it at least raises an interesting question.
So, you know, it will be interesting to see the extent to which, you know, tech platforms do or do not kind of shift the way they approach these things to fit better into a democratic model of how Democrats
think these platforms should run. I also think that kind of the MAGA movement is not going away
anytime soon. And in fact, part of what we've seen pretty clearly throughout the past couple of
months is even when Trump remains silent, which he has done more in these past couple of months than he has previously, you know, he'll dip in and out. But the movement
doesn't always need him there to survive. In a lot of ways, it's built around him,
but it also transcends him. De-platforming or making these kind of, you know, surface-level changes on the platforms, that doesn't help de-radicalize, it doesn't help kind of change people's viewpoints about larger institutions.
And so I guess my main thing is that I still think it's worth kind of studying and keeping
our eye on that space, unfortunately, and recognizing that, you know, it may be the
end for Trump, but it's not the end for kind of
this movement. Although hopefully, we can move towards a better future eventually.
It's not going to be kind of, you know, flipping a switch and suddenly all the problems are gone.
Yeah, I completely agree. But I still think, you know, it's important to have these insights to
know what to be able to pay attention to and to be able to dig in and try to identify the issues so then we can address them to try to build something
better, right? Absolutely. Becca, I really appreciate you taking the time to chat, to discuss your
research and to give your insights on everything that's been going on. Thanks so much for taking
the time. Thank you so much. This was great. Appreciate it. Becca Lewis is a PhD candidate in communication at Stanford University, and you can find links to
her report, Alternative Influence, Broadcasting the Reactionary Right on YouTube, and some of
her recent articles in the show notes. You can also follow her on Twitter at @beccalew. You can follow me at @parismarx, and you can follow the show at @techwontsaveus. Tech Won't Save Us is part of the Harbinger Media Network, a group of left-wing podcasts, and you can find out more at harbingermedianetwork.com. Thanks for listening.