Front Burner - How far right influencers thrive on YouTube
Episode Date: March 22, 2019
The Christchurch mosque shooter formed his radical views online. Today, an examination of how far right communities spread their toxic messages on the Internet and how they use YouTube to do it....
Transcript
Hey there, I'm Kathleen Goltar and I have a confession to make. I am a true crime fanatic.
I devour books and films and most of all true crime podcasts. But sometimes I just want to
know more. I want to go deeper. And that's where my podcast Crime Story comes in. Every week I go
behind the scenes with the creators of the best in true crime. I chat with the host of Scamanda, Teacher's Pet, Bone Valley,
the list goes on. For the insider scoop, find Crime Story in your podcast app.
This is a CBC Podcast.
CPA 21, how do you read?
I really want to know what happened, and it makes me extraordinarily angry that it's always been a big secret.
Uncover Bomb on Board. Investigating the biggest unsolved mass murder in Canada.
CP Flight 21.
Get the Uncover podcast for free on Apple Podcasts and Google Podcasts.
Available now.
Hello, I'm Jamie Poisson.
It's been a week now since 50 people were killed in attacks on two mosques in Christchurch, New Zealand.
And since then, a lot has been said about how the white supremacist terrorist who carried out those murders was a product of the internet.
Under his name online is a racist manifesto claiming that white Christians
are under threat from other religions and races.
And then mixed in there are references that people who maybe are on 8chan
would recognize as memes.
What's most striking is his clear desperation to be adored by the people on those message boards.
He said himself that he was radicalized online.
We talked about this on Monday.
But today, I want to go a bit deeper.
How do people become radicalized online?
Where do they find a far-right community?
How have far-right figures adapted the language and tools of social media influencers to spread their toxic ideas?
And what role does YouTube play in all of this?
That's today on FrontBurner. In a few minutes, I'll be talking to Zeynep Tufekci.
You might have seen her TED Talk about YouTube, which she calls the Great Radicalizer.
But first, Becca Lewis.
Becca spent hundreds of hours watching content made by 65 different people.
Some of them were pretty mainstream. Others were not.
Anti-Semites, white supremacists, anti-feminists,
people worried about the collapse of Western civilization.
And she ended up writing a big report
on something called the Alternative Influence Network.
Becca, thanks so much for joining us today.
Thanks for having me.
So as I mentioned, you study online political subcultures
and you
wrote about the extreme right's Alternative Influence Network, as you call it. Can you
tell me about this network? What is an alternative influence network? Yeah, absolutely. So I've been
studying far-right subcultures for a few years now. And initially, I was mostly researching with a couple of my colleagues these anonymous spaces like forums, 4chan, 8chan.
They've been discussed in the wake of the New Zealand shooting.
A user in a chat room on the fringe website 8chan posted a link to the gunman's Facebook page, where the live stream was published.
4chan, 8chan are picking up on this and sharing the video, and, as I say, in many cases endorsement of this individual. But through that research we started to find
that a lot of propaganda was getting spread kind of out in the open by these
more well-known figures and it was happening on Twitter happening on
Facebook, but I was surprised by how much it was happening on YouTube. And what we found was there's this major celebrity culture on YouTube. It's a super popular platform for particularly
young people to get entertainment and news. Massive audience between 18 and 34.
Absolutely. I think something like 94% of people 18 to 24 end up watching it in the United States. And so I assume
it's similar numbers for Canada. But what I found was that all of these influencers on YouTube kind
of have all these strategies for connecting with their audiences seeming super authentic and, you know, interacting directly
with them, broadcasting from their bedrooms, telling these heartfelt stories. And that's kind
of natural to the YouTube culture. But what I found was all of these far right YouTubers were
adopting the same strategies to essentially deliver racist and sexist messaging to their audience, but do it in this way that
seemed really authentic, heartfelt, appealing and emotional. And the other thing that I noticed them
doing was another aspect of this YouTube culture, which is they were all collaborating on content
together. And so what you got was the social network that emerged, where if you have
one influencer that you're a big fan of, and they host someone else on their show, and suggest that
you go check them out, that's automatically a recommendation. And it's kind of doing some of
the work that the YouTube algorithm does for you as well. And so what I've ended up finding with this network that I was researching
was you could watch these fairly mainstream conservative and libertarian influencers,
but they were hosting people that were maybe slightly more extreme than them.
And some people would, you know, even they would host a range of different people of different
political viewpoints, but they would include some slightly more extreme conservative viewpoints in their channels.
And so it was really easy to start going down a rabbit hole for any viewers watching that content.
People say that this is kind of a natural aspect of interviewing.
You interview
people that have different viewpoints than you. Right. And sometimes you push back in order to
either knock their arguments down or they hold them up. Exactly. But what I found in this network
was more often than not, people weren't pushing back at all. Interesting. Can you give me an
example of that? When you say people weren't pushing back, what are we talking about here?
Yeah, absolutely. So one of the most troubling examples I saw that I wrote about in my report was a fairly mainstream libertarian
YouTube talk show host named Dave Rubin. That's exactly what we do here. And this is the Rubin
report. And he actually doesn't even refer to himself as libertarian. He calls himself classical
liberal. But he hosts a wide
range of different guests. And one of the people that he hosted was a man named Stefan Molyneux,
who is Canadian and a talk show host. You know, a few years ago, he was mostly making libertarian
content. Within the past few months, he actually has kind of embraced full-on white nationalism.
So he's been radicalized over the course of the past couple of years.
But Dave Rubin had him on a while ago.
Stefan Molyneux, welcome to The Rubin Report.
Thank you. Great pleasure to be here. It's been a while.
It has been a while. I've been on your show a couple times.
During the course of this interview, Stefan Molyneux said that black people's brains were smaller than white people's brains.
And he said it as if it was scientific fact, which, of course, it's absolutely not.
That's something called scientific racism. It's racism kind of masquerading as science.
But Dave Rubin didn't push back at all in the interview.
He continued to go along with it and, you know, just continued to ask more questions. And what often happens at the end of these interviews is the person hosting it on their
channel will recommend, you know, go check out this person that I hosted. Here's the link to
their YouTube page. Right. So then it's sort of like a promotional thing. All right. For more on
Stefan, you can find him right here on YouTube. It's YouTube dot com slash. You know, I'm not trying
to necessarily call out any individual people as saying that they're particularly nefarious.
But I think when you have a giant network where this is all happening time and time again,
you get these pathways created where there's this potential for radicalization.
And so I understand what you're saying there, this idea that there are people who will come
to Rubin, who is known as this classical liberal or libertarian guy, even though he doesn't call
himself a libertarian guy. I think that is primarily what he's known for. And that if
they come to Rubin and they're exposed to Molyneux, they may then go and seek out Molyneux. I think that you've made clear
for me why this could be helpful for Molyneux and that it would get him more viewers or more
people sort of curious about his ideas. But how does this benefit someone like Dave Rubin?
Well, sometimes for the content creators in this world, they're expected to create
a huge amount of content. And so sometimes they're just looking for more people to fill in their
space. You know, they're looking for guests and they want someone to come on. But even more than
that, they're operating within an attention economy where their potential advertising revenue that they can get,
the way that they would make money off of this usually, is by getting as many views as possible on a video that they post.
And then if they get enough views, YouTube will place an advertisement in front of the video and they get a cut of that advertising.
And so people all over YouTube are competing for
clicks, they're competing for views, and a lot of times controversy sells. And so it can actually
be really profitable for YouTubers to invite on people that, you know, are quote-unquote
controversial. And a lot of times what that means is that they say racist and sexist things.
And I just want to get a sense of the scope of this alternate influence network.
How many people are we talking about here?
And how different are their ideologies?
In my report, I ended up finding 65 influencers in the course of this network. And I got to that by starting with an original account that I was
looking at, and that was Dave Rubin. And then looking at everyone who had been a guest on his
channel and everyone who had been a guest on their channel and so on. And for everyone that appeared
alongside at least four other influencers,
that was what kind of bounded the network.
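[Editor's note: to make the bounding method concrete, here is a minimal sketch in Python of how a guest-appearance network could be limited to people who appear alongside at least four other influencers. The channel names, data, and helper functions are hypothetical illustrations, not Lewis's actual dataset or pipeline.]

```python
from collections import defaultdict

# Hypothetical host -> guests data; the real network was hand-collected
# from YouTube collaborations, not scraped by this script.
appearances = {
    "seed_channel": ["influencer_a", "influencer_b", "influencer_c", "influencer_d"],
    "influencer_a": ["influencer_b", "influencer_e"],
    "influencer_b": ["influencer_c", "influencer_e", "influencer_f"],
}

def collaborators(appearances):
    """Map each person to the set of distinct people they appeared with."""
    links = defaultdict(set)
    for host, guests in appearances.items():
        for guest in guests:
            links[host].add(guest)
            links[guest].add(host)
    return links

def bound_network(appearances, threshold=4):
    """Keep only people who appeared alongside at least `threshold` others."""
    return {person for person, others in collaborators(appearances).items()
            if len(others) >= threshold}

print(bound_network(appearances))
```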
So when I then looked at who I was left with in that network,
it was a real hodgepodge of different ideologies and political viewpoints.
There were open white nationalists and white supremacists in there.
So Richard Spencer is in there,
a couple of other people that, you know, outright will, you know, say anti-Semitic conspiracy theories, use the language of white supremacy, and in some cases, even identify themselves as such.
But it also includes extremely mainstream people who have, you know, mainstream libertarian, conservative, and sometimes even like socially liberal views. And so it was really hard to categorize, and I was careful not to categorize this in any specific way, other than to say that the one overarching similarity that I could find in political viewpoints was that they were all
opposed to this idea that they called, you know, social justice, or particularly they would
refer to social justice warriors. With a social justice warrior. The social justice warrior
mentality. Emotional reactionary in a knee-jerk social justice warrior fashion. Right. Who they
believe are, you know, they use that term to refer
to everyone from feminists to the Black Lives Matter movement, to Islamic immigrants, to the
LGBTQ movement. It kind of is this catch all term to refer to this broad swath of progressive
political movements. And so the one thing that I ended up saying about this group is that they all have reactionary politics, because they're fighting
for the status quo when it comes to these progressive issues. And then the argument
here is that it has like a cyclical effect. So they go on, they watch the YouTube channel of
one of these people, they talk about how they feel under siege.
You know, then people feel under siege and it just kind of goes around and around.
That's exactly right.
And, you know, part of the nature of YouTube culture
is that influencers speak directly with their audiences.
And that's built in technically to the platform.
So when influencers post videos,
then the audience can respond in the
comments section, and then the influencer can respond to them in the comments section and so on.
And a lot of influencers will talk about how they take audience feedback into account when
they're making content. So you actually see this system where it's not only influencers
radicalizing their audience,
but sometimes when influencers have found an audience that likes their content and also
kind of demands similar content or even more extremist content, there's actually an incentive
for these influencers to keep making more and more extreme content.
And I know this is really strange to say, but, you know, watching some of these videos, it's easy to lose sight of the violent undertones of what influencers are talking about. A lot of times they'll phrase things in terms of deeply personal stories, the way that a fashion influencer will make a product testimonial or a beauty influencer will
do the same thing talking about how a skincare product made their life so much better and will
tell a personal story about that. Even my boyfriend couldn't stop complimenting my skin. He's like,
what do you do different? Your skin just looks so fresh and just so dewy and supple. And I'm like,
yes, thank you. Here we have influencers who talk about how they used to be liberal or left leaning, and how now that they have come to see the world through a conservative or even white nationalist standpoint, their life is so much better. They can see things so much more clearly. And they talk about
it in this deeply personal sense that frankly is, you know, a lot of people have talked about like a
disinformation crisis and how we need to be more careful about fact checking. But these are people
telling personal stories. And so it's impossible to fact check in a lot of ways. And it really does
a good job of making these issues seem like they're natural, like they're nonviolent, like they're just something that happens on one's personal journey.
Becca, thank you so much.
Thanks for having me.
Becca Lewis's report for Data and Society is called Alternative Influence, Broadcasting the Reactionary Right on YouTube.
You can find it online.
Okay, let's talk more about YouTube itself.
Zeynep Tufekci has written about how she believes YouTube is the great radicalizer.
Zeynep is an associate professor at the University of North Carolina.
She studies big data and algorithmic decision making. Zeynep, how are you today?
I am fine. Thank you for inviting me.
So at the top of the show, we talked with Becca Lewis, and I know that you're familiar with her
work on this alternative influence network. And during our discussion, we talked about YouTube
and the role of algorithms,
but we did sort of a drive-by. So I'm hoping that we can dive into that with you now. Me,
personally, I've had this experience on YouTube where I type in things like jogging.
And in this video, I'm going to show you five simple tips that you can start to implement today
to allow you to have proper running form. And then after a few minutes,
sort of, I don't even know how it happens.
I'm watching these Ironman competitions.
In a few days, Nike is staging a race in Monza, Italy,
where competitors will be attempting one of the greatest feats of athleticism in history,
a sub-two-hour marathon.
And, you know, can you explain to me what's happening there?
So if you go and start watching something on YouTube,
on the right-hand side of your screen are these recommended videos that are listed as up next.
And they're set to autoplay, they're going to play automatically. And they are chosen from all the
many videos that are available on YouTube by an artificial intelligence engine, right? It's one
of the best in the world. It's Google's people, so they're really good at this.
And what it's doing is,
in accordance with YouTube's business model,
which is to keep you on the site as long as possible
and then serve you ads,
it's trying to engage you, engage your attention.
And it appears that the algorithm has discovered a vulnerability in
humans. We're attracted very often to edgier stuff, right? It's kind of like you can't stop looking at a
traffic accident. But what they're doing is not just sort of terrible stuff, but it's also edgier
stuff or conspiracy theories or things that appear to explain the world or its secrets to
you. Now to you and me, that might seem like, oh, that's terrible. That's so stupid. And we
might just sort of click and move on. But for a lot of young people, it's interesting, right?
Somebody is promising to reveal secrets to you. Somebody's promising to tell you things that they're saying adults aren't telling you, or somebody's promising answers. And, you know, stuff that's made up,
it's usually more interesting than reality. The reality might be mundane. So the algorithm,
the recommender algorithm, kind of picks stuff that gets edgier and edgier. So what you were saying about how you start watching jogging videos, and all of a sudden it's showing you, you know, Ironman stuff: you want to stay someplace mild and just sort of see stuff, but it just might keep pulling you in this direction. Yes, because it might mean that you're just sort of going to look up a few things about, you know, how to correct your stride, and then you're going to go back out running. Well, that's not what YouTube wants you to do. YouTube wants you to sit there
and watch and watch and watch. So they're trying to be as engaging and as interesting as possible.
And that's not a healthy thing most of the time, because what's engaging, as we discussed, what
keeps my eyes glued to the screen might not be the healthiest thing.
So it's that recommendation algorithm's push to make money for YouTube that's causing this.
And how does this algorithm work? I know it's a proprietary algorithm, so we don't really know exactly how it works, but what do we know about how it's working logistically?
So what we know is that it's a machine learning program,
which means that it's not like Google's engineers sat down and said, you know, let's destabilize the world.
Because, I mean, make no mistake, having billions of people
watch increasingly edgier stuff or hate speech or amplify terrible things to them is destabilizing
the world. So it's not like the Google engineer said, all right, algorithm, go destabilize the
world. What they did was feed it a lot of data and said, keep people on the site. And this is a
consequence of trying to keep people on the site. So imagine
a cafeteria. And if the cafeteria's business model wasn't to feed you and feed you good food,
but to keep you in the cafeteria as long as possible, it would be giving you, you know,
sugary, salty stuff, and then it would give you more. And then, you know, the moment you stopped
eating your plate, and you're like, Oh, I'm done, I'll leave now.
It'd be like, here's dessert.
And it would try to sort of up the ante, so to speak, to keep you in the cafeteria.
YouTube is kind of like that.
It's trying to keep you there.
Right.
And that's what's programmed into it.
So it's not like the programmers can individually pick, you know, a billion videos to recommend to a billion people.
But what they can do is use these sort of data-driven, big data-driven techniques like machine learning to unleash a way in which they can figure out exactly what will maybe keep you on the site and try it.
And if it works, they'll sort of keep going at it.
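[Editor's note: a toy sketch, in Python, of the engagement-driven feedback loop Tufekci describes: a recommender that ranks "up next" candidates purely by predicted watch time and keeps serving whatever holds attention longest. The video names, numbers, and functions are invented; YouTube's real system is proprietary.]

```python
import random

# Hypothetical candidates with made-up average minutes watched.
candidate_videos = {
    "how_to_fix_your_stride": 4.0,
    "marathon_training_tips": 6.5,
    "sub_two_hour_marathon_doc": 12.0,
}

def predicted_watch_time(video):
    """Stand-in for a learned model: predicted minutes a viewer will stay."""
    return candidate_videos[video] + random.uniform(-1, 1)

def pick_up_next(candidates):
    """Choose the 'up next' video expected to keep the viewer watching longest."""
    return max(candidates, key=predicted_watch_time)

def session(start_video, steps=3):
    """Autoplay loop: whatever is predicted to hold attention gets recommended next."""
    watched = [start_video]
    for _ in range(steps):
        watched.append(pick_up_next(candidate_videos))
    return watched

print(session("how_to_fix_your_stride"))
```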
And they're hoping you'll just sort of stay there in other places, right?
It's not even like you have a choice.
You can't even block a channel.
Like if you're a parent, your kid is almost certainly going to encounter white supremacist stuff.
He's going to encounter misogynist stuff,
hate speech, misinformation,
conspiracy theories, completely baseless.
And you don't have, as a parent,
any control that says block this channel from being recommended.
Like it's, their business model is so overwhelming
that they don't even give the most basic controls.
If it's tempting you with stuff you don't want to watch, you just want to block it. You can't even do that. So that's just what's driving it. It's greed.
You've talked about sort of conspiracy theories and more extreme content.
Can we talk a little bit more about how this is manifesting itself around political content, particularly right-wing content?
Right. So Becca Lewis, whom you've spoken to, has documented how a reactionary network of ideologues has taken advantage of YouTube's architecture.
They very strategically filled YouTube with the kind of things that will get recommended
and that will reach people, and they filled that ecosystem with that.
So that's how it works.
It's important to note that the algorithm would also work with other kinds of content. It
would work with more left-wing content and recommend edgier, more extreme stuff too. It's
just that at the moment, historically, it's the right-wing and especially not just regular right-wing,
I'm talking about like reactionary, white supremacist, misogynist, so-called alt-right, that's really populated this ecosystem very effectively.
I'm interested to hear your perspective here, because from where I'm sitting,
YouTube hasn't taken the kind of criticism that some of the other tech companies have.
Facebook is the obvious one. Would you agree
with that? And if so, why? Like, why is YouTube not in the crosshairs like Facebook and maybe to
a lesser extent Twitter? I think a lot more parents are on Facebook than on YouTube. They don't really
watch YouTube the way the young people use YouTube as a search engine, as an entertainment place,
as a place to comment on. So they're kind of unaware. They think their kid is just watching
some videos. Journalists are mostly on Twitter, so they're not really using YouTube. So the kind
of people who tend to talk about this stuff aren't really heavy consumers of the platform the same way. So they're not really noticing what's happening
right under all of our collective noses,
because this isn't new.
The recommendation engine in YouTube,
switching to the best AI Google had,
happened around 2015.
And that's when the problem started, right?
They had this potent new tool for keeping people engaged.
And that's
what it evolved into. Would there be a reason other than a moral reason
for Google and YouTube to do something here? Well, I don't know if there was a very moral reason for
paint producers to not put lead in the paint, right? Because lead is a great ingredient in
paint. It's anti-corrosive, protects against moisture. It's very useful. But we as a society
said, you know what, the moral cost of this is too high. Lead is a neurotoxin. It ruins children's
brains. So we banned it. So I'm kind of not as much interested in Google's moral calculations,
because we've been talking about this for years.
And they slowly move if they lose advertisers.
And some people in the company, they may be very nice people and they may be trying to move things.
But the reality is as long as they're printing money this much, and they are, they're very, very rich because of this.
It's hard to see them acting without some, let's say, helpful nudging that we say this is not OK because there's a lot of things they could do.
For one thing, it's not like written in the Ten Commandments that there has to be a recommendation engine, that things have to be autoplayed, that the content has to be more and more engaging.
And none of these are absolutely must-happen kinds of things.
Just a few years ago, we didn't have most of this.
And so there are so many things where "should we do this?" and "should we allow this?" are very good questions.
And I think that's for us to decide rather than just hoping that the better angels of the people who work at
Google will somehow win over. Because historically speaking, that's usually not how businesses clean
up their act. What's really important is that we, as a society, get a handle on this historic
transition in our information ecology
and try to figure out how do we make it healthier for all of us.
Zeynep, thank you so much.
No, my pleasure.
Just a note to say that we reached out to YouTube for comment on this story.
In a statement, the company said, Hate speech and content that promotes violence has no place on their site.
The statement went on to say that more than 8.8 million videos and 262 million comments had been removed for violating policies during a three-month period alone last year.
For videos that do not violate the hate speech policies, YouTube said it applies a set of restrictions which include removing the video from recommendations.
The company is also reducing recommendations on, quote, borderline content that could misinform users in harmful ways.
That's all for today.
FrontBurner comes to you from CBC News and CBC Podcasts.
This week, the show was produced by Aisha Barmania, Chris Berube, Elaine Chao, Shannon Higgins, Nahid Mustafa, and Abby Pletter.
Derek Vanderwyk does our sound design.
Our music is by Joseph Shabason of Boombox Sound.
The executive producer of FrontBurner is Nick McCabe-Lokos.
And I'm your host,
Jamie Poisson.
Thanks for listening.
For more CBC Podcasts,
go to cbc.ca slash podcasts.
It's 2011
and the Arab Spring is raging.
A lesbian activist in Syria starts a blog.
She names it Gay Girl in Damascus.
Am I crazy? Maybe.
As her profile grows, so does the danger.
The object of the email was, please read this while sitting down.
It's like a genie came out of the bottle and you can't put it back.
Gay Girl Gone. Available now.