Big Technology Podcast - Does YouTube Radicalize? A Debate Between NYT's Kevin Roose and Software Engineer Mark Ledwich
Episode Date: January 6, 2021

In June 2019, New York Times reporter Kevin Roose wrote The Making of a YouTube Radical, a story about how a 26-year-old man, Caleb Cain, was radicalized through YouTube. For the story, Roose examined... Cain's entire YouTube history, and plotted the path he took toward radicalization. Software engineer and researcher Mark Ledwich took issue with the story, citing his own research and claiming the notion that YouTube could radicalize was a myth. Instead of yelling at — and past — each other, Ledwich and Roose came together for a moderated debate on the Big Technology Podcast, where both stated their points of view, got a chance to respond to each other's points, and ask each other questions.

Further reading:
Kevin's story: https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html
Rabbit Hole podcast: https://www.nytimes.com/2020/04/22/podcasts/rabbit-hole-prologue.html
Ledwich's story: https://mark-ledwich.medium.com/youtube-radicalization-an-authoritative-saucy-story-28f73953ed17
Ledwich's research visualization: https://recfluence.net/
Transcript
Hello and welcome to the big technology podcast, a show for cool-headed, nuanced conversation of the tech world and beyond.
And today we're trying something new. We're going to have a debate that's going to discuss whether YouTube is indeed radicalizing its users.
Joining us to discuss it on the con side is Mark Ledwich, who's a software engineer who's done some data engineering about this stuff.
And on the pro side, I feel like it's fun to put you in this category, is Kevin Roose of the New York Times,
who's written a widely shared and very interesting story about someone who did indeed get radicalized on YouTube and then produced a very popular podcast about it called Rabbit Hole.
Right. Yeah, my background's in software, and I've heard a lot of the reporting
and research in this area. And I felt like there was a gap: no one was just monitoring
exactly what YouTube was doing. So I took a sabbatical. You went ahead and did
it. And went and did it, yeah. And now I'm just trying to disseminate the information that I'm finding
and bring attention to it. Okay, great. Well, I appreciate you joining the show. And,
you know, being willing to talk about it here, not only that, but being able to discuss it
with someone who's going to be critical of your work, which I appreciate. And so, Kevin,
you're at the New York Times? Correct. And I think a good way for us to begin this is to
start with a bit of a recap of the story that started it all. And Kevin, I'd love it if you could tell us
a little bit of a recap of what you talk about in Rabbit Hole. And by the way, folks, if you
haven't listened to Rabbit Hole, I highly recommend it. And maybe this discussion will inspire you
to do it with the added context. But Kevin, can you introduce us to the guy that you met, Caleb,
and how he became radicalized through YouTube? Sure. So I've been looking into online
extremism and radicalization for a number of years. And after the 2019 shooting in Christchurch,
New Zealand. That was the one where it was, you know, streamed on Facebook and posted about on
8chan. And, you know, the shooter had this very, like, online manifesto. It was awful. And it was,
you know, he referred to PewDiePie, and that sort of catalyzed for me what
became the next sort of year of reporting where I was really trying to answer the question,
like, how does this happen? How do people
encounter extremist views online? What role do platforms play in introducing them to new
and compelling extremist voices, and sort of what are the forces that power that process?
So I was really looking for a case study, someone who would sort of let me talk to me
about their journey, their process, and so I started looking around and eventually found
this guy, Caleb Cain. He was 26. He's probably now 28. He was from Virginia or West Virginia,
and he had a really interesting story. He basically was, you know, an Obama-supporting liberal,
dropped out of college, was having some real troubles in his personal life. Not a lot was sort of
going right for him. Started looking on YouTube for self-help videos, things that might help
him feel better, and stumbled onto this network of creators, including people like Stefan
Molyneux, who really helped him feel better. They sort of, you know, had videos about self-confidence
and, you know, meeting people and getting a job and just sort of almost like life coaching.
And so he started watching these videos and started watching other videos that were recommended
from those videos. And he ultimately became pretty far right. I mean,
he considers himself alt-light, not fully alt-right.
That's how he sort of characterizes it now.
But at the sort of base level, he said that he was radicalized into the far right
through his YouTube recommendations and eventually started agreeing with many of the
sentiments of people like the Christchurch shooter.
And then actually came out of that, was sort of de-radicalized from watching videos
by this other cohort of YouTubers known as Breadtube,
who are sort of the left wing,
they do a lot of sort of counter-programming
of the kind of alt-right section of YouTube.
And so he eventually sort of got out of it
and started making videos and talking about
how he had gotten out of it.
And I was just fascinated.
I was really interested in meeting him,
so I went to West Virginia,
talked to him for days.
And then he provided me with his entire YouTube history.
So 12,000 videos
spanning four years. I could see it was just a big, giant HTML document. And so you're able to
download the whole thing? Exactly, yeah. There's this Google Takeout function where you can
download all your data. So he just downloaded his whole YouTube history and sent it to me, and so we were
able to kind of go through that and kind of retrace his journey and all the people that he met
along the way. And so that became the story, The Making of a YouTube Radical, that Caleb
is featured in. And it's about him and his journey, but it's also about what was going on
at YouTube at the time and the way that those things interacted and the way that the changes
that YouTube was making around its algorithm, you know, reprogramming it around deep neural
networks, changing some of the key metrics that it was optimizing for, changing from clicks
to watch time, how all of those factors sort of helped create the environment in which
Caleb was radicalized. So that's the story. Right. And in your reporting, and we'll get to Mark
in a second, but in your reporting, did you ever put a finger on how many Calebs there were
and whether this was a one-person problem, or you were tracing the route that the Christchurch shooter
could have taken? Or was this something that was a larger issue happening, not just
to one or two people, but far more than that.
Well, I should say at the time I started reporting it,
I had already heard a bunch of stories from people.
I was doing things like looking through Unicorn Riot.
Unicorn Riot is this group that is sort of a left-wing sort of counter-extremism group
that, you know, has sort of obtained and leaked a bunch of Discord chats
and other communications from far-right groups.
And a lot of the times, if you read through those things,
it would be people who are, you know, currently pretty far to the right.
Maybe they're white nationalists.
And they would say, like, I got into this through YouTube.
So you would see these sort of testimonials of how people encounter these ideas.
And since the story in the podcast came out, I mean, I've heard from thousands of people
who have similar stories to Caleb's or whose family members or friends do.
So it's impossible to quantify, but like this is not a single person problem.
And this is, you know, a problem that was big enough that YouTube felt it had to address it through sort of changing its policies on white nationalism, hate speech, borderline content, and also changing its recommendation algorithm.
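As an aside on the Google Takeout export Kevin described a moment ago: below is a minimal sketch, not his actual code, of how one might parse such a watch-history HTML file to retrace a viewing history. It assumes a single "watch-history.html" file containing one link per watched video; the real export format changes over time, so treat this as illustrative only.

```python
# Sketch: retrace a YouTube watch history exported via Google Takeout.
# Assumes "watch-history.html" sits in the working directory and that each
# watched video appears as an <a> link pointing at a /watch?v= URL.
from collections import Counter
from bs4 import BeautifulSoup

def load_watch_history(path="watch-history.html"):
    with open(path, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")
    videos = []
    for link in soup.find_all("a"):
        href = link.get("href", "")
        if "watch?v=" in href:
            videos.append({"title": link.get_text(strip=True), "url": href})
    return videos

history = load_watch_history()
print(f"{len(history)} watched videos found")
# Rough view of which titles recur most often across the history
for title, count in Counter(v["title"] for v in history).most_common(10):
    print(count, title)
```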
Great. And one more follow-up, and this is probably an unfair question, but here we go. When was the last time you've heard from someone that's come to you with an issue similar to Caleb's? So is this a common issue currently, or is this something that was in the past?
Oh, you mean like who is saying this happened to me last week? Or, yeah, have you had it,
have you had like someone... you said thousands of people have reached out to you, so
how recently has that happened, or has it tailed off recently? It happens almost every day.
And I would say many of the stories that I hear are about people... there seems to have been
this sort of golden era of right-wing YouTube from
about 2014 until about 2019, when they started sort of making changes to tamp down the influence
of some of these channels. And so a lot of it took place in kind of that five-year window.
But there's been other rashes of it this year. I mean, we've seen a lot of people becoming
radicalized during the pandemic. QAnon has been, you know, has grown like crazy. Some of the
adjacent sort of communities to that have become quite large. And so it's not just happening on
YouTube, it's happening all over the internet. But I think that period, this sort of 2014 to
2019 period, is the one that I was really interested in. Okay, great. Yeah, I really wanted
to establish Kevin's argument. But now, Mark, I'm going to turn it over to you. What's the
counter argument? You seem to believe that people aren't being radicalized on YouTube or aren't
being radicalized to the same extent that Kevin might be telling us. So where's your proof?
So we did a study where I've been collecting data
about YouTube's recommendations since late 2018
and we saw those changes, so some of them at least.
So we weren't collecting at the time
where it was really promoting clickbait kind of material.
When we started, it was fairly neutral
in terms of the amount of views that videos got.
Recommendations were pretty much proportional to that.
And then in early 2019, they made changes
to really clamp down on borderline content and conspiracy content.
So I think we're aligned in the way we understand what's happened there.
I think more broadly, I think the influence of recommendations in terms of someone
radicalizing is just a very small part of a larger process.
So with the Christchurch shooting, I read that report.
And there were so many other factors.
And as they say in the report, it's often a one-off, highly personalized journey.
No one model can explain the different ways people arrive at extremism.
So I feel like my analogy for recommendations would be that they're like a gentle breeze that's able to be controlled by YouTube.
But there are larger factors at play in the environment that are more like a storm
that's, you know, pushing in all sorts of directions at different times and for different
people. But the recommendations are much more gentle than that, but something that's in their
control. Okay, so unpack that a little bit. So basically what you're saying is that people do get
recommended videos and they can go down these rabbit holes, but you have to look at that as just
one factor of many. And yeah, why don't you go ahead on that one? Yeah, like it's one
factor of many, and I wouldn't describe the recommendations people get as a rabbit hole.
We found that, on average, recommendations go towards more mainstream content, at least since
early 2019.
So it may have been back then, but we don't have data on that.
And I haven't seen anyone else with good data on that either.
So we're speculating as to whether the recommendations were pushing that direction or not.
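As an aside, here is a toy sketch of the kind of proportionality check Mark describes: comparing each channel's share of recommendations to its share of views, with anything above 1 meaning the algorithm favors the channel relative to its audience. The channel names and numbers are invented for illustration; his real data and channel categories live at recfluence.net.

```python
# Toy illustration: are recommendations roughly proportional to views,
# or do they tilt toward certain kinds of channels? All figures invented.
views = {"MainstreamNews": 900_000, "IndependentCommentary": 80_000, "FringeChannel": 20_000}
recs_received = {"MainstreamNews": 1_400_000, "IndependentCommentary": 70_000, "FringeChannel": 5_000}

total_views = sum(views.values())
total_recs = sum(recs_received.values())

for channel in views:
    view_share = views[channel] / total_views
    rec_share = recs_received[channel] / total_recs
    advantage = rec_share / view_share  # >1: recommendations favor the channel; <1: disadvantaged
    print(f"{channel:24s} view share {view_share:.1%}  rec share {rec_share:.1%}  advantage {advantage:.2f}x")
```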
Okay. So, Kevin, there's two arguments here. And Mark, correct me if I'm not summarizing them right. But one is we shouldn't put too much emphasis on YouTube, given that, you know, people are living not only in YouTube, but in the world. And there's other factors in the world. And two is that, you know, these recommendations don't always take people, you know, down this, down this rabbit hole. And in fact, the term might be wrong in and of itself. So what's your response on that front?
Well, to the first argument, I mean, I agree with Mark that there are other factors.
I mean, in the story about Caleb Cain, you know, I talk about the sort of larger forces at work here.
I mean, it's, he was living in an economically depressed area.
He didn't have a lot of career prospects.
He had kind of a shitty family life.
And it was like, he did not arrive at YouTube as a blank slate.
I mean, he was coming in with a number of different, you know, personal traits that made him
especially vulnerable to this.
But I just, I don't think that we can discount YouTube's influence, especially now when all
of us are just experiencing the world through our screens.
YouTube hasn't released data in a while, but the sort of last thing we knew, as of a couple
years ago, people were watching a billion hours a day of YouTube. It is something like 15%
of all internet traffic is through YouTube. So this is not a small part of people's media
diets and especially for the group of people that I'm, you know, that I started studying,
which was kind of this, these like people who are maybe a few years younger than I am, people
who are in their, you know, teens and 20s, who really, for them, like YouTube is
media. It is culture. It is politics. Like YouTube is a sort of all-encompassing frame of reference
for everything. And so for those people, it's not just, you know, it's more than just one in a system
of inputs. It is like the primary input by a long way. And on the second point, the sort of
rabbit hole pipeline argument, I mean, I'm interested
to hear more about this from you, Mark, because I've read your study and I've read some other studies about
this, and there have been some studies that have found strong evidence for sort of the
migration of viewers from more sort of centrist and kind of alt-light videos to alt-right and
sort of hard-right content over time. Whether that's through recommendations or through other
forces, you know, is a question that probably can't be answered, except if you work at
YouTube. But I'm curious, you know, there have been some studies sort of suggesting that this
doesn't happen. Like, do you think it is possible with the data that we have that is currently
available to us as non-YouTube employees to account for and quantitatively study the effects
of radicalization on YouTube? Like, do you think that's even like an achievable goal?
I think you can do much better than we're doing now, but no, I think right now there's still a lot of room for opinion; the data that we have right now isn't definitive for the more holistic question.
Like, is YouTube, all things considered, influencing people towards extremism? Like, are they making that worse?
We don't know that yet.
But there's definitely studies coming
and that have just come out that are quite good.
So there's one recently that uses web panel data
from a representative group of Americans,
so they have real traffic,
and they're looking at actual click-throughs on recommendations
and looking at which direction they're going.
They have issues with their classifications,
but they're fixing those.
So I think that will be really good information.
The study you reference, I think that's the Auditing Radicalization Pathways study by Ribeiro and others,
where they looked at people commenting on videos and seeing over time whether they move towards the...
they classify channels as IDW, alt-lite and alt-right, and whether there was a direction towards that.
I thought that was quite clever, but they only looked at that one direction.
So they didn't look at people moving, like, from something like Stefan Molyneux to Jordan Peterson to something more centrist.
They didn't look at that direction.
So I felt like that was an example of research that is technically quite good, but the bias really affected the way that they ran it.
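As an aside on the methodological point Mark raises here, a minimal sketch of what counting commenter movement in both directions might look like: track which category of channel each user comments on over time and tally moves toward the extreme and moves back toward the mainstream, rather than only the first. The category labels follow the paper's terms; the user histories are invented for illustration.

```python
# Illustrative only: count category transitions in both directions.
from collections import Counter

# Ordered from more mainstream to more extreme, following the paper's labels.
ORDER = ["media", "IDW", "alt-lite", "alt-right"]
RANK = {c: i for i, c in enumerate(ORDER)}

user_histories = {
    "user_a": ["IDW", "alt-lite", "alt-right"],   # moves rightward
    "user_b": ["alt-lite", "IDW", "media"],       # moves back toward mainstream
    "user_c": ["media", "IDW", "IDW"],            # small rightward step, then stable
}

transitions = Counter()
for history in user_histories.values():
    for earlier, later in zip(history, history[1:]):
        if RANK[later] > RANK[earlier]:
            transitions["toward extreme"] += 1
        elif RANK[later] < RANK[earlier]:
            transitions["toward mainstream"] += 1
        else:
            transitions["no change"] += 1

# A one-directional analysis would only report the first count.
print(dict(transitions))
```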
Mark, your research itself found that a lot of the recommendations pointed people to more centrist content?
Yes, so towards mainstream content, so it's got an authoritative bias. Yeah, and that's after 2019,
that's since like April 2019. So that's the recommendations. But if you're thinking
radicalization, you do have to look at it more holistically; that's just one input. So it could be
just the existence of certain content you could blame YouTube for, or that video itself is
engaging enough that it's more likely to radicalize than, like, a book.
I don't believe those things, but yeah, that's in the question when we're thinking about
this more holistically.
Yeah.
So you would say that since these changes in 2019, would you say before the changes in 2019
they were radicalizing people, or you would say that would be an exaggeration?
And then after the changes in 2019, it's even further an exaggeration.
Yeah, I think it's an exaggeration.
Like, one thing I'd point to is that
we would expect places where people watch YouTube more for it to have an influence.
And I can see, like, stories like what Kevin wrote about Caleb Cain.
I'm not against that type of looking at this problem.
In fact, I like to do that whenever I do data analysis to look at the stories.
So you've got a real concrete version of what you're trying to analyze.
But I feel like with that kind of look, you need to look at all the different types.
So you need another person who has said they're being de-radicalized by Jordan Peterson, for example.
He says he gets a lot of people telling him that he pulled them out of the alt-right as well.
I guess one thing I think we should talk about is scale.
And I think, Kevin, you might have mentioned this in your article or on the podcast,
talking about how, even if, let's say, 99% of YouTube folks are watching it and not getting radicalized, if 1% are, then that's an issue.
So I'd like to be able to explain this, because, Mark, even if what you're saying is right, that the vast majority goes to centrist or authoritative stuff versus some of the radical stuff, then you still have that error
at the end where, you know, people end up going down that path and becoming
radicalized. So why don't we do this? Kevin, can, you know, can you riff off of that and sort of
give your take on it? And I'd like to hear Mark's response also. Yeah. I mean, the scale question...
there's a scale question, and there's a prevalence question. The prevalence argument is
one that YouTube loves to make. Facebook also loves to make this, where they say, you know, only
6% of the content on our platform is political, or, you know,
borderline content is less than 1% of all content on YouTube. And I have no reason to doubt that
that's true. I mean, we can't audit that. They don't make, you know, data like that
available to the public. But, I mean, just if you think about all the many things that people
use YouTube for, like figuring out how to fix, you know, their broken toilet or, you know,
tie a bow tie or, you know, listen to music or, I mean, there are people who only use it
for those things.
And so that's totally plausible to me
that this might be a small percentage of overall usage of YouTube.
But the denominator is so big.
I mean, if you have people watching billions,
at least a billion hours a day,
I would guess that it's increased substantially
since that figure came out.
One percent of a number that big
is still a fairly big number,
especially when you're looking at the kind of possible outcomes here.
I mean, it's not just that people are getting radicalized and then, you know, posting a bunch of memes on 4chan.
It's that they're, in some cases, going out and conducting mass shootings.
They're becoming violent.
They're, you know, engaging in sort of coordinated harassment.
Like, if it were just, you know, people getting sucked into
Bigfoot conspiracy theories or whatever, that to me doesn't really register as
a grave harm. But there's a real danger here. And I think that, you know, I'd be, I'd be
curious to hear Mark's take on the sort of scale question, because I think that even if it's true
that, you know, a small number of people relative to overall YouTube consumption are experiencing
this pipeline, I do think the existence of the pipeline is something that, you know, we need
to study and we need to get more transparency from YouTube about it.
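As an aside, the back-of-envelope version of the prevalence point being made here: even a tiny share of a very large denominator is a large absolute number. The billion-hours figure is the last publicly cited one Kevin mentions; the 1% share is his hypothetical, not a YouTube-confirmed number.

```python
# Back-of-envelope prevalence arithmetic (illustrative assumptions only).
daily_watch_hours = 1_000_000_000   # ~1 billion hours/day, per the last public figure cited
borderline_share = 0.01             # hypothetical "only 1%" of watch time

borderline_hours_per_day = daily_watch_hours * borderline_share
print(f"{borderline_hours_per_day:,.0f} hours of borderline content per day")   # 10,000,000
print(f"{borderline_hours_per_day * 365:,.0f} hours per year")                   # 3,650,000,000
```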
Yeah, I'd love to hear Mark's answer on that also.
I agree with YouTube that it's a small percent, but yeah, it's a huge platform.
And it skews young.
So I'm expecting it to grow much, much larger than it is now.
I did compare, I don't have the numbers on me.
I compared Fox News on cable versus YouTube.
And cable's still bigger
in terms of the amount of views they supposedly get.
But I think
in the next five years, we'll see
that flip.
So, but Mark, then
just like talk about the scale issue, though.
So if you're writing this paper pushing back
on the fact that journalists are saying,
journalists like Kevin are saying
that YouTube is radicalizing
and you say that there is a big scale
and maybe this is happening to a percent of folks.
Yeah, I'd like,
to hear, like, you know, isn't that something that we need to consider in this discussion?
I guess only if you're worried about it. I'm not. I just don't think it's a problem in terms
of YouTube being an influence for extreme radicalization, because I see it more... there's
a part of it where I'm doubting that that's a big factor, but I'm also doubting that it's a
YouTube-specific problem, because people are going to be watching video content
on the internet no matter what.
So let's say YouTube got rid of all right-wing content
and the extreme right everything,
there would just be another platform that people watch.
And we saw a big migration to Rumble
when the QAnon crackdown happened.
So they're getting millions of views on Rumble now.
So it's not as if this pressure to remove the content
or really change recommendations
will have a massive effect.
Okay, but let's focus on YouTube, though, because that's sort of the topic of discussion.
You know, it's one thing that people are going to go elsewhere, but I want to... I mean, you know,
we only have anecdotal data, mostly, in terms of the idea that, you know, recommendations are,
or YouTube the platform is, you know, driving people to radicalization. So far, from you, I've heard
arguments that have said, you know, they're recommending authoritative content, but not always,
that people would be radicalized, people would be radicalized no matter what, because they'd be
watching video somewhere else, and that there are other factors outside of YouTube that are,
that are causing this issue. But like, can you make your most convincing argument that, you know,
YouTube itself is not doing what Kevin has argued?
Apart from those points, I'd say that the way I look at YouTube,
the content is reflecting what people want to watch.
So I think there's an intuition that it's more radicalizing for someone that just watches mainstream.
When they come onto YouTube, it's definitely more right-wing and there's more edgy stuff.
And I think it's intuitive to think that's a rabbit hole because
it's reflecting the population's demand of content more directly than mainstream news,
although mainstream news are starting to become more like YouTube in that way.
So I can see the intuition there, but I just, I'd say the type of information that I'm looking for
doesn't exist to show that there's a radicalizing effect.
And also the over-concern about it: like, when the discussion of
Islamic radicalization comes up, a lot of the same arguments
apply to this, which is, you know, we're talking about small numbers, more people
die by other means, you know, alcoholism is more important, things like that.
And I think of those things when this comes up as well.
And so you did research, so you're, you know, you're coming to this conclusion that there's
no real issue here.
What in your research led you to believe that?
Can you talk just through that a little bit?
So my research just focused on recommendations.
So what we saw on recommendations was a mainstream influence overall.
And we're doing more study to look at what effect personalization has on that.
So that's the area of research that I've done.
So we're kind of talking about a wider question where I'm just speculating much like everyone else is about what causes radicalization.
But when you looked at recommendations,
did you find any radicalizing influence?
Like, you know, I know you talk about authoritative.
You said that YouTube steered people to authoritative sources,
but did you find any evidence of even, you know,
a smaller group of folks being steered towards more radicalizing content?
We found, no, I didn't.
I can't, no, I can't think of a channel.
Actually, right now there's one channel NTD,
which is doing well with recommendations.
And I think that's, that's, I can't remember what it stands for.
They're part of the Epoch Times Network.
It's New Tang Dynasty.
It's part of the Falun Gong media empire.
They've been paying for a lot of ads on YouTube as well.
And perhaps I'll get to that soon and the effect of the demand for content on the
election fraud conspiracy content.
I think that has a large amount to do with that.
I lost my train of thought, sorry.
I'm curious, Mark, if I can jump in with a question for Mark, like, I'm just curious.
So, Mark, I remember when your original study came out, I got a lot of people emailing it to me, you know, saying, what do you think about this?
It was sort of interpreted as kind of a, you know, a response to my story, although I know you were looking at this before.
And, you know, you seem pretty angry about the narrative of the YouTube rabbit hole.
And I think, you know, you were saying things like, you know, this is a, you know,
conspiracy that the mainstream media is using to sort of repress the influence of, you know, social media.
And, you know, this is, you know, these narratives aren't trustworthy and this is a self-serving thing.
So I guess I'm just curious, like, and I don't want to, you know, attribute
any arguments to you that you, you know, aren't comfortable making.
But I'm curious, like, what you think is behind, like, the sort of meta-narrative here,
like, why you think if this radicalization pipeline doesn't exist, and in fact, if it's
pulling people, you know, in a more mainstream direction and has been, you know, for as long
as we've had the data, like, why would people be coming out with these stories?
why would people be saying, I was radicalized by YouTube?
Why is this such a persistent thing that we keep hearing over and over again?
Like what are the incentives of the people who are doing it if it's not actually true?
I think there's an elite culture which looks down upon popular culture.
And I think this is part of it.
And it's especially true of the New York Times, which I find is a very small subculture of that.
I was listening to Ezra Klein talk to another reporter from the New York Times who said he felt he was incentivized to write articles like this, reflexively anti-tech-platform articles, and that benefited him at the New York Times.
So that's definitely my background into what I'm thinking the incentives are inside the place where you work.
And I definitely think the rabbit hole meme took on legs in a lot of places, and I just wasn't
seeing anything more than stories. And I think you get selection effects at play, so when
you tell those stories, you'll get more of them. And it's the same on the other side: if you're
more right-wing or an anti-woke YouTube channel, you also get lots of stories of people
feeling like the places that they work at are stifling free expression and things like that.
So I think when you're in the public arguing about these things, you naturally attract a certain
side that you keep hearing from, and it's hard to maintain perspective.
Okay, this is actually a great cliffhanger to take a quick break on, and then we can have
Kevin respond when we come back from the break because I've got to get our sponsor and make
sure that we can pay to get this edited. All right. We'll be back right after this.
Hey, everyone. Let me tell you about The Hustle Daily Show, a podcast filled with business,
tech news, and original stories to keep you in the loop on what's trending. More than
two million professionals read The Hustle's daily email for its irreverent and informative
takes on business and tech news. Now, they have a daily podcast called The Hustle Daily
Show, where their team of writers break down the biggest business headlines in 15
minutes or less and explain why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
Okay, we're back here on the second half of the Big Technology Podcast, a debate.
Our first debate, between Kevin Roose, an opinion writer at the New York Times, and Mark Ledwich,
a software engineer who studied YouTube, looked at the algorithm and said, you know,
there's no evidence of radicalization, at least, you know, not on behalf of YouTube.
So, Kevin, before the break, Mark was making some arguments about,
about the New York Times and sort of, I guess, how the media is incentivized to write these
anti-tech stories and there's no, it seems like there's no room for the other side is what he's
saying. There was a lot of stuff that was said. The floor is yours to respond. I mean, I think
there are two issues here. One is like, what are the incentives of reporters generally? Right.
And I think that often people who aren't, you know, close to the media, who don't work in the media, who don't have a lot of experience in the media, tend to think that it's just, it's clicks, right?
It's like, you know, traffic, it's attention.
I would say that's true of some outlets and less true of others.
Like, I certainly don't feel like I'm motivated by traffic.
And then there's the sort of, there's the argument.
that journalists are motivated by sort of prestige and that, you know, stories that win prizes,
like you don't win a, you know, Pulitzer for the story that investigates the, you know,
the Wall Street Bank and finds no evidence of wrongdoing, you know?
Like, I think I would grant that our incentives skew toward holding institutions to
account and finding instances in which, you know, the public is being manipulated or taken
advantage of. I often tell people like, you know, we don't write about the Boeing planes that
land on time and safely. But that's always been part of the news business. And I don't think
that's changed meaningfully. There was this other argument that I think Mark and some of his, you know,
people who agree with him make,
which is that the New York Times and other media institutions
are sort of being mean to YouTube
because they want YouTube to promote their channels more
and to, like, you know, boost authoritative mainstream sources
and to sort of disappear independent creators.
And I've heard this from, I heard this after I profiled PewDiePie.
I've heard this for years that there's a kind of like,
institutional incentive for media organizations specifically to be mean to YouTube because
they want their own content to be favored in YouTube, in the algorithm, on the home page,
in trending, wherever.
And so I guess I would just love to hear Mark talk about what he thinks that that incentive
is.
Like, why do you think YouTube gets sort of criticized by, maybe,
mainstream institutions?
Do you think it's more like, I don't know, why do you think that happens?
I think it's largely political and cultural.
So it's like YouTube represents a more right-wing version and more, more scrappy,
sort of low, low-quality information type of platform.
And that maybe not everyone, but definitely, I think, maybe Tristan Harris and yourself
want there to be, like, a narrative about themselves that they're a large player in saving people
from these problems. So from that, that influences the way you describe or think about the
systems so that you can say, oh, here's this one problem that if I can just shine a light on,
we can fix. And that's where the bias comes in.
And do you think, like, YouTube is better for these changes that it has made since,
you know, people started paying attention to issues like radicalization? Like, they
say that their algorithm changes have resulted in 70% fewer views of
sort of borderline content, conspiracy theory stuff like that. Like, do you think
YouTube is a better platform today than it was, say, you know, two or three years ago?
Yeah, I think it's better.
I definitely,
like, and this is where I definitely give credit to you
and others pointing this out early, like Zeynep as well.
It's good that they're not recommending
videos that are promoting things like conspiracies
or, you know, far-right ideology.
I think that's definitely a good thing.
But I think the changes, there's like, you know,
five steps forward, two steps backwards.
So I think they've curtailed some really good independent YouTubers.
People like Dave Pakman or Laci Green,
they don't get recommended anywhere near as much as they used to
because of these changes.
And I feel like they have very high quality content,
like better than a lot of mainstream content.
But because of this blanket heuristic, they're being disadvantaged.
That's interesting.
I mean, I often wonder, I think YouTube would love nothing
more than for there to be a universe of YouTube native creators who do straight news,
for lack of a better term, you know, people, you know, like Phil DeFranco, people who, I mean,
he's doing some opinion too, but people, you know, who could just do what TV people
on the news do, but like do it in a YouTubey way. Like, I think that would make them very happy
because they love promoting their own creators.
And I don't think they, you know, my sense is that they kind of,
they promote mainstream news sources because they don't have to like stay up at night
wondering if, you know, NBC is going to publish some crazy conspiracy theory.
It's sort of a proxy for how much they can be sort of trusted to at least report as close
to the truth as they as they can.
But I do think they would like there to be a universe of YouTube creators who do that kind of thing.
I just don't think YouTube creators are incentivized to do that kind of thing because, you know, it's not good for views.
Yeah, I agree.
It's that the demand for content is much higher for opinion than it is for straight news.
I think you guys find that between your departments as well.
Do you worry that the criticism from media has made YouTube defensive, so they won't take
risks in terms of what content they promote?
I think that they are very sensitive to elite opinion and media is part of that.
I think they're sensitive to politicians.
I think they're sensitive to their peers
in the tech world, they want to, you know, they want to be seen as the good guys.
And so I think they want to sort of be, you know, small C conservative with respect to, you know,
if they, you know, take it as a given that their algorithm is going to throw, you know,
X billion views to a set of channels.
They want those to, they want to make sure that those channels are not going to, you know,
be the next Alex Joneses. And so I think, you know, a shortcut to do that is sort of like how, you know,
now if you search for 9/11 conspiracy theory videos or moon landing videos, they'll give you a little
thing from, like, Wikipedia on the video. And so they're sort of outsourcing trust to
Wikipedia because they don't want to write their own little blurbs. And I think
of what they're doing with sort of this authoritative news push as a version of that
where it's like, they don't, they don't want to be recommending conspiracy theory videos,
but they know that they have this algorithm that is, you know, needs to recommend something.
And so they feel more comfortable, you know, recommending, like creating sort of a safe bucket
of things that they know are not going to, you know, be extremist or contain hate speech
and recommending those. And I do think there's a danger of sort of,
making that bucket too small and sort of too, you know, not inclusive enough.
But I also think that that problem is more easily solved than, you know, the question
of radicalizing people.
Like, I think that it's really, I think what they don't want to do is create a situation
like they've had for the past couple years where they end up, you know, by sort of negligence
recommending these awful videos to, you know, hundreds of millions or billions of people.
And they don't even really know that they're doing it.
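As an aside, a simplified sketch of the "safe bucket" idea Kevin describes: rank candidates as usual, but for sensitive topics restrict recommendations to a pre-vetted set of channels. The channel names, scores, and the sensitive_topic flag are all invented for illustration; this is not YouTube's actual system.

```python
# Toy "safe bucket" recommender: on sensitive topics, only pre-vetted
# channels are eligible; otherwise the highest-scoring candidates win.
AUTHORITATIVE = {"ReutersOfficial", "BBCNews", "LocalPBSAffiliate"}

candidates = [
    {"channel": "ReutersOfficial", "score": 0.61},
    {"channel": "RandomConspiracyChannel", "score": 0.88},
    {"channel": "BBCNews", "score": 0.57},
]

def recommend(candidates, sensitive_topic, k=2):
    pool = [c for c in candidates if c["channel"] in AUTHORITATIVE] if sensitive_topic else candidates
    return sorted(pool, key=lambda c: c["score"], reverse=True)[:k]

print(recommend(candidates, sensitive_topic=True))   # only vetted channels surface
print(recommend(candidates, sensitive_topic=False))  # highest-engagement candidate wins
```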
Yeah.
Another aspect to that is they do have competitors.
And in terms of the engagement of the platform, how entertaining those recommendations
are matters in terms of whether they're going to lose to TikTok.
And I feel like TikTok did a better job with their recommendation algorithm,
which, you know, drives much more of the views than on YouTube.
And they have to think about that.
So if they make their recommendations extremely bland,
then that's opportunity for competitors that aren't doing that.
Right.
Alex, I'm cognizant of the fact that you promised your listeners a debate.
So I feel like we should like...
No, this is good.
I'm letting this breathe because this is...
Oh, well, if you want to amp it up by all means,
but I think people are going to find this fascinating.
I agree with Mark about, you know, the fact that this is a broader phenomenon than just the YouTube algorithm.
And in fact, one of the things I'm sort of looking at now is, like, what is this demand side, you know, part of the equation?
Like, what, you know, what we see happening right now.
You mean there are people involved in these decisions too?
Yeah, obviously, like people, you know, people make choices.
The thing that I sort of push back on is this idea that people have total control
over their information environments,
even when things are being recommended to them.
There's been a lot of academic research
about the power of recommendations,
the sort of psychological power.
There have been some interesting studies
around things like music recommendations
that people actually like a song more
when they know that it's gotten five stars
from Spotify, you know,
or ended up on some personalized playlist for them.
they trust the things that are fed to them by an algorithm.
And in the case of YouTube,
like what YouTube's recommendation algorithm does is it really,
it not only recommends videos,
but it dramatically constrains the universe of possible videos.
There are billions of videos on YouTube,
but you are only going to see a couple dozen of them
in any given day because those are the ones
that are going to appear in your sidebar and on your homepage.
And so I think people have sort of
free will and free choice in some aspects, but I really think that, you know, part of the mistake
people make in the opposite direction is assuming that, you know, people are in total control
of their choices because we know, I mean, these platforms make billions of dollars a year by
trying to change people's minds in the form of targeted advertising. They know how
influential their recommendations are. And that's a huge part of what's made platforms like
YouTube so successful.
Yeah, so I'm not arguing that recommendations are like totally controlling people.
Like in your article with Caleb Cain, you called it, like, steering people, and I feel like that's too strong a word in terms of the influence that it's having.
Like that's why I use the gentle breeze and a storm analogy.
And I think what happened with the election fraud, which you looked at quite closely, shows this, in that YouTube was curtailing
recommendations to videos. On average (you'll find exceptions), videos that were
promoting the election fraud narrative were recommended a lot less. But despite that, the content
that was promoting it did really well. Because Fox News, who are promoted by the algorithm,
were mostly disputing the election fraud narrative, they lost views. At the same time,
places like Newsmax and NTD and One America News Network
gained a lot of views despite not being recommended as much.
NTD was an exception, but for the others, that's true.
Right.
I mean, I think that really, if we're talking about the YouTube algorithm,
we have to sort of separate it into two: like, algorithm 1.0,
or, depending on how you count, you know, the sort of pre-2019
algorithm, and then what's happened since. Because I do think that, you know, just anecdotally and also
from some studies that have come out, like it does appear that YouTube is recommending, you know,
much more, it's much more likely to be recommending things from like big, you know, professional
media organizations than it used to. And also a lot of the sort of quote unquote worst people
have been de-platformed. You know, Stefan Molyneux is no longer on YouTube. You know, Richard
Spencer is no longer on YouTube. These people who were, I mean, Stefan Molyneux was not a
marginal figure. He had, you know, hundreds of thousands of subscribers. He got hundreds of millions
of lifetime views. Like, these were some of the most popular political voices on the platform.
And so, you know, not only as the algorithm different now, but the pool of available videos
it's picking from is different in some meaningful ways. Okay, I do have a few pickups here.
Mark, hold the question. We'll let you ask it. But I have
some follow-ups here. So first of all, Kevin, you know, Mark asked you about, you know,
do you worry that your stories will get YouTube to remove, you know, too much content,
something like that? And you answered looking at it from YouTube's perspective,
but I'm interested from your perspective. Do you ever think about like the fact that you're
reporting on the stuff that you're reporting on, that it might lead to a crackdown from YouTube
that gets folks who shouldn't be demonetized, demonetized, and independent creators never get
a real chance to get off the ground? Like, how do you personally feel about that? Well, first of all, I don't
think that, you know... the goal of my reporting is not to get YouTube to take down stuff. That is not,
you know, a metric that I am sort of, you know, aiming toward. My goal is to report on what's happening
on YouTube, and if that leads them to want to take down stuff and to feel pressure to take down stuff,
then that's that. But that's not my goal, I would say. Right, but you have to know that
that's a, you know, very clear potential outcome when you write a story like this.
Of course. And I think, you know, that I'm not naive about that. I guess I worry about the,
I worry about the false positives less than the, than the false negatives, to be sure. Like,
you know, something like this happened with, with QAnon and the crackdown over QAnon and, you know,
one podcast that I really love is this podcast QAnon Anonymous, which is a sort of anti-QAnon
podcast, but because it had the name QAnon in the title, like, it got swept up in the crackdown,
which sucked, which, you know, I was like, I don't want to miss this podcast. This is a great
podcast. So, but they, you know, they made a stink about it and got it restored. And, you know,
there are avenues for sort of redress of grievances in the case of a false positive. And so I guess I
worry less about the sort of overbroad application of these rules? Because, I mean, really,
we're still in a phase where these companies are being flooded with sort of misinformation.
Like, it's, we are nowhere near the point of having a totally clean house as far as, you know,
information integrity goes. And so I just think it's a little premature to worry about, you know,
whether we're sweeping up too much stuff, you know, whether YouTube
is sweeping up too much stuff when it goes after white nationalism and neo-Nazis.
Right.
And then how about the Streisand effect, right?
Do you think that the fact that all these folks are getting banned will actually make their
message resonate, maybe with a smaller group of people, but they'll use it as proof to say,
look, we're right.
And then, you know, big media and big tech don't want you to know because that's certainly
the argument they make.
Well, let's just look at, like, what actually happens to people after they get deplatformed.
And when was the last time you heard from Alex Jones?
When he shows up on other people's podcasts.
Like he did just show up on...
He just went on Joe Rogan.
Right.
He has a smaller audience than this show, but not by much.
Right.
So like, but Stefan Molyneux is a great example.
Like he, you know, he still puts up his videos on BitChute and other, you know,
minor video platforms.
But he's struggling.
He's not happy that he was de-platformed from YouTube.
And, you know, I think that
people who have been banned by these platforms generally don't... it's really hard to rebuild that audience on
a smaller platform. And I guess, as far as if I was ranking harms, like, I definitely worry about
building these sort of concentrated spaces, you know, where there is, like Parler or,
you know, places like Gab, you know, a concentrated amount of sort of extremist activity.
But I actually worry about that less than that sort of contagion effect, where you have people who
go on to YouTube, go on to Facebook.
They're looking for self-help or parenting advice or, you know, health information or,
you know, they're watching, you know, unboxing videos or whatever.
And then they're sort of coming across this universe of extremist content that sort of pulls
them in.
That is what, to me, is more worrisome than the kind of the small clusters of extremist activity.
Yeah.
Okay.
And then I got one more pickup for you.
I'm going to go back to Mark, but, you know, Mark mentioned competition and TikTok as a competitor.
It's kind of ridiculous because, you know, if all this bad stuff is powering YouTube and then,
and then, you know, maybe TikTok uses the dark arts of social recommendation engines and grows,
you know, we might end up with a company based in China, you know, that's crushing YouTube.
and, uh, yeah, and sort of less responsive to legitimate concerns, and maybe doing stuff
in the background. So how much do you think about that in terms of, you know, and I know, like,
it's kind of a crazy situation, because, like, it's almost asking how could YouTube weaken itself
by removing all this radicalization, but, you know, what do you think about that, though? Well, I think
TikTok and YouTube are different products.
I mean, they're both video platforms that are sort of driven by recommendation engines.
But TikTok is, you know, very short form videos.
YouTube, you know, I don't know what the median length of a YouTube video is,
but it's probably significantly longer than a TikTok video.
And I think TikTok is an interesting case of like moderating in the other direction.
So YouTube started off as like total free for all and then has sort of gradually like
winnowed down, like, we don't
want, you know, hook bay. We don't want, you know, nudity. We don't want neo-Nazis. We don't,
like, they've sort of shrunk the pool over time. Whereas TikTok started out, like, very
constrained. Like, there was basically no politics on TikTok at the beginning. And that was a
conscious choice because ByteDance wanted it to be a fun, lighthearted place. They didn't want
people talking about, you know, oppression and, and, you know, injustice. And they wanted to keep
it light. They wanted, you know, teenagers Renegading in their bathrooms or whatever. So they have
sort of expanded what is considered acceptable over time where they started letting on political
videos and then they started sort of broadening out. I think the diversity of content is probably
higher on TikTok now than it was a year ago. So their sort of approach, I think they're probably
going to end up in a pretty similar place, but they're approaching it from different directions,
which is a really interesting thing to me.
Yeah.
Yeah, we'll see what happens there because, I mean,
just having the experience of spending more time on TikTok lately,
that thing is addicting.
And I mean, I have a pretty good YouTube addiction right now as it is.
And it'll be interesting, because to me it's a zero-sum game: time on TikTok
is time not on YouTube.
So that competition will be interesting to watch,
especially as they make decisions based off of safety.
Okay, we are coming close to
being out of time. And I'm sorry, Mark, I kind of took the floor from you. So do you have any last
questions or thoughts that you would like to ask Kevin? Yeah. I just want to downplay the
influence of recommendations a little bit, in that the best estimate for political videos I've seen
is about 40% of the recommendations, sorry, 40% of the views coming to political videos coming
from recommendations. So there's a large amount where it's coming from links and search and other
means. And direct, right? People just, yeah, allowed to search. Right. Yeah. So people are
often in social media. They're in private chats with each other or they're Facebook or Twitter and
they're clicking on links through that amongst other means. So there's lots of ways for the content
that's in demand for people to get at. And I think if you, and that will change depending on what you
do with the recommendations. So if you make your recommendations particularly
bland, I think that would actually change that makeup; it would go down to less than even 40%.
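As an aside, a toy illustration of the traffic-source point: if roughly 40% of views on political videos come from recommendations, the rest arrive via search, external links, and direct navigation. Only the 40% figure comes from Mark's estimate; the remaining splits below are assumptions made up for illustration.

```python
# Toy traffic-source breakdown for political videos (illustrative shares).
views_by_source = {
    "recommendations": 40,        # Mark's rough estimate for political videos
    "external links": 25,         # assumed: Twitter, Facebook, private chats
    "search": 20,                 # assumed
    "direct / subscriptions": 15, # assumed
}

total = sum(views_by_source.values())
for source, share in sorted(views_by_source.items(), key=lambda kv: -kv[1]):
    print(f"{source:24s} {share / total:.0%}")
```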
And you could also get recommended, like... the way that I, I think rabbit hole's a good
term, because it's how I kind of fall into these things. And generally, like, what happens is
I'll just watch, like, one genre of video so much. And then, you know, YouTube will recommend
something new, and then I'll just start going direct to it. And just for listeners,
most of my YouTube videos are just, like, sharks and whales in the ocean, not any political
leaning, and I think that probably mirrors most YouTube users.
But yeah, the recommendation plays one part, and then you start going direct.
So it's a process.
It's a process.
Kevin, do you have any last thoughts?
I'm really appreciative of all the research being done by Mark and
others on this. I think it's super important for the academy, for, you know, engineers and data
scientists, and for journalists all to be looking at these problems simultaneously. So even though,
you know, you disagree with my reporting, I'm really glad that you're looking at this
and doing the work. It'd be great, in my next study, if you could write it up in the New York Times.
So do you guys feel like you, you know, I think there was
you know, a bit of a standoff.
Do you guys feel like you understand each other's points a little bit more after
this conversation?
Yeah, I think we got much deeper into the dynamics of this problem than you would in an article.
Yeah, it's good.
You're a peacemaker, Alex.
Okay, well, I did want to say that I'm a believer in this type of dialogue.
I think that this is good.
Benefits everyone.
So if there's folks listening who have a bone to pick with other reporters, or reporters
who don't like what's going on in industry, we can see if we can set more of these up.
But I have to say, I'm thrilled to have the opportunity to host this discussion.
There was a point like here in the second half where it was such a great conversation that
it was time to just let it breathe.
And I'm grateful for you guys coming on and running with it and, you know, being able to
address each other and, you know, bring up these questions.
And I agree.
It's something that we're going to be talking about for a long time.
More dialogue, more research, more reporting.
All this is good.
I'd say if you want this to go viral on YouTube, you're going to have to give it a title like,
you know, Programmer Owns Journalist. You're not going to get any views with the boring title.
Yeah, it's, uh, New York Times Reporter Admits the Censorship Campaign Is
Well on Its Way. Okay, great. Well, we'll find a creative title. I mean, I don't know, I'd like to go
with the bland ones. It's probably why I'm losing out to TikTok. But thanks, everybody,
for listening. And thank you, Kevin and Mark, for joining us here on the Big Technology Podcast.
If you're a new listener, please subscribe. We release a new episode every Wednesday with
tech insiders, outside agitators, and maybe we'll do some more debates like this.
If you're a long-time listener, a rating goes a long way. So if you could rate us,
it doesn't matter how many stars, but a rating would be great on your app of choice. We would
appreciate that. Thanks again. We will be back next week with a new edition of Big Technology
podcast. Until then, take care of yourself.
And we will see you then.