Radiolab - Content Warning
Episode Date: October 17, 2025

Over the past five years TikTok has radically changed the online world. But trust us when we say, it's not how you'd expect. Today we continue our yearslong exploration of what you can and can't... post online. We look at how Facebook's approach to free speech has evolved since Trump's victory. How TikTok upended everything we see. And what all this means for the future of our political and digital lives.

Special thanks to Kate Klonick

EPISODE CREDITS:
Reported by - Simon Adler
Produced by - Simon Adler
Original music from - Simon Adler
with mixing help from - Jeremy Bloom
Fact-checking by - Anna Pujol-Mazzini

Lateral Cuts:
The Trust Engineers
Facebook's Supreme Court

Signup for our newsletter!! It includes short essays, recommendations, and details about other ways to interact with the show. Sign up (https://radiolab.org/newsletter)!

Radiolab is supported by listeners like you. Support Radiolab by becoming a member of The Lab (https://members.radiolab.org/) today.

Follow our show on Instagram, Twitter and Facebook @radiolab, and share your thoughts with us by emailing radiolab@wnyc.org.

Leadership support for Radiolab's science programming is provided by the Simons Foundation and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.
Transcript
Oh, wait, you're listening.
Okay.
All right.
Okay.
All right.
You're listening to Radiolab.
Radiolab.
From W-N-Y...
C?
C?
Yeah.
You're right in here.
Awesome.
You're going to be speaking in that microphone.
This guy?
Nope. The one closer here.
Hey, I'm Simon Adler.
This is Radiolab.
Two for five. Can you hear me, Kate? Yep. And that is Kate. Kate Klonick. I'm a professor
at St. John's Law School. I've talked to her a bunch over the years. We did a couple different
stories that felt like news at the time about Facebook's rules for what we can and can't post
on their platform. Don't get me saying the F word again because last time my parents yelled
at me. Did they? Yeah, they were like, Kate, you're an adult now. You're a serious person.
I prefer to swear on the radio as much as possible.
We covered the origins of these rules and just how complicated they can become.
But beyond the specifics, what we were really exploring was how the ideal of free speech plays out in different spaces in our society.
You know, from a good old public square where anyone can say anything they want to lightly regulated broadcast TV to straight up private spaces.
and we were asking, like, where does social media fit into all that?
And, you know, I kind of thought we were done talking about all this.
But then...
I'm happy we still have a show, too, I guess.
This past month.
Jimmy Kimmel can't say that anymore.
The late-night host taken off air indefinitely.
As we all know, free speech was in the news again.
I mean, look, we can do this the easy way or the hard way.
That's censorship.
That's state speech control.
And these questions of who can say what, where, and how much pressure the government can or can't exert just felt fresh and vital all over again.
And so I called Kate, yeah, to see how this is all playing out online.
Yeah.
And now it is a problem of, okay, how do we stop billionaires and authoritarian governments from twisting these platforms?
into censorship machines or political propaganda.
Okay.
I know. That's kind of how I feel, too.
Well, I guess before we get into all of that,
let's build a bit of a foundation first.
Sure.
So I guess how has the actual practice,
of keeping stuff up and taking stuff down changed? And why? Sure. So the main thing,
the main thing from the last time we talked that has really truly changed from like 2020 to
2025 is the rise of TikTok. I mean, if you will remember, in like two short years it had
basically caught up with 12 years of Facebook's growth. And I mean, TikTok has a different way that
they run their content moderation.
Okay, how so?
Well, when we spoke in these past episodes,
one of the assumptions of content moderation
when it was getting off the ground,
be it Facebook or Instagram or YouTube,
was that we don't want to censor people unnecessarily.
Yep.
And so you would keep content up
until it was reported as being harmful.
And then you would make rules
that would limit and try to preserve voice
as much as possible as they put it.
that was like the industry term for free speech, voice.
There were limits to that, obviously.
But generally, like, it was a keep it up unless we have to take it down type of thing.
But that's not TikTok.
TikTok comes from, obviously, China, and it comes from a censorship kind of authoritarian CCP culture.
And, I mean, I believe the Chinese kind of approach to speech is very reflected in the algorithm that TikTok uses.
It is not a default of
everyone should see everything,
this is a free world
and people have a right to say
whatever they want,
even if it's a private platform.
It is a we get to determine
what people see and say.
And that's it.
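To make the two defaults Kate is describing concrete, here is a minimal sketch in Python. Everything in it, the names, the policy checks, the data model, is invented for illustration; it is not either platform's actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    visible: bool = True    # reactive model: up by default
    promoted: bool = False  # proactive model: not promoted by default

def breaks_rule(post: Post, banned_terms: set[str]) -> bool:
    # Stand-in for a human or automated review of a user report.
    return any(term in post.text.lower() for term in banned_terms)

def reactive_moderation(post: Post, upheld_reports: list[set[str]]) -> Post:
    # "Keep it up unless we have to take it down": content stays visible
    # until a report is reviewed and found to violate a rule. A takedown
    # is observable -- the removal itself is evidence something was cut.
    post.visible = not any(breaks_rule(post, rule) for rule in upheld_reports)
    return post

def proactive_moderation(post: Post, allowed_topics: set[str]) -> Post:
    # "We get to determine what people see and say": nothing is promoted
    # by default; only content prescreened as inside certain parameters
    # gets pushed up. Whatever never surfaces leaves no trace.
    post.promoted = any(topic in post.text.lower() for topic in allowed_topics)
    return post
```

The asymmetry that comes up later in the conversation, a removal you can point to versus an absence you never notice, falls directly out of these two defaults.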
So they're just taking
tons and tons and tons of stuff down?
Oh, I mean, no.
Like, TikTok,
it prescreens such a volume of content
that it determines
to not be outside of certain political
parameters. And so they're less likely to cause negative interaction effects to put kind of an
economic term on it. If I can put a stupid man's term on it, it's like they are choosing to push things
up instead of pull things down. That's a perfect way of thinking about it. And they push things up
that are very milquetoast, very like happy, make-you-feel-good, very apolitical. And so this is
basically downranking or shadow banning. The idea that you're going to
manipulate the algorithm to not delete the content, but not promote it. And in addition to that,
the algorithm is constantly improving and iterating on all the behavioral signals that you give it.
And so it's able to provide a very addictive and expectation-meeting...
Product.
Yeah, product. I mean, there's no way, I almost said an experience, but I'm like,
it's kind of, but it's not, it's, I don't know what it is.
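A hedged sketch of what "pushing things up instead of pulling things down" could look like inside a feed ranker, again in Python. The signal names and the reach multiplier are hypothetical, invented for illustration; this is not TikTok's actual system.

```python
def rank_feed(posts: list[dict], user_signals: dict[str, float]) -> list[str]:
    """Order a candidate pool for one user's feed. Illustrative only."""
    scored = []
    for post in posts:
        # Predicted engagement, learned from the behavioral signals the
        # user constantly feeds back (watch time, rewatches, skips, ...).
        engagement = sum(user_signals.get(tag, 0.0) for tag in post["tags"])
        # Downranking / shadow banning: the post is never deleted; its
        # reach is quietly scaled by a platform-controlled multiplier
        # (1.0 = normal, 0.0 = effectively invisible).
        scored.append((engagement * post.get("reach_multiplier", 1.0), post["id"]))
    # Highest predicted engagement first; anything multiplied toward zero
    # simply never makes the cut, with no visible removal.
    return [pid for _, pid in sorted(scored, reverse=True)]
```

The design point: a deletion is detectable, but a multiplier of zero is indistinguishable, from the outside, from a post that just didn't land.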
I have a confession, which is that I've maybe spent five minutes on TikTok in my life.
I don't have TikTok.
You don't either?
Well, I have, like, rules for some of these things.
But, you know, I study online speech for a living, so it seems kind of crazy.
But, like, I don't need to actually be on TikTok for TikTok to be all over my life.
I see TikTok videos constantly.
They are cross-posted.
I don't need to actually be on TikTok.
Well, on that, it is interesting that TikTok figured out how to make banal stuff compelling
because we were certainly told that, well, the reason Facebook wants to leave some of this stuff up
is because it's the highly emotive, highly reactive stuff that keeps people around.
So what did we have wrong there?
Was this just like an adjacent path to the same outcome, which is keeping people on a platform?
Oh, I mean, I think that it's actually fascinating.
You know, what they figured out is it is a format of video that people are hooked by.
And so it does not really matter.
You will find yourself often watching things that you didn't know you were interested in,
but like you're just compelled by certain types of couples that like look very different from each other,
doing any type of like interaction.
Fascinating.
So it's like Facebook figured out the sort of information that would keep you there.
TikTok figured out how to package any information to keep you there.
Yes, that's like one way of thinking about it.
Like, yeah, I mean, you know, but this is not new.
I mean, like advertisers have been doing this forever.
Sure.
Right?
Like, this is, you know, it's just a very different business model.
It is a very different product model.
And it seems to then be a very different informational ecosystem you're creating.
Because if you're pushing up.
everything that falls within certain bounds, and you're deciding what those bounds are,
it becomes far more, like, is controlled the right word?
What's the word?
Yeah, it's controlled, but it's also, in like a certain way, is even more dangerous
because, like, the ultimate in censorship in American First Amendment law is really
prior restraint.
Right, sorry.
Oh, sorry, excuse me.
What is prior restraint?
Prior restraint is censorship before something
goes up, or is ever published.
Oh, so it's not redacted.
It's that it was never printed.
Exactly.
That is the exact distinction.
And it's important because the existence of this redaction, the proof that it was removed
from Facebook is actually evidence that censorship has happened, right?
Right, right, right, right.
Whereas with TikTok, you never even know what you missed.
You never even know what you were kept from seeing.
And that is really, unfortunately, what we're staring down.
at this moment because in the last five years,
American social media has moved towards TikTok's approach to content moderation.
Wow. Okay, I didn't expect us to be talking about TikTok so much, but I'm glad we have.
So if I'm telling the story of this, it's like, once upon a time, Facebook creates content moderation for everything, all these policies, all these rules.
Meanwhile, TikTok is sort of lurking across the Pacific eventually jumps over, and Zuckerberg and the Silicon Valley folks see they're doing it this very different way.
When does that actually start to shift not just the way Facebook is thinking about its content moderation, but also maybe
the way people are experiencing Facebook as a result?
That is not as clear, but the biggest sea change is the one that you're thinking of.
Hey, everyone. I want to talk about something important today, because it's time to get back
to our roots around free expression on Facebook and Instagram, which is the one that happened
on January 7th of this year, 2025, when Mark Zuckerberg announced the end of the fact-checking
program. We've reached a point where it's just too many mistakes and too much censorship.
And that he was going to try to move towards a community notes-based system of content moderation.
So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms.
And I mean, I think that, like, it was, and it wasn't a sea change.
Okay, well, and talk to me, like, when we say Facebook got rid of its fact-checking, at its sort of height, what was Facebook's fact-checking?
Okay, so not much, which is why this was a really, which
is why this was such a frustrating announcement, and it was frustrating that the media focused on
it so much. The fact-checking was like a commitment to fact-checking because there had been so
much clamor about mis- or disinformation, but they were removing posts days after they were flagged
and, like, it was very small. And so to watch it go on the chopping block was really more of a
signal to a very particular person and to a very particular party that felt like big tech censorship was
coming for them. And like, you know, we can get into a whole kind of conversation about whether
or not that was reality-based, but that was kind of the complaint. Right. And if I'm going to
mount the best defense for conservatives about censorship by big tech, it would be that during the
pandemic, there was sort of a party line as to what was an acceptable way to talk about the origins
of the pandemic, right? Yeah. And you can even go before the pandemic.
Okay, you could take it before.
You can, there's a few things.
And one of them was...
There are serious questions for Joe Biden this evening following the publication of emails
allegedly belonging to his son, Hunter.
The Hunter Biden laptop scandal.
Reporting lays out purported emails between Hunter Biden and Ukrainian businessman.
New York Post.
They broke the story.
And links to that were taken off Facebook and Twitter.
That was absolutely censored.
And what was the justification by Facebook?
Well, that was happening a couple weeks before
the 2020 election.
And so what had been the huge concern for Facebook
and all these other companies
was how social media impacted the 2016 election.
And so they made a lot of big changes.
And one of them was just kind of like
we're not going to allow things that could possibly be
foreign influence to stay up
because this is exactly what we got yelled at in 2016.
And so they kind of overcorrected.
And I think in hindsight,
it was a really hard call
and maybe probably the wrong one.
And then you extend that to the Wuhan lab leak.
Now, those were just insane, insane issues.
And look at us.
We're still talking about them today.
It's not like they were that censored,
unlike going to, say, China
where it's like, you're like, oh, you know, Tank Man
and they're like, who?
Right.
Because there are no photos of Tank Man.
Right.
They are not published.
Right?
And so it's not like, I just also.
Point taken. Okay, well, so then, like, what has changed then? If, yes, there was some censoring going on and censoring of things in these sort of critical moments, like, would that not happen now? Is that the difference?
I mean, I, my honest belief, I can't predict the future, but my honest belief is this administration would very quickly put the platforms in line.
Yeah, I think that there would be no hesitation to do this because I don't think that this was ever about free speech. It was about their speech. And that is really what you're unfortunately seeing right now. There is no recognizable free speech notions coming out of this current administration. And with the TikTokification of social media, people have seen the vector for power that is in content moderation.
Okay, so Kate, uh,
you were saying that TikTok has this fundamentally different approach to content moderation,
that instead of reactively taking stuff down,
they are proactively flooding the zone with happy-making stuff,
that Facebook and X and others have taken notice and started adopting this approach,
and that all this has happened as folks have begun to see that content moderation itself is,
I think you said, a vector for power.
Yeah, I think that basically what you're seeing is the power over what appears in your feed or doesn't appear in your feed or the types of new content that you're recommended or the first commenters that you see on a video that you just watched.
That type of control is an ability that we've never seen before.
I remember when I was first writing about this in like 2017 and 2018 and presenting my research, one of the things that people
were so concerned with was filter bubbles.
Yeah.
Well, we're going to be in these filter bubbles fed to us by the algorithm.
And as it turns out, that was, one, very true, that that would happen.
But also, even maybe more disturbingly, we don't even need filter bubbles anymore.
People are just choosing platforms based on the types of content that they expect to find there.
And in that way, it's like we've gone from filter bubbles to platform islands, where
the owners of the platform get to
push up whatever
it is that fits
whatever their ideological ends
are. China and TikTok, it seems to be
like, milquetoast stuff that's not
going to rile you up, but it's going to keep your eyeballs
on here. It feels a little bit
like X,
formerly Twitter, is the mirror image
where it's like, we're just going to rile you up
all the time. Is that
right? And is that what we're going to just
see more of, which is come to this
platform island for
Emotion A, come to that platform island for emotion B.
I think that that's exactly right.
I mean, yeah, I mean, that's what we go to the movies for.
That's what we tune into certain types of things for, right?
I'm not in the mood for, you know, a horror film.
So I don't go to a horror film.
This kind of approach is much easier to moderate.
People get much less upset and it's much cheaper
because there is not as much reactive content moderation to do.
You don't have to employ hundreds of people in call centers to review every report of something that's been flagged.
And so this has kind of become the new standard.
I remember one of the big questions, probably in the first piece we did, was this question of like,
what kind of a space to consider Facebook?
Because the First Amendment treats private spaces differently than public spaces.
So it matters whether or not
Facebook is more like a mall or a public square.
And so given all these changes, you've just mentioned,
like, what is the metaphor now?
I have one based on what you've said,
but I'm curious what yours is.
No, I mean, I've always liked the mall metaphor
and it has a weird, squirrelly little place
in First Amendment law in a bunch of cases,
but I kind of want to hear what yours is.
Well, to me it's now, or certainly the direction things seem headed
based on what you've said,
is that it's now just, it's just broadcast again.
Yeah.
And with broadcast, there is no free speech, right?
No.
Like ABC, NBC, they can cancel a show at any time.
They get to decide exactly what the evening lineup is.
But with this, with social media, it's like,
it's like a broadcast camouflaged as an organically generated thing.
A hundred percent, you know, you can shadow ban or take down or limit the reach.
But it doesn't even have to be that subtle.
Like, Elon Musk always showing up in my feed, even though I don't follow Elon Musk, is like having
Rupert Murdoch in like the interstitial spaces before every commercial break at Fox News, you know,
like directly telling me what I should think.
That isn't subtle.
Like, that is the other thing about this that is maybe the scariest part of the last couple of months
is that none of it even is super pretextual.
Like, there isn't a lot of like excuses.
We're not even hiding behind algorithms anymore.
It is just the owner of the platform saying the thing out loud and forcing everyone to see it if they're on his platform.
You know, I think that if you're going to all of these different platform islands, the other thing is like, how do we change those?
To use regulatory regimes to try to control how they speak is obviously a problematic thing by any type of measure.
We don't want governments controlling speech
for the exact reason of all of the authoritarianism
we've just discussed.
And so I think that there's, it's very hard...
Sorry, if I can jump in there, though,
but it does feel like, yeah, I'm not for
and have never been for the federal government
coming in and molding Facebook's content moderation policies.
Of course not.
But if something no longer resembles a public square at all,
and instead has become, to keep reusing my label, like a camouflaged broadcast network where it's like, yeah, these are individuals saying something that they believe in,
but then that is being collated, amassed, and pushed out as an opinion-changing product by someone on high, I am okay at that point with there being some sort of regulation.
It's not regulating maybe what people are allowed to post, but maybe how it's being aggregated.
I don't know.
There has to be somebody clever, somebody smarter than me, who could come up with these sorts of rules.
No, I mean, like every Western state has some type of media regulator, specifically to avoid maybe like two or three people controlling all of media.
Right.
Right.
But all of a sudden we're like on the internet.
And yes, there is an infinite amount of content on the internet.
But is it so infinite?
Like if there are, if we're talking about like the same three main places that people are going to for their news, people are going to for like their daily interactions, people are going to feel like they're part of a conversation, their water cooler, their public square, whatever it is.
If that is like three people and they're all friends of the president, like that's a problem.
And maybe even more importantly, journalists, they go to X, they go to Bluesky,
they go to YouTube, they go to TikTok, and they report things that are happening in those places
as if they're real places that things are happening. But they're also controlled by these
individuals. And so they're not reflective necessarily of real world, yet they are being
reported on as if they were reflective of real world, right? And so I just think that what you've seen
the last five years is an industry understand the power that it holds in content moderation,
that it's so not a customer service issue, that it is actually like a huge, huge force for shaping
public opinion, and that that has exponential value to political parties and governments.
It's like as valuable as oil and guns, because how you push things, what you
keep up, what you take down. I mean, this is how you can basically create, you know, the rise and fall
of presidencies if you want to, or political parties. And they know how to market them to you,
no matter how niche you are. And that's scalable. And so, like, it's a way to make a lot of money.
And then it's a way to control a lot of minds. You know, I think one of the reasons you and I have
gotten along so well over the years and have worked so well together in this,
this now trilogy of stories,
is that we both had sort of an unorthodox approach to this.
I mean, most people were saying that these Facebook guys were idiots,
that they're bad, that they're causing lots of trouble,
that we should just like cast scorn upon them.
Yeah.
And you, and then me sort of following your lead,
we're more like, what if we actually try to understand this problem?
And I guess now, with hindsight,
I'm wondering, like, did we miss something here where we sort of played the fool?
You know, it wouldn't be the first time that someone has told me that in some way I'm a useful idiot to Facebook or in some type of capacity.
I didn't say, I would say we would be useful idiots.
So I didn't call you.
I'm asking if we are, is the question.
I feel as if a lot of people and a lot of what we've said today, people will be like, of course this is what happened.
This is what we were saying would happen.
But it wasn't a fait accompli when we talked about it.
It wasn't.
Every single one of these solutions has the same flaw at the end of the day to it,
which is that these are for-profit companies that do what they want to do.
And things change as things settle.
So I don't know.
Okay.
Well, so then, like, is content moderation sort of dead?
I just, yeah, this is like a very controversial thing.
It really depends on what you mean by that question.
There has been a lot of controversy around, like, are they going to invest in these huge
cost centers of trust and safety?
Are they going to care about this type of issue if they can TikTokify everything and just
send you down these rabbit holes of endlessly droll, like, eyes-glaze-over, like, WALL-E
kind of scene where you're on the couch
with your Slurpee in, like, a Barcalounger
or whatever like watching things
is that what they're basically going to do
and are they going to have to keep moderating
and I mean I think that like the answer is
that we're going to increasingly see
an automated content
moderation system. It's going to increasingly
not embody
the edges
of society and
the range of voice that we
had at the beginnings of the internet
and that we are going to kind of see a productification of speech.
I'd love to give you one more idea that I've been playing around with for a couple of years.
Yeah.
If I was ever going to write a short sci-fi story, it would be about the quote-unquote perfect piece of art.
You step in front of it, it does a quick facial scan of you, pulls everything about you that it knows from the internet.
and then it puts forward an image perfectly generated for you that will evoke a feeling.
On Tuesdays, it's happiness, on Wednesdays it's sadness.
And so it's this visual tableau personalized to every person that evokes the same emotion.
And once you have that, once you can control the emotions of people with the flip of a dial
by putting something in front of them that's going to only pique that feeling for them,
then you could just control everybody.
Well, I love that.
Sounds like a Ted Chiang story.
But you should write that.
Maybe you can ask AI to do it for you if you're really busy.
This story was reported and produced by me, Simon Adler,
with some original music and sound design by me, mixing done by Jeremy Bloom.
Of course, huge, huge thank you to Kate Klonick, as always.
And, yeah, we will be back next week saying some more things.
Until then, thanks for listening.
All right, I need some of these scripts here.
I think we're using this one.
Hello, hello.
Oh, I can hear myself.
Kid, podcast, crossover.
special.
Hi, I'm from,
wait, hi, I'm Norseulton
and I'm from New York, and here are the
staff credits.
Radiolab was created by Jad
Abumrad, and it's edited by
Soren Wheeler. Dylan Keefe is our
director of sound design. Lulu Miller
and Latif Nasser are our
co-hosts. Our staff includes
Simon Adler, Jeremy
Bloom, W. Harry
Fortuna, and
David Gebel. Oh, so I
I just have to read that one name.
Okay.
Oh my God.
So you have to like tap it out.
Okay, okay, right.
Sendu and I somebody.
Yes, yes.
Score.
Good job.
Annie McEwen, Alex Neason, and Sarah Qari.
Oh, Sarah Sandbach, Anisa Vietze,
Arianne Wack, Pat Walters,
Molly Webster, and Jessica Young.
Yeah, yeah, I see it.
Do I sound like happy?
With help from Rebecca Rand.
Our fact checkers are Diane Kelly, Emily Krieger, Anna
Pujol-Mazzini, and Natalie Middleton.
Guys, I know I'm famous.
You don't got to clap.
It's all.
Hi, this is Laura calling from Cleveland, Ohio.
Leadership support for Radiolab's science programming
is provided by the Simons Foundation
and the John Templeton Foundation.
Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.
