Today, Explained - Trouble at TikTok
Episode Date: February 4, 2026. TikTok has new owners in the US. With claims of censorship, widespread outages, and possible algorithm changes on the way, the reboot has been rocky so far. This episode was produced by Peter Balonon-Rosen, edited by Jolie Myers, fact checked by Andrea López-Cruzado, engineered by David Tatasciore and Patrick Boyd, and hosted by Jonquilyn Hill. Photo by Jakub Porzycki/NurPhoto via Getty Images. Listen to Today, Explained ad-free by becoming a Vox Member: vox.com/members. New Vox members get $20 off their membership right now. Transcript at vox.com/today-explained-podcast. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Your algorithm is under new ownership.
A mostly American group of investors recently took over American TikTok.
American lawmakers forced the sale, citing data and privacy concerns because of the app's Chinese ownership.
Since the handoff, things aren't going great.
Oh, we're definitely being censored slash silenced.
Who else is having issues with TikTok muting a portion of their videos when they upload them?
Y'all, TikTok is going downhill fast, I fear.
It turns out lots of people miss their corporate Chinese overlords.
And that's not the only trouble at American TikTok.
I'm Jonquilyn Hill, sitting in the host chair.
The trials of TikTok coming up on Today Explained from Vox.
With Amex Platinum, $400 in annual credits for travel and dining
means you not only satisfy your travel bug, but your taste buds too.
That's the powerful backing of Amex.
Conditions apply.
When McDonald's partnered with Franks Redhot,
they said they could put that shit on everything.
So that's exactly what McDonald's did.
They put it on your McCrispy.
They put it in your Hot Honey McNuggets dip.
They even put it in the creamy garlic sauce on your McMuffin.
The McDonald's Frank's Red Hot menu.
They put that shit on everything.
Breakfast available until 11 a.m.
At participating Canadian restaurants for a limited time,
Frank's RedHot is a registered trademark
of the French's Food Company LLC.
You're listening to Today Explained.
David Pierce, I'm the editor at large at The Verge.
All right.
So American TikTok has new owners now.
And almost immediately after they took over,
people started reporting issues with the app.
I want to start with the big one.
People said that they were being censored.
What's going on there?
And what are the complaints?
That is the big one.
It's also the most complicated one to sort through
because fundamentally it's about feelings.
So a thing to understand is that everybody has always believed
they're being censored on social media.
Since time immemorial, this is the story of social media.
What's happening on TikTok is at this particular moment,
I believe less about censorship and more about normal internet problems.
There were a lot of people reporting
that they would upload videos around what was happening in Minneapolis
and those videos would get no views or those videos would actually not upload properly.
So I can't post anything about what happened yesterday in Minnesota,
so I'm going to be very selective with my words.
If you talk about ICE or anything political, if you're even talking about the weather, zero views.
We're already seeing proof of suppression.
There were people who were saying that if you DM'd the word Epstein to somebody else,
that that wouldn't go through.
Insanely, you now cannot use the word Epstein in TikTok DMs
and are immediately flagged for a community violation.
Can't use the word Epstein on TikTok since Trump allies took ownership. Absolutely
CCP levels of censorship. All of this is more easily and just as successfully explained
by normal corporate ineptitude, which is that TikTok's new data center provider, Oracle,
had a huge outage. What we think we know is that a big data center in Virginia
had what they called a weather-related issue,
which I live in Virginia,
and I have spent a whole week having weather-related issues.
Oh, there's still snow on the ground where I'm from.
Yeah, I have a very hard time imagining anybody
not having weather-related issues right now.
But so there was a big power outage.
They've had big issues at the data center,
and that seems to be the actual culprit here.
There are lots of good reasons to be worried about censorship.
There are lots of potential censorship problems coming to TikTok,
but rationally speaking, the likelihood that this new group
would have taken over TikTok and immediately, like, smashed a big red censorship button is pretty
low anyway. Very hard to do technically, when you take over a platform this huge, to change it
in any meaningful way. It just takes time. And it was in the middle of huge disastrous technical
problems that I think are probably the real answer here. Is there a way for us to actually know?
I mean, people are pretty skeptical of TikTok right now. I think one useful analog here is when
Elon Musk bought and took over Twitter.
And when Elon Musk took over Twitter, he just said out loud all of the changes he was
intending to make, right?
We want to be just very reluctant to delete things, just be very cautious with permanent
bans.
And this was after years of conservatives in particular saying that they were being censored
by Twitter's existing leadership.
So Elon Musk comes in and essentially says, I'm going to reverse that.
Free speech is meaningless unless you allow people you don't like
to say things you don't like.
And then does a bunch of very obvious things that you go on the platform and you spend
five minutes on it and it just felt different.
You were seeing different posts.
So I think there is a version of this that feels very obvious.
It's just that for right now, there are better, simpler sort of Occam's Razory explanations
for what's going on.
California Governor Gavin Newsom says he's going to investigate TikTok censorship.
Is there anything to that or is he just, you know, someone with 2028 on the brain?
In that particular one, I think it is 2028 brain, pretty aggressively.
Gavin Newsom was responding very directly to all of these reports that the word Epstein was being censored in DMs.
That problem has largely gone away as Oracle's technical problems have gone away.
But I think the fact that if you're Gavin Newsom, that is a politically useful thing to do is very telling.
We've had this six-year saga of what's going to happen to TikTok all the way back to the first Trump administration.
At first, Donald Trump wanted to ban TikTok.
We're looking at TikTok. We may be banning TikTok.
And then Trump discovered pretty clearly how important TikTok was in getting him elected again in 2024.
The U.S. should be entitled to get half of TikTok.
And all of a sudden saw it as a politically useful platform and then engineered the thing to be sold to some of his close friends and allies.
American investors and American companies, great ones, great investors, the biggest.
If you just look at the set of things that has happened,
it would be surprising if it doesn't end up being a politically geared, right-wing, coded platform.
And so if you're Gavin Newsom to say,
I am going to be your champion to keep TikTok as the thing that you want
and not turned into the kind of cesspool that X has become, it's a smart move.
I think something that's poured gas on this is something that you mentioned,
which is who these new owners are.
Tell us about them.
Sure.
So the new owners of TikTok are essentially three players that matter.
One is Oracle, the huge old Silicon Valley company owned by Larry Ellison, who is a very close friend of President Trump's.
Which we've covered on the show.
Meet the Ellisons, the newest right-wing billionaire family.
MGX, which is an investment firm located in Abu Dhabi, and Silver Lake, which is a huge private
equity firm that is invested in just everything you can possibly think of.
Each of them has 15% of this new company.
The new company is called TikTok USDS Joint Venture LLC.
USDS.
It really just rolls off the tongue.
Yeah.
But everybody just calls it TikTok US.
So for all intents and purposes, Oracle, MGX, and Silver Lake are the controlling
new investors in this company.
And most importantly of all is Oracle.
Right.
Oracle is in charge of the algorithm.
It's in charge of the privacy.
It's in charge of the data security.
All of the things that made this a national security issue in the first place are now up to Oracle to solve.
So I think Larry Ellison is the single most important person for the new TikTok.
You have this group of people with very similar politics, with a very clear understanding of what you can do when you own a platform like this.
And if you look at TikTok and you say, okay, this is actually an incredibly valuable information
dissemination product in the United States.
And you are a political animal.
Of course you're going to look at that and say,
actually,
this thing that we've been railing against,
which is fears of the Chinese government,
you know,
melting our brains with dance crazes,
we can just do that for our own purposes.
There are lots of good and fair questions to ask
about what the Chinese government was doing,
both in terms of the data they were collecting about U.S. citizens
and the way that the Chinese government might be skewing the algorithm
to show certain things to Americans or not.
But the difference now is we see what the government is doing.
Right now, instead of saying, okay, it's this sort of abstract big bad that is China,
you look around and it's like, okay, it's actually very obvious what the Trump administration is up to
and the ways in which they want to control information and the ways in which fascism is creeping
across the way that we communicate with each other in so many ways.
And that is not abstract.
And you can just say, oh, okay, if you give Donald Trump the
means of showing people videos that they watch for hours a day. Like, it's not unclear what he's
going to do with it. And so I think this stuff is both sort of literally closer to home for a lot of
these users, but also it's just much more obvious what it would look like if it goes wrong.
You know, what is also not abstract is that there were new terms of service that folks had to
agree to. What do we know about what's changing? Okay, this is a tricky one because
one of the very funny things about terms of service on apps like these is that
they're always terrifying. And they're often terrifying for totally non-terrifying reasons.
What happened in this case is there are some new things in the terms of service. The new TikTok
US is going to collect more precise location data if you allow it. It also gives TikTok permission
to collect a bunch of data around kind of nebulous AI things that make it clear they're going to
do a lot of sort of gen AI stuff inside of TikTok and that's data it can collect. But there was one
thing that a lot of people got really nervous about.
Your immigration status, your geolocation, your mental health diagnosis.
This is the private information that TikTok is now lawfully collecting from you.
They're going to know your citizenship and immigration status.
Your religious beliefs, your mental and physical health, sexual life, sexual orientation.
Also if you're transgender or non-binary, it's just a lot.
All of these things.
And actually, that data has been in TikTok's terms of service for some time.
Oh.
Again, I think it is reasonable to be alarmed that
this data is going to be collected by a new group of people.
And I think it is fair to be worried about who that group of people is and what they might do with your data.
But the sheer collection of that data is not new in TikTok.
It's just that people finally read the terms of service.
Okay.
So all of this is the business side.
But let's say, you know, I'm out here scrolling.
And right now, cooking videos, videos about makeup, videos of men cleaning horse hooves.
Will my experience on the app change now?
Is all of that gone for me?
Okay, A, we have shockingly similar timelines.
This is very exciting.
I love the horse hoof cleaning video.
Shout out to Nate the Hoof guy.
Yes.
In the past, I've said that this is one of my favorite tools.
And in today's video, I'll show you exactly why.
And all the comments were always like, doesn't that hurt?
Anyway, you and I would never have searched for horse hoof cleaning videos.
But the TikTok algorithm is clever enough to have shown them to us.
And that is the thing that actually no other platform has done a good job of replicating.
But now, one of the stipulations of this deal is that there has to be some meaningful separation of that algorithm from Chinese control.
The new owners are going to, quote, retrain, test, and update the algorithm.
That is a very vague phrase, but it means in some way the algorithm is going to change.
But we're not going to see that for a while.
Okay. And wait, one more question before we go.
Sure.
Gotta close the loop on the horse hooves.
Does it hurt them when they get them cleaned?
It doesn't hurt them because they don't have feeling in that part of their feet.
And actually, it feels good for them.
You want to be a professional to do it, though.
You and I should not be cleaning horses.
No, we should not.
But I'm glad that Nate's out there doing it.
Coming up on Today Explained,
some major lawsuits allege social media platforms are designed to be addictive.
TikTok settled the first big one,
but most of the rest of the social media world
is headed to court.
Did you know that Staples Professional can tailor a custom program to make running your business easy?
With a Staples Professional account, you get one vendor, one delivery, and one invoice for all your must-haves.
From tech to cleaning supplies and dedicated support from Staples experts who guide you on everything,
from product selection and ordering to payment.
Join today at staplesprofessional.ca and get expert solutions tailored to your business.
That was easy.
Investing is all about the future.
So what do you think is going to happen?
Bitcoin is sort of inevitable at this point.
I think it would come down to precious metals.
I hope we don't go cashless.
I would say land is a safe investment.
Technology companies.
Solar energy.
Robotic pollinators might be a thing.
A wrestler to face a robot, that will have to happen.
So whatever you think is going to happen in the future,
you can invest in it at Wealth Simple.
now at WealthSimple.com.
Welcome aboard Via Rail.
Please sit and enjoy.
Please sit and sip.
Play. Post.
Taste.
View.
And enjoy.
Via Rail, love the way.
What do you think, Today Explained?
I don't know.
My name is Naomi Nix.
I am a reporter at the Washington Post.
I cover social media.
Okay, so there are some big social media lawsuits just getting underway.
Why are these such a huge deal?
Well, this is really, you know, the first time that social media addiction cases are going to be headed to a jury trial.
Hundreds of parents, school districts, and state attorneys general have filed lawsuits alleging that the way tech companies like Meta, YouTube,
TikTok, Snapchat, have designed their products,
have made them especially addictive to young people,
and that that has harmed teenagers
and fueled a mental health crisis in today's young people.
Today, the tech giants behind some of the most popular social media apps
are headed to trial.
This case is the first of thousands of lawsuits
that claim platforms like Meta, Snap, TikTok, and YouTube
knowingly created harmful products.
A 19-year-old anonymous plaintiff
claims she got hooked on Instagram, TikTok, and YouTube at just 10 years old due to features
like endless scrolling and frequent notifications.
TikTok and Snapchat were also named, but they have already settled out of court.
Essentially, jury selection has begun in the first of a cohort of social media addiction cases.
It focuses on the story of a woman who's now a young adult. She is anonymous, but goes by KGM.
And essentially, you know, she says she used all those social media platforms, you know, throughout her childhood.
KGM has testified that social media use, quote, contributed to her anxiety and her depression and caused her to feel more insecure about herself.
We should expect testimony from top executives like Mark Zuckerberg and Adam Mosseri, as well as this sort of debate about, you know, her own
life. We've already seen in filings that the companies like Google have alleged that, you know,
there are other things going on in her life that may have contributed to her distress. And so
that case will play out and they'll pivot to the next bellwether case. Okay, so just to clarify,
this is about whether teens are being harmed by addictive content or whether the platforms
themselves are addictive?
It's really about the platform design.
And so they focus on things like, you know,
an endless stream of content, right?
They focus on features like autoplay on YouTube, right?
Notifications, product features like that,
that they say are really designed to just keep people
returning to their sites over and over and over again,
leading to really harmful consequences,
whether that's anxiety, whether that's depression,
untimely deaths.
And so they're kind of making claims that are similar to, you know, some of the claims that were leveled against big tobacco companies, you know, way back when, when they said, hey, these products have been especially designed to be addictive. The companies also know that based on their own internal research. And they haven't really been upfront about the risks. In fact, they've been perhaps deceptive
and have failed to protect young people in the way that they should.
Yeah, you know, these lawsuits are all about the addiction and harm that might be caused by social media.
But what do we actually know about how addictive these platforms are?
What do these studies tell us?
It's a little murky, right?
There's no such thing as, like, social media addiction in the DSM that, you know,
a psychologist is going to, like, give you as a diagnosis and put on your insurance form.
But there is a body of research, outside research,
certainly that raises the question about whether social media usage can lead to or contribute
to potential harmful mental health outcomes. And so there has been research that has, you know,
perhaps drawn the conclusion that problematic use of social media or long periods of time on
social media is correlated with, you know, poor mental health outcomes.
A new study finds that excessive social media use can exacerbate depression in some young people.
A new study has found a link between pre-teens' use of social media and future depression.
Direct correlational evidence that the more you use it, the more depressed you are.
But there's other studies that don't establish that correlation.
A study released last month from the University of Oxford found no evidence linking
individual Facebook data to negative well-being.
I don't think that a quick fix of taking the technology away or blocking people will do a darn thing.
And again, there has been even a sort of vigorous debate in the academic
community of, is there, you know, even if you prove correlation, is there causation?
How do we know that it's the social media platforms themselves that are causing the distress
among, you know, those teens?
Or is it just the fact that teens who are already kind of distressed for a variety of reasons
are turning to social media and perhaps using social media in problematic ways?
And also, you know, where are the parents?
I think this is a jury trial, and that's probably one of the obstacles they'll have to overcome is,
well, how much is the tech companies' responsibility and how much are parents supposed to, you know,
intervene and try to protect their kids? My guess is that conversation will probably emerge as well.
You know, mental health, anxiety, depression, body image, those really get a spotlight in the worries about social media.
But are there harms that we're not necessarily thinking about or talking about?
Yeah.
I mean, scams, extortion scams have been, you know,
one area that regulators are paying increasing attention to,
where particularly for targeting young boys,
someone will post a photo of like a young, seemingly attractive woman.
They'll engage in a conversation.
They'll get him to send compromising photos
of himself and then threaten to send those photos to his friends and family if he doesn't
hand over money. And, you know, he ends up in a vulnerable situation. And so there's been several
cases in which, you know, that young child has committed suicide because it was so distressing
to have that experience, right? I think there's been a lot of attention around also technology
in schools. We've seen an entire movement across the United States in which
schools and school districts have banned phones within the schools
because school districts themselves and sort of the administrators are saying,
we're seeing the effects of phones, you know, on academic performance
and sort of the environment of the school.
And, you know, sleep, right?
Like, it's hard to have a debate about, like, what the pro is for a teenager
using social media at 2 a.m.
when they probably, you know, have a test at 7 a.m.
And so I do think that while some of this sort of debate
around, like, depression and anxiety has been a little bit nuanced among researchers, I do think
there's other types of potential harms that the plaintiffs will be able to point to. And the other
thing they have on their side is just the documents. The companies have
their own internal research in which they have looked at these issues. And that will be especially
critical, I think, in the case. A lot of this focus on social media use is focused on young
people. But if this is a big tobacco moment, cigarettes aren't just bad for kids. They're bad for
adults too. As adults, should we be worried about these platforms being bad for us?
I mean, I am. For me, it's maybe not as much a mental health concern, but just of like,
how do I want to spend my time? What's the best use of it? Is it connecting with the people that I
care about? Is it, you know, pursuing work that makes me feel like a productive citizen? Or is it
scrolling TikTok for three hours, right? And I think one of the interesting things about this case is
because it's a jury trial, my guess is the jurors are all going to relate to the issues. Like,
they most likely have their own phones. They might have their own kids. They themselves, like many of
us, may have, you know, had to sort of battle or have these conversations with themselves of like,
how much am I using my phone? Which apps do I want to use? I think a lot of us can relate to these
questions about phone usage and screen time. You know, what this case does is, again, puts that
question, you know, to the courts, which is like, are companies more responsible, more liable
for the impact their services are having on young people? And this isn't just a new scandal with new
internal documents or new whistleblowers, you know, further shedding light on the ways these
companies operate. This is sort of a new, potentially new avenue for people who think the
companies need to be held more accountable to force or expand that liability. And that they made it
to trial this far is an accomplishment in and of itself. Whether they'll win, you know, who knows?
Naomi Nix, Washington Post.
Today's show was produced by Peter Balonon-Rosen, edited by Jolie Myers, fact-checked by Andrea López-Cruzado, engineered by David Tatasciore and Patrick Boyd.
I'm Jonquilyn Hill. This is Today Explained.
