The Daily - Social Media on Trial
Episode Date: January 29, 2026
For years, social media companies have relied on an impenetrable First Amendment protection to shield them from legal claims that their products are dangerous to children. But now, a cluster of plaintiffs are trying a different tack. Cecilia Kang, who covers technology, explains why these new lawsuits pose an existential threat to social media giants, and how those companies are likely to defend themselves.
Guest: Cecilia Kang, a reporter covering technology and regulatory policy for The New York Times.
Background reading: Here's what to know about the social media addiction trials. TikTok reached an agreement to settle a lawsuit, avoiding the first in a series of landmark trials.
Photo: David Gray/Agence France-Presse — Getty Images
For more information on today's episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.
Transcript
From The New York Times, I'm Rachel Abrams, and this is The Daily.
For years, social media companies have relied on an impenetrable First Amendment protection
to shield them from legal claims that their products are dangerous to children.
But now, a new cluster of plaintiffs are trying a different tack.
Today, my colleague Cecilia Kang explains why these lawsuits pose an existential threat to social media giants
and how those companies are likely to defend themselves.
It's Thursday, January 29.
Trouble for TikTok, as a group of attorneys general in several states look into whether the video-sharing platform TikTok is harmful for children.
Internal research at Facebook found that its photo sharing app,
Instagram, can harm the mental health of millions of young users.
Research shows 95% of teens are on social media; more than a third say they're on constantly.
For young people, the TikTok platform is like digital nicotine.
One chart showed 21% of girls in the U.S.
felt somewhat worse or much worse after using Instagram.
Social media taught me things about myself that I didn't even know,
like how I had an ugly nose or how my weight wasn't the proper weight.
Social media said the solution to these things wasn't self-acceptance.
Social media said the solution to these things was products, and sometimes even surgeries.
Unregulated social media is a weapon of mass destruction that continues to jeopardize the safety,
privacy, and well-being of all American youth. It's time to act.
As a dad of three, I'm angered and horrified. As an attorney general, I, along with my colleagues
across the country, are taking action to do something about it.
Cecilia, welcome to The Daily.
Thanks for having me.
So, Cecilia, we've talked a lot on this show about the claims that social media is harmful for children, that it can lead to mental health disorders, social isolation.
And there have been all sorts of attempts over the years to really curb the reach and influence of these social media platforms.
Now we have this new crop of lawsuits.
And I want to understand: how are these lawsuits any different from previous attempts that we've seen to regulate or rein in these companies?
So these social media companies have for years faced really tough scrutiny and criticism for being too powerful and crushing competition, for hosting content that is false, for all kinds of harms related to the kind of content that is hosted on these platforms.
But the cases that are about to begin this week at trial are really different in that there are thousands of individuals, school districts and state attorneys general that have come together in a series of lawsuits arguing the same thing, which is that social media is addictive, and that the addictive nature of these platforms has led to a bevy of personal injuries,
including anxiety, depression, suicidal thoughts, eating disorders.
So what's really different is this is less about the content they host, and this is more about
the nature of the technologies.
And this is a really novel legal theory.
It's essentially social media's big tobacco moment, which led, as you know, to many years of litigation against the tobacco companies and ultimately led to the decline of smoking.
And so many in social media see this as a really existential moment.
So basically the crux of this is that these are personal injury claims, right?
And that effectively allows the plaintiffs to sidestep what has traditionally shielded these companies from liability, which is their free speech defense.
That's exactly right, Rachel.
What the lawyers in these cases and the plaintiffs are trying to do
is to get around that legal shield that the social media companies
have been able to use to protect themselves in court.
And they're saying, no, this is actually not about speech at all.
This is about you companies creating and engineering technologies to be harmful.
And that those are violations of state and federal consumer laws.
So let's walk through these cases. How are they making that claim specifically?
So this year, we will see two big batches of trials begin in all of these cases that have been filed.
And the first batch, which takes place in Los Angeles, includes nine plaintiffs, with nine separate trials by these different plaintiffs.
They're all individuals, all claiming that when they were young, when they were minors, they became addicted to social media and they suffered these harms.
And these nine cases,
they're known as bellwethers
because they've been picked out of thousands of lawsuits
filed by individuals against the social media companies.
And they're seen as very representative
of the many different charges and experiences
that individuals have had and suffered,
as they claim,
by becoming addicted to these social media companies.
So the first case that goes to trial is of an individual who goes by the initials KGM.
She is a now 20-year-old from Chico, California.
And she has said that she created her first social media account on YouTube at the age of eight.
She then joined Instagram at the age of nine.
And Musical.ly, which is now known as TikTok, at the age of 10, and Snapchat at 11.
So she's been using all the social media platforms for a long time.
And her mom said that she had no idea that these platforms could be dangerous and could
become so addictive to her child.
And she only figured that out after watching a news program where she learned about the
potential harms of social media.
Her mom said that if she had known how potentially harmful these sites were, she would have
prevented her daughter from perhaps even having a phone and using the apps.
And what KGM, the plaintiff, is arguing is that the social media platforms were incredibly alluring
to her and that she got hooked. And these very addictive products that use features like
infinite scrolling, meaning it's just so easy to keep scrolling and scrolling and things like
auto play videos where right after you finish a video, the next one's queued up before you
even think about it. And algorithms that direct you and recommend particular content that she found to be very toxic. All of these features led her to overuse social media and become
addicted. And that in turn led to lots of mental health problems, including anxiety, depression,
suicidal thoughts, and body image issues for her.
So these are the kinds of claims that I think a lot of people have become familiar with by now.
The idea that young people can develop any number of mental and emotional conditions from repeated exposure to social media platforms.
What is some of the other litigation that you're watching?
So the next big wave begins around June in federal court.
They're all bundled together and they're brought by attorneys general in dozens of states as well as school districts.
And those are really interesting, Rachel, in that they're charging the companies with being a public nuisance: that the school districts and states have had to shoulder the costs of mental health services, phone programs within schools, all kinds of programs to deal with a youth crisis.
And so they are suing the companies for monetary damages.
And they're also saying that they would like to see big changes within the companies,
that the platforms have to give up some of these addictive technology features.
Given that these are all personal injury claims,
what do the plaintiffs actually need to prove in order to prevail in court?
What these plaintiffs have to prove is that social media is linked to addiction.
And that's going to be hard.
It's going to be a new sort of argument that hasn't been tested before.
And so they're going to have to show that there is expert evidence that the use of tools
like infinite scrolling on TikTok and on Instagram and autoplay of video are features that have
led to compulsive use and that there is a direct link between the technology and behavior.
And they'll also have to show that these companies knew all along that their products were harmful and that they withheld what they knew from the public.
So what's the best evidence that the plaintiffs have to show what you're describing as a causal link between the technology and the harm?
So there have been numerous studies done on the mental health effects of social media.
But what the plaintiffs are going to really rely on is hundreds of thousands of documents that they've collected in discovery ahead of these trials, which the plaintiffs' lawyers say show that the companies knew that there was a problem and found internally that there was a lot of troubling evidence about their products and how they affected young people.
For example, in 2018, Meta began studying beauty filters on Instagram.
Beauty filters, just to be clear, those are the filters you can put on your face or somebody else's face to make them more beautiful, to just alter the image, right?
Yes. And they began studying that in 2018 and decided in 2019 after a lot of backlash publicly that they would ban the filter.
But that same year in 2019, Mark Zuckerberg, the CEO, considered bringing the filters back to Instagram.
These were big drivers of engagement and young people like to use them.
And employees within the company implored him not to, including an executive because she said they're really just so toxic for particularly young girls.
And she said that her own daughter suffered from body dysmorphia.
And she sent an email directly to Zuckerberg asking him to reconsider.
He ignored the email and decided in 2020 to reinstate the beauty filters.
And so lawyers for KGM are going to point to these internal documents and say that this is really
the proof that the company not only studied the problem, they recognized there was a problem,
and yet they did not tell the public about the problem.
They allowed the tools to continue operating.
And what are the plaintiffs asking for specifically?
Obviously, money, but can you just give us a little bit more specifics on their demands?
The plaintiffs are asking, as you said, for monetary damages, and they are also asking for changes to the designs of these platforms.
So they're going to ask for stronger age verification and tools to make sure that underage users are no longer able to get around the terms of service and use the platforms.
They'll probably also ask for more parental controls, and that the companies remove addictive features like infinite scroll, autoplay of videos, and Snap streaks.
I'm really going to show my age here, Cecilia, but what is a snap streak?
So a snap streak is it's kind of a game, and this is why it's been accused of being addictive.
It's messaging between two people.
And the idea is to create a streak of messages between two people.
And you maintain a streak by communicating every day and sending snaps, which are usually visuals,
like a photo or some sort of a video or some sort of a message.
And you keep your streak going if you communicate every day.
You lose your streak if you stop even for one day.
I see. And that does seem very clearly like an example of a tool that is designed to keep you on the platform as much as possible, which is part of the business model, right?
That's what these companies are trying to do with their users.
So it makes sense that if you take those features away, that could pose, as you said, kind of an existential threat to the entire business model.
That's right.
And it's important to keep in mind that the business model is advertising.
And what really fuels advertising revenue is engagement.
Right.
Engagement is at the heart of this.
And these tools are meant to keep people more engaged.
So you can see why these trials are really so potentially damaging for these companies.
And so that's why we've seen two companies, Snap and TikTok, settle the very first case with KGM.
We don't know the terms of those settlements,
but META and YouTube are still scheduled to go to trial
as defendants in KGM's lawsuit
and appear very determined to continue to take this to trial.
We'll be right back.
Cecilia, if these lawsuits are so existential,
potentially for some of these social media companies,
why would some of them not settle
the way that TikTok and Snap did with that first case?
Presumably the money that they would have to pay to settle is nothing compared with having to
alter an entire business model, right? So why even take the risk and go to trial?
Well, there are many trials that are scheduled, first of all. So even though two companies were
able to settle with KGM in this first case, there are numerous more in the state court as well as
in federal court going forward. The other thing to keep in mind is that the companies, especially
Meta and YouTube really feel strongly that they have a good case on their side, and they will bring up speech protections. Like you mentioned, Rachel, they're going to say that there is a law known as Section 230 of the Communications Decency Act that shields internet companies from the content they host, because Section 230 has been so broad and so strongly used in their favor in so many different instances. And so they're feeling pretty confident that they can rely on that legal
shield once again. In addition, they reject the idea that social media can be linked to personal injury.
And the companies' lawyers are expected to argue that there are many factors that go into mental
health issues. They're going to say that it's multifactorial. It could be school problems,
stress with friends. There could be all kinds of factors that lead to anxiety, depression,
and other mental health disorders, and not social media alone.
Right. And the causal link does, in fairness, feel like something worth grappling with, right?
Because how do you distinguish the impact, for example, of social media from a culture that promotes certain beauty standards and certain body types, right?
Like, is it actually possible to isolate and prove causation back to a specific social media platform?
What the plaintiffs' lawyers are going to try to do is to again draw from all the internal documents they've collected, and they will try to show how the companies pushed to increase engagement and to make their products sticky and even addictive. But ultimately, it comes down to a
jury in these California cases. Juries will decide the subsequent cases as well. And that might be
favorable for the plaintiffs because everyone has a story about social media. We know, for example,
that the majority of American parents see social media as a problem.
And yet the companies have so far escaped scrutiny.
Cecilia, if this does end up being social media's big tobacco moment and they lose these cases in court and a jury decides that this is, in fact, an addictive product, that means that we have an entire generation of kids who are now addicted.
And so I wonder, we've been talking this whole conversation a lot about what happens to the social media companies.
But what happens to these children that have essentially been the guinea pigs for this massive social experiment?
Remember decades ago when the trials began against big tobacco?
It seemed crazy and really far-fetched to accuse the companies of creating an addictive and harmful product.
But they did.
And with social media, with all of these young people who have been blamed for years for being unable to regulate their use of these social media apps, the conversation might change. The blame could lie in a different place, with the social media companies. Now, that won't take back the experiences of so many young
people who say they've been harmed by these social media platforms. But it could profoundly change
the conversation in our society. Cecilia Kang, thank you so much for your time.
Thanks for having me, Rachel. We'll be right back.
Here's what else you need to know today.
On Wednesday, the Federal Reserve voted to keep interest rates at the current levels,
despite enormous pressure from President Trump to cut rates.
Two Fed governors, both appointed by President Trump, cast dissenting votes.
But Fed Chairman Jerome Powell continues to reject Trump's demands for a rate cut,
even after the administration opened an unusual criminal investigation this month into Powell's conduct.
And...
Our founders debated extensively over which branch of government should have the power to declare or initiate war.
Virtually unanimously, they decided what was entered into the Constitution was that the declaration or initiation of war would be the power of Congress.
In a series of pointed exchanges on Wednesday, senators of both parties, including Republican Rand Paul of Kentucky,
pressed Secretary of State Marco Rubio to explain why neither he nor President Trump consulted with Congress before sending U.S. troops into Venezuela to arrest and remove the country's president.
So I would ask you, if a foreign country bombed our air defense missiles, captured and removed
our president, and blockaded our country, would that be considered an act of war?
Would it be an act of war?
We just don't believe that this operation comes anywhere close to the constitutional definition
of war.
But would it be an act of war if someone did it to us?
Of course it would be an act of war.
During the hearing, Rubio refused to rule out future U.S. military action in Venezuela, but said that President Trump has no desire to send American troops back to the country.
Today's episode was produced by Rachelle Bonja and Shannon Lin. It was edited by Lexie Diao and Michael Benoist, contains music by Rowan Niemisto and Dan Powell, and was engineered by Chris Wood.
That's it for The Daily. I'm Rachel Abrams. See you tomorrow.
