Today, Explained - Why parents are suing social media
Episode Date: May 1, 2023
Congress has yet to pass legislation regulating social media companies, so parents are taking matters into their own hands. A lawyer representing them explains how a new spin on an old legal theory might lead to a big win. This episode was produced by Haleema Shah, edited by Jolie Myers, fact-checked by Laura Bullard, engineered by Paul Robert Mounsey and Michael Raphael, and hosted by Sean Rameswaram. Transcript at vox.com/todayexplained. Support Today, Explained by making a financial contribution to Vox! bit.ly/givepodcasts Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Congress has tried and failed for years to regulate social media companies, but now there's another group taking them on. Parents.
He was so addicted to it that the last moments of his life were about posting on social media.
And parents aren't alone. They're joined by school districts. Kenosha Unified Schools are joining a national
lawsuit accusing companies like Facebook and Instagram of maximizing profits at the expense
of kids' mental health. And even state governments. Today we announced along with the Attorney
General here in Arkansas that we're filing three lawsuits, two against TikTok, one against Meta
because of the deceptive practices that they
have engaged in. On Today Explained, we're going to explore a new legal argument suggesting that
social media algorithms are causing mental, physical, and sometimes lethal injury to kids.
You're listening to Today Explains.
Is it Today Explain or Today Explains? Explain-da. Explain-da. As much as we want to believe otherwise, the kids are not okay.
Rates of self-harm and suicide are up over the last decade,
and the root causes range from the pandemic to gun violence to climate anxiety to, of course, the internet.
And now, a slew of personal injury lawsuits around the country
are going after social media companies.
Previn Warren is one of the attorneys leading a lawsuit in California
that represents over 150 kids and their families.
He says the owners of Instagram, Snapchat, TikTok, and YouTube
are knowingly getting kids addicted to their platforms.
Part of how it works is very similar to a slot machine. It's a psychological principle called intermittent variable rewards. Basically, when you pull a slot machine's lever, you never know what you're going to get. Maybe you're going to hit gold, maybe you're not. But the frequency with which you get a payoff is indeterminate, right? You don't actually know when it's going to hit gold or when it's not. And so that compels you to keep pulling the lever and playing over and over again in order to get that dopamine hit.
Instagram and TikTok work very similarly. They study exactly what you hover over and for how long. They study what your likes and comments are. And when I say they study, I mean the algorithm is really processing that data in real time, right? And they use that information to design your feed in a way that gives you a payoff, but on a variable and unpredictable schedule, right? And that actually is the most addictive way to set it up, kind of like the slot machine. You don't know what you're going to get or when you're going to get it, and it keeps you scrolling.
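To make the slot-machine comparison concrete, here is a minimal, purely illustrative Python sketch of an intermittent variable reward schedule next to a fixed one. The probabilities, payoff rules, and function names are assumptions invented for this example; they are not drawn from any platform's actual code.

```python
import random

def variable_reward_feed(num_scrolls, hit_probability=0.3, seed=None):
    """Simulate an intermittent variable reward schedule: each scroll
    pays off with some probability, so you never know which one will hit.
    The numbers here are illustrative, not taken from any real platform."""
    rng = random.Random(seed)
    return [rng.random() < hit_probability for _ in range(num_scrolls)]

def fixed_reward_feed(num_scrolls, every_nth=3):
    """A predictable schedule for contrast: every third scroll pays off."""
    return [(i + 1) % every_nth == 0 for i in range(num_scrolls)]

if __name__ == "__main__":
    variable = variable_reward_feed(10, seed=42)
    fixed = fixed_reward_feed(10)
    print("variable:", ["hit" if r else "miss" for r in variable])
    print("fixed:   ", ["hit" if r else "miss" for r in fixed])
```

Behavioral research on reinforcement schedules associates the unpredictable version with more persistent checking than the predictable one, which is the point of the slot-machine analogy.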
But creating an addictive product isn't necessarily illegal. So Previn's team is drawing on an old legal concept to make an unexpected case.
It's an interesting case in the sense that we're bringing principally a products liability
claim against these companies. So products liability is a legal
concept that's, you know, 50 years old. And the idea is that if you manufacture or design a
defective product, you should be held responsible if that product winds up hurting people. And so
we've applied that concept. We've sort of dusted it off and repurposed it in the 21st century and applied
it to these social media apps. The algorithms that power these apps are addictive. And what we're really beginning to understand, which I don't think the public has understood up until now, is that the fact of addiction is something these companies have really been aware of, at least Meta has been aware of it, and their company documents show that.
We haven't gotten too far into the case yet,
but what we're already seeing is disturbing.
It confirms what the Facebook whistleblower,
Frances Haugen, leaked to the press at the end of 2021.
Facebook understands that if they want to continue to grow,
they have to find new users.
They have to make sure that the next generation
is just as engaged with Instagram as the current one.
And the way they'll do that is by making sure that children
establish habits before they have good self-regulation.
By hooking kids.
By hooking kids.
Through that addictive mechanism,
young people wind up in really dangerous rabbit holes
that cause them to have serious body image and self-esteem issues.
And so our clients have developed really serious eating disorders,
suicidality, and in some really tragic cases,
have actually taken their own lives, and we're suing on behalf of their estates or for their parents.
My name is Rose Marie.
I am the mom of a teenage daughter who has been diagnosed with anorexia nervosa restrictive
type.
Her disorder began in 2020 when she was 14 years old, and we are still going through that process in recovery now in 2023.
Another Meta employee described kids saying they often feel addicted and know what they're seeing is bad for their mental health, but feel unable to stop themselves.
I wasn't aware of it until she went to residential treatment and I actually had her phone.
And then I did look at her Instagram account and found that it was frightening. I found images of
some really thin, emaciated-looking teenagers in bikinis,
showing off their bodies,
and telling other people they should look like them.
How much of this is on parents?
How much of this is buyer beware, you know, kids sign up,
and it's a free-for-all, just like the rest of us
when we got 56K 25 years ago.
I think the social media companies really want to point the finger at parents, but they're completely outgunned.
Most parents had no idea what they were in for when they got their kids a phone.
When she first got her phone at 11 years old, we were not aware of the applications that kids had access to and that she was constantly
seeing photos of other teens, body images.
She was getting tips and tricks on how to hide her eating disorder.
So I believe that it did cause her to spiral into that addiction.
You can give your kid a phone for completely innocuous and reasonable purposes,
like you want your kid to stay in touch with you when they're on the bus route home, right?
You don't realize that you're giving your kid this license to be exposed to all kinds of nonsense and to be subject to these apps that are demanding their near constant attention.
And we have clients, parents that try to take the phone away, try to disable the apps, and the kids experience classic withdrawal symptoms just as if they were being taken off of a drug or trying to
taper off of nicotine use. They'll throw things. They'll hit things. They'll hit their parents.
They'll be destructive. I mean, I'm really serious about this. I'm not making it up. And so the
parents say, you know what? I'd rather just give the kid the phone because that's even worse,
right? And yeah, kids just cannot get themselves to disengage
once they're in this cycle of compulsive use.
What does suing all these companies,
Meta, Google, Snap, ByteDance, get us?
What's the manifestation of a win here?
Well, there's a lot of different ways
to imagine what that looks like.
But one way to think about it is to wind back the clock. When Instagram was first released, it was, you know, for latte art, right? It was for vacation photos.
I miss those days. I miss that version of Instagram. I do too. And so then you have to ask yourself, what changed? Why did it become the really
negative, pervasive social phenomenon that it became? And the reason is that the algorithm
changed, right? The core workings of the product were modified to prioritize photos and videos
that keep people on the app as long as possible, right? Your feed's not organized
chronologically. It's not organized by, you know, what your friends posted. It's organized in a way
that is giving you the stuff that the app has predetermined, through its algorithm, is going to keep you quote-unquote engaged, which means using it, right? Because that's what drives the ad revenue, and that's what makes Meta money.
It doesn't have to be designed that way.
And in fact, it wasn't at one point in time.
So one change we could see is changing the algorithms so that, you know, maximizing engagement, which means creating compulsive and addictive use, is no longer how the algorithm works.
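As a purely illustrative sketch of the design change Warren describes, the toy Python below contrasts a chronological feed with one ranked by a predicted-engagement score. The Post fields, the scores, and the function names are invented for the example and are not based on any company's actual ranking system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # stand-in for a model's "will this keep you scrolling?" score

def chronological_feed(posts: List[Post]) -> List[Post]:
    """The older design: the newest posts from the people you follow, in order."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def engagement_ranked_feed(posts: List[Post]) -> List[Post]:
    """The design described in the interview: order posts by whatever score
    is predicted to keep the user on the app, regardless of recency."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("friend_a", datetime(2023, 5, 1, 9, 0), predicted_engagement=0.2),
        Post("friend_b", datetime(2023, 5, 1, 8, 0), predicted_engagement=0.9),
        Post("friend_c", datetime(2023, 5, 1, 10, 0), predicted_engagement=0.5),
    ]
    print([p.author for p in chronological_feed(posts)])      # ['friend_c', 'friend_a', 'friend_b']
    print([p.author for p in engagement_ranked_feed(posts)])  # ['friend_b', 'friend_c', 'friend_a']
```

The same set of posts comes back in a different order depending on which objective the sort optimizes, which is the change the lawsuit argues made the product harmful.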
Previn Warren is an attorney at law. He's going up against some of the biggest companies in the
world. We reached out to them all, Snap, Meta, Google, ByteDance, they all said something like,
we prioritize safety for kids, except ByteDance, which owns TikTok.
They didn't say anything at all.
Google, which owns YouTube, pointed to Family Link,
a feature that allows parents to limit screen time and block specific types of content on supervised devices.
Snap, which owns Snapchat, obviously said it uses human moderators to catch harmful content before it spreads to large audiences. It also has a similar tool to Google's called Family Center, which allows parents to monitor who their kids are communicating with.
Meta, which owns Instagram, said they've invested in technology that finds and removes content related to suicide, self-injury, and eating disorders before anyone even reports it to them.
In a minute, the science of what social media is doing to kids.
It's Today Explained.
Look what I just posted. Brunch with these two dum-dums. Oh my gosh, so good. Is this good? I
said Sunday Funday with these idiots. Yeah, that's good. That's great. Today Explained is back. Previn,
the lawyer suing social media companies, is gone. His case hinges upon whether social media is damaging to kids' mental health. So
we wanted to find out what the science says. My name is Dr. Mitch Prinstein. I'm the chief
science officer of the American Psychological Association. Dr. Prinstein studies how kids
interact with one another, including online. So I asked him a simple question.
Is social media ruining kids' lives? Yeah, if you ask a scientist a simple question,
you're going to get a really complicated answer, I'm afraid.
But, you know, social media is not all good.
It's not all bad.
It's an interaction between who you were before you logged in
and what kinds of things you're doing on there.
And the product of those two pieces
could lead to vastly different outcomes from kid to kid.
And when we're talking about kids, vaguely, what age level are we talking about?
So we're really talking about kind of that pre-adolescence and that adolescent period,
mostly. So we know that adolescents around the time that puberty starts, maybe a year or two
before that, we see huge increases in the risk for depression
in general. We see huge increases in rates of self-injury, like cutting is a really common
example. We see increases in more what we call disruptive behavior, acting out, aggression.
And usually this is when we see a lot of increase in what we call health risk behaviors, whether
it's risky behaviors to change your body shape or things like substance use.
And so, but these behaviors you're talking about, how do they connect, if at all, to
social media?
Do we know?
We're starting to figure that out.
I'll say in short that it's not so much how much kids are on their devices or using social
media.
It really is about the specific kinds of functions
or the specific types of content
that you can find on some platforms.
It depends on what you allow yourself to see,
who you follow, who you don't follow.
What's on your explore page is just what you make it.
So the kinds of things that we're concerned about when we think about the content that kids are experiencing on social media are content that's exposing them to discrimination and hate,
content that is literally teaching them how to engage in psychologically disordered behavior like cutting or substance use and how to hide that behavior from their parents.
But you're talking about phenomena that sort of predates social media.
What's changed since kids have started spending hours on end on TikTok or Instagram?
There's something about kids being able to communicate now,
first of all, predominantly on social media.
It doesn't supplement. It's really taken over, for most kids, the majority of their social interactions.
It's a lot easier to just meet up with people wherever you are, as long as you have a way to connect with them. So whether that be
texting or Snapchatting or calling them. But also, scientists have characterized kind of aspects of
social media like it's asynchronous. So you're interacting with folks not in real time.
It's permanent. It's very visual.
There are ways in which it creates stress
because you can quantify how much people like you
or like what you said or don't like you.
People that I know actually worry about
how many followers they have compared to other people
and if they're not getting as many messages.
It just causes so much unnecessary drama,
I would say.
But maybe also particularly important is
the work of algorithms and machine learning here.
You know, for the whole history of our species, and we're only here because of our ability to be a social species, this is the first time we've ever outsourced our social relationships to a computer.
The computer now picks who our friends are, whose posts we read in what order.
It really guides us in a way that, again, can be very helpful.
But also we're giving up a lot of control.
And for teens who have pretty immature developing brains, you know, when they usually get started on this, that's a question that scientists are really interested in understanding more.
Where are you sort of heading in your scientific study of how an algorithm can sort of shape the brain of an adolescent? Well, one of the things that we're starting to see in the science is that the area of the brain
that's activated on social media
is kind of that area where it gives you a dopamine
and an oxytocin response
when you are being agreed with
or getting attention or feeling like you're being liked.
I think we're influencers
because some people our age or under our age look up to us.
When people look up to you, you feel good about it sometimes.
That's all fine.
It's just that that's really close to areas that motivate us
to engage in more and more of that behavior.
And that can lead to, well, as scientists we stay away from the word addiction, but we do talk about problematic social media use, where kids are spending far more time online than they even want to.
Me and my friends are on our phones a lot at the same time. I honestly don't really know why
we do it while we're together, but we do. They can't stop even if they want to, some report.
And they're experiencing withdrawal symptoms.
So it's affecting their homework, their relationships, and perhaps most importantly, their sleep.
The reason why sleep is especially important is because sleep is really needed for the adolescent brain to grow to the size that it's supposed to be.
So when you have disrupted sleep, we're seeing that that is actually affecting brain size in adolescents.
These algorithms, you know, are designed to keep us engaged as much as possible. But when a teen
is staying up till early hours of the morning, watching videos or reading others' posts,
that then has a direct implication on really how their brain is growing.
I only use Instagram now. I've stopped using Snapchat as of like January.
I just found myself getting really anxious about things.
So I decided, you know,
I'll just take a break from social media.
We're starting to hear that a lot of kids
are experiencing a remarkable amount of stress
from their devices.
It notifies them too often.
There is too much information that they're trying to digest all at once.
And they're really concerned about what they'll miss out on if they're not online.
And if they are online and they post something, they're very concerned how that will be received.
About 50% of kids are now reporting that they're experiencing so much stress that it's interfering with their day.
And the more stress they're experiencing, the more depression they report about a year later.
About a year or two before you notice that a kid's body is changing and growing up,
the brain has already started doing its work.
And one of the first things that it does is it starts to develop more of an adult-like brain in an area that makes us crave social rewards,
that feeling of getting attention or influence or power
or, you know, positive feedback from our peers.
We don't know exactly why,
but it might be because the brain
is kind of preparing kids to be more autonomous.
So, you know, the brain is kind of encouraging you
to want to hang out with your peers a lot more
and roll your eyes at your parents,
which is what we all see happens
when your kid is around 11 or 12.
And we don't just see this in humans.
We see this in other mammals as well.
There's that tendency to want to hang out with adolescents.
Well, the reason why that's important
is because teens are then around 12, 13 years old.
They are very, very much craving this peer interaction.
And for, you know, about 60,000 to 100,000 years,
the only way you could get that was by going to school
or by going to their house or maybe at some point
picking up the phone and calling them.
Now it's different.
Now kids can satisfy that urge.
They can scratch that itch by pressing buttons on a device, 24/7, 365. The brain wasn't built for that. So we're a little bit trying to figure out
what is the effect of taking a kid whose brain is built to crave that kind of interaction and now giving them the opportunity to get it far more than we ever expected, even with a quantified tally showing them how successful they are at doing it.
I have 316 followers,
and then most of my friends have like 1,000 or like 600.
This is a little bit of a perfect storm
where we've got adolescents' brains craving something,
and now this brilliant technology
that allows them to get it far more than we ever expected and more than we've ever before been able to in the history of our species. We're also learning a lot of positive
aspects. Kids are having real friendships with kids that they might never meet. And those
friendships do in fact serve a buffering function to help them in times of stress, maybe even make
them less suicidal in some cases.
Sometimes when I'm sad, I like to communicate with my friends on social media and that really
makes me feel less lonely. I never really feel depressed or anything like that just because
there's always somebody to talk to and always somebody that's there for you. That's kind of
a good thing about social media. Kids also have more diverse friendships online than the friends
that they're able to meet offline or in real life. And
of course, that's a great thing if kids are being exposed to, you know, more diversity online as
well. You're painting a complicated picture here. There are clearly positive aspects of kids
spending a lot of time on social media. They're developing friendships. They're encountering
people and things that they would not usually encounter. But then you're also saying there's
a risk of kids getting addicted. There's the risk of kids not sleeping enough,
not having their brains fully develop. There's a lot going on here. Is there any scientific
consensus on how social media relates to mental health in adolescents?
Well, I think we're seeing both the risks and the benefits, and we're going to have to set some controls or some systems in place to make sure that adolescents' biology doesn't get the better of them, and they're able to do this in a safe way that optimizes the benefits and minimizes the consequences.
Social media is now one of those kinds of behaviors.
We should be teaching kids about this in school, help them be smart
consumers. We should be teaching the parents about the science so they can make the best
decisions for their kids. And we should, you know, consider whether there are some systems we need to
put in place, whether legally or informally, so that way we're able to protect kids who are
engaging in this in a more vulnerable way than adults would.
We frequently work with kids and tell them to consider: why would a company invest billions of dollars and thousands of brilliant minds to offer you a completely free application just to hang out
with your friends? What is the motive there?
And when they start to realize that someone's making money off of what they do,
maybe their data, maybe that's why they're encouraged to stay on as long as possible,
it really helps kids to realize, wait a minute, let me take control and figure out,
how do I want to use this? What are my goals? How do I know when I have stopped? I've reached my goal and now I'm just giving someone else my brain for profit.
Dr. Mitch Prinstein, he's chief science officer at the American Psychological Association.
Halima Shah produced our episode today.
She had help from Jolie Myers, Laura Bullard, Michael Raphael, and Paul Robert Mounsey.
We used some footage from Common Sense Education in the show.
Thanks, Common Sense Education.
I'm Sean Rameswaram.
This is Today Explained.