Your Undivided Attention - Social Media Victims Lawyer Up with Laura Marquez-Garrett
Episode Date: July 21, 2023

Social media was humanity's 'first contact' moment with AI. If we're going to create laws that are strong enough to prevent AI from destroying our societies, we could benefit from taking a look at the major lawsuits against social media platforms that are playing out in our courts right now.

In our last episode, we took a close look at Big Food and its dangerous "race to the bottom" that parallels AI. We continue that theme this week with an episode about litigating social media and the consequences of the race to engagement, in order to inform how we can approach AI harms. Our guest, attorney Laura Marquez-Garrett, left her predominantly defense-oriented practice to join the Social Media Victims Law Center in February 2022. Laura is literally on the front lines of the battle to hold social media firms accountable for the harms they have created in young people's lives for the past decade.

Listener warning: there are distressing and potentially triggering details within the episode.

Correction: Tristan refers to the Social Media Victims Law Center as a nonprofit legal center. They are a for-profit law firm.

RECOMMENDED MEDIA
1) If you're a parent whose child has been impacted by social media, Attorneys General in Colorado, New Hampshire, and Tennessee are asking to hear your story. Your testimonies can help ensure that social media platforms are designed safely for kids. For more information, please visit the respective state links: Colorado, New Hampshire, Tennessee.
2) Social Media Victims Law Center - A legal center that was founded in 2021 in response to the testimony of Facebook whistleblower Frances Haugen.
3) Resources for Parents & Educators - Overwhelmed by our broken social media environment and wondering where to start? Check out our Youth Toolkit plus three actions you can take today.
4) The Social Dilemma - Learn how the system works. Watch and share The Social Dilemma with people you care about.

RECOMMENDED YUA EPISODES
Transcending the Internet Hate Game with Dylan Marron
A Conversation with Facebook Whistleblower Frances Haugen
Behind the Curtain on The Social Dilemma with Jeff Orlowski-Yang and Larissa Rhodes

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
In May of this year, when OpenAI's CEO, Sam Altman, visited Washington to testify before Congress about AI,
I noticed that lawmakers kept connecting the modern threats of AI back to social media.
Congress failed to meet the moment on social media.
Now we have the obligation to do it on AI.
We acted too slowly with social media.
Many unfortunate decisions got locked in with lasting consequence.
So I'm deeply concerned about repeating social media's failure in AI tools and applications.
Congress thus far has demonstrably failed to responsibly enact meaningful regulation of social media
companies.
These lawmakers are right to make this connection.
Because social media was humanity's first contact with AI, a curation AI that's just
selecting which post or tweet or video to show you next when you flick your finger up on
the news feed.
So if we're going to create laws that are strong enough to prevent second contact with AI from
destroying our societies, we could benefit from looking at the major lawsuits against social media platforms that are playing out in our courts right now.
So today on Your Undivided Attention, we have Laura Marquez-Garrett, who's an attorney at the
Social Media Victims Law Center, the non-profit legal center that was founded in 2021 in response
to the testimony of our friend and Facebook whistleblower Frances Haugen.
And Laura is literally on the front lines of the battle to hold social media companies accountable
for the harms that they've created in young people's lives and the families that they've
impacted for the past decade. But first, a quick content warning.
Laura and I will be touching on some distressing examples of what happens with social media affecting kids and parents. And I just wanted to pause for a moment to say that if that's just not what you're feeling today, please feel free to leave us, and you can join us next time. And with that, here we go.
Laura, welcome to Your Undivided Attention. Thank you very much. How did you first get involved
in this work? In terms of how we got started, so in the end of 2021, you have the Facebook whistleblower
who comes forward. And our founder, Matthew Bergman, he's a 30-year products liability attorney out
of Washington. He sees the revelations that are coming forward. And with his experience as a products liability attorney, I think that light bulb sort of went on of, hey, wait a sec. This is not third-party
speech. I will say personally, when this happens at the end of 2021, I am a partner at a very
prestigious law firm in Seattle. I have recently gotten the corner office that is very coveted, of course,
and big law. And I love my job. I love my colleagues. I am specializing in electronic discovery,
as well as forensic and fraud investigations. And through a series of circumstances, I am introduced to
Matt Bergman, and I read his first two complaints. And one of those was about Selena Rodriguez.
She was nine when she began using social media right as the pandemic started. She was 11 when she died by
suicide and posted that video on Snapchat. And then Matt said, hey, you need to watch Social Dilemma.
It was January 26, I believe, or 22nd. I watched Social Dilemma. I took notes, which I don't typically
do when I watch movies. And at the end of it, I remember they asked some of these designers,
do you let your kids use these products? And every one of them said, no, not a chance.
So as a 20-year defense-oriented litigator, it sort of clicks for me. You know, I've worked with
manufacturers, distributors, when you work for Nike or Nordstrom, their kids are wearing Nike
and Nordstrom clothing. To have a group of people that are creating a product that they are
marketing to and targeting to children, but will not let their own children get near.
In my mind, that is absolutely something that as a society, as a legal system, we not only have
the ability, but the obligation to address. And as a mother of small children, watching The Social Dilemma, seeing that complaint, I turned to my wife and said, hey, we have to do this. It was
unexpected to say the least. It was the most extreme career move I'd ever made, and I don't regret it
for one second. Laura, it's amazing to me to hear that the film could have impacted you that way
and that it would have actually caused your life to move in this direction. So you just mentioned
Section 230 of the Communications Decency Act and the platform's responsibility for third-party content,
at least here in the U.S.
But the cases that you've been working on take on a different focus
by centering on product liability
and proving that these platforms are defective products under the law.
Can you explain that transition?
Sure.
So historically, what you have is this Section 230
of the Communications Decency Act,
which I know people hear a lot about,
and 230 deals with third-party speech.
And it essentially says that these social media companies,
internet companies, cannot be held liable
for what third parties are saying and doing on their platforms.
But to your point, what I think people started to realize with The Social Dilemma,
and then, of course, in, was it September, October of 2021,
when you had a Facebook whistleblower who came forward with documents,
what people began to realize is, hey, this is not about third party speech.
I mean, this is not simply a case of somebody posting something
that a child has gone online and found and has been harmed by.
These are systemic issues that go to the addictive nature of these products, which are designed to be addictive.
It goes to the tools and technologies that these companies are using to increase engagement,
which often includes targeting harmful content, extreme content, at some of the most vulnerable users,
not because the users are asking to see that or not because they want to see it,
but because the companies and the AI and the technologies they're developing know that it will be hard for the user to look away. So really that's where the lawsuits that are going on right now against these
companies come from. They come from that recognition of these are not third-party speech
situations. These are products liability, right? These are companies that are knowingly harming
users, especially younger users, more vulnerable users. And they're doing it on purpose, and they're
doing it by design. So obviously, one of the key arguments that the tech companies will use
is that we're not the ones responsible for people seeking out anorexia content
or ads for fentanyl or cyberbullying content or these kinds of things.
We're just a neutral platform.
And then they have this huge shield that we gave them in 1996 called the Communications Decency Act
when we basically made platforms immune to any kind of liability for what content shows up.
But what we were talking about is a different approach.
So the key phrase that you used was seeking out, right?
Which is they're going to say, hey, these are neutral tools.
These are people seeking out this content.
What we found in our work at the Social Media Victims Law Center is these kids, many, if not most of these kids, they're not seeking this content out.
You have kids saying, look, I was 12.
I opened a Snapchat account because I wanted to send silly photos to my friends.
It looked like fun.
Everyone was doing it.
And as I opened this account at 12 years old, I'd never heard of drugs.
And all of a sudden, they're getting content glorifying drugs on their feeds.
They're getting connected to drug dealers, predators.
And so it goes to that issue of seeking it out.
Can you explain for people who are not familiar with the law or these cases,
what is the burden of proof in this kind of law?
How does that work?
Yeah, so it varies from case to case.
It varies on the circumstances.
And the reason that's a hard question to answer to some degree
is that what's very unusual and unique about this circumstance
is you have a product where parts of the technologies,
parts of the algorithms are functioning in real time.
So it's almost like a human brain making these decisions. These companies, to the best of our knowledge, are not retaining all of the data.
They're certainly not retaining it for a very long period of time
when it comes to the decisions their products are making.
So part of what's going to be different than what we've seen historically
is that there's going to be pieces that we can't just simply say,
hey, show me what you did here with this child six months ago.
Now, that's said with the evidence that's already out in the public record
due to the whistleblower, due to other whistleblowers coming forward,
due to individuals such as yourself who've had the courage to stand up and say, hey, this is what was
really happening behind closed doors. We can piece some of that together. And part of what is unique here
is you have an industry that was not regulated that was being operated under the, not just the belief,
but the absolute conviction that nobody will ever get discovery. We are immune because of Section 230.
And in essence, and they're wrong, by the way, but in essence, I think the operating theory was,
we can do whatever we want, as long as we are a computer service provider,
internet service provider, no one can touch us.
And that's simply not true.
That's not what 230 was designed to do.
It's not what it says on its face.
And we are tackling those issues right now in court.
But when it comes to the discovery piece, I'll give you an example without too many
specifics, you have a company like TikTok or Meta or Snap who can make statements in court that say, hey, my product works this way, when in fact it doesn't.
In a case that came out recently, I believe it's called L.W. versus Snap, you have got a statement that Snap has made to the court along with a supporting declaration that says, in order for Quick Add, which is the sort of user connection algorithm they use, in order for Quick Add to work, the other user must be in your contacts.
And they use the word must, and italicize it, right?
They must be in your contacts, or you must have mutual friends on Snap,
but I can tell you that we have kids all the time
who are opening accounts.
We even have kids who open accounts
on brand new phones
without a single contact
without a single Snapchat friend
and they get Quick Adds.
And by and large,
so I've spoken to dozens, if not hundreds of kids.
Every young girl I've spoken to
has said that the moment they open a Snapchat account,
they get a high number of predatory users.
So they get dick pics,
they get exploitative content.
It's almost like they've said it's crazy
because Snap says they don't have public profiles,
but I feel more exposed sometimes on Snapchat than I do on Instagram.
I have yet to find an adult woman who has been targeted that way.
And you have to ask yourself, how is that possible?
Especially when some of these young girls are saying that they're 18, 20 years old,
how is it possible that they are having an entirely different experience than these women?
It goes back to you have a company like Snap that feels confident.
I can put something like this in the public record.
And who's to say that I'm wrong?
Right? Nobody knows how my product works. And if I want it to, I can make it change tomorrow. I can make it work a different way.
So so far we're dealing with cases of harm from a particular app. But underlying all these harms is just the deeper issue of the totally addictive nature of all of social media.
And some people might say that the solution is just to grab their kid's phone away from them. So I'm wondering what you're seeing when parents try to remove the phone.
We actually have a number of kids that when parents take their social media access away, they die by suicide.
And I thought that was shocking as well until I spoke to a doctor in Orange County
who independently said, you know, we no longer tell parents to take social media away.
That used to be the prevailing theory was, hey, my kid is addicted to social media,
take it away, don't let them have it.
And then what she said was then we started seeing too many cases of suicides that followed
the taking of the device.
And we realized that it could be that this was a loss trigger, that the dependence on these products is so strong that there are children that, when you take it away, they are so dependent on it, it is such a part of their identity, that they don't see any option but death. And I can tell
you we have at least three or four that just immediately come to mind of children who had no
symptoms, no signs, other than that dependency on these products when their parents took the phone
away, they died. And in one case, the young woman actually wrote a suicide note that said,
you shouldn't have taken my phone away.
That is the level of addiction that these products are causing.
Now, you know, these companies may say, oh, well, that's a minority of the users.
But the question of the society is, how many children are we willing to lose to something
that doesn't have to be addictive?
Yeah.
Doesn't have to exist.
I mean, for how many of those kids would people push back and say, well, were those kids already predisposed to suicidal thoughts, or were they already depressed, or something like that? Because I think, what you're trying, I imagine for you, you have to prove the case that there's these kids that are fine, and then it's only the introduction of Snapchat that, we know, first of all, brings the child predators into their lives and the dick pics and all of this stuff. But then the next step of, then it's so addictive that they commit suicide when it's not available. Yeah, so there's both, right? There's what you would call
in law an eggshell skull plaintiff. And that theory is if you go out to a bar and you get in a
fight, you punch someone. And it turns out they have a condition that makes their skull like an eggshell and they die, you're not off the hook. You're not off the hook for killing them. You shouldn't have been hitting them in the head in the first place.
So for the kids that do have some pre-existing issues, these products are absolutely making those
worse, and that's not okay. But for the most part, what you described is what we see.
Again, it's patterns, and I'm a big pattern person. I have spoken to hundreds of families,
and you always see that same pattern. And what's also striking is that that pattern that we see
when parents are telling their child's story, it is the same whether the child was 10 when they
started using, 12, 14, 16. You can't blame this on puberty. I've seen kids that are 10 years old
that have no issues, and they begin using social media, and you see specific harms begin to
occur within a matter of days, weeks, sometimes months. So we've known about the harms caused by
social media for many, many years now, and the big companies have managed to avoid regulation
by first saying, hey, we're going to self-regulate. But it hasn't worked.
So how does litigation change the calculus for these companies?
Litigation is the last resort.
And it's a last resort for a lot of these parents.
It is something that arises when the companies don't fix things on their own.
And we've seen that.
And for many of us, I mean, look, when I started this in February of 2022,
I had no idea what was happening.
My kids are not old enough that it had not become an issue for me yet.
I was shocked to learn about the harms these companies were causing.
And often you're in this situation where you don't have regulation in place, where they're absolutely not self-regulating.
And sorry, that's the craziest thing I've heard
that a company that is making billions
and is already hurting people on purpose,
or at least knowingly,
is going to sort of self-regulate and change its behavior.
The reason litigation is effective
is because we have to make it more expensive
for these companies to kill children than to not kill children.
Right now, it is a calculus.
And so that is the point of litigation, is that we make it cost too much for them to keep hurting people.
This is what Harvard philosopher Michael Sandel calls the difference between a fee and a fine.
A fee is just the cost of doing business.
A fine says this is a moral problem.
This is morally wrong.
And the problem is in a values-blind market-based economy where things are just dollar signs.
And to corporations, which only see in the language of cost and dollar signs,
if you don't get publicly shunned or shamed for causing these harms,
then, like you said, it's just a cost of doing business.
And it's cheaper to have those people die, those dead kids, and to deal with the cost of those court cases than it is to actually deal with the problem in a moral sense.
It's very sad.
Yeah. Yeah.
And then here you had Section 230,
which for these folks, they thought meant we can do whatever we want
and not be held accountable.
Okay, so Laura, just to make this moment more vivid for our listeners, could you walk us through the Meta lawsuits around eating disorders and, for example, the case of Alexis Spence? Sure. So Meta has feed products. I think it's called Explore. Through Explore, if you have a Meta account, they will send you things that they think
might be of interest to you. So what we often see in the case of young girls in particular
is that young girls will get an Instagram account. In the case of Alexis Spence, she really
wanted it for Webkinz, which is a stuffed animal that has an online component. You can
like create, I don't know, habitats, friends, whatever. So she opens an Instagram account without
her parents knowing, using, I think it started with, an iPad or something that her parents had no idea about. She can get on there. When she gets older, she has a phone and they put parental control
software on there, but she goes to school, gets on a friend's account and can learn on Instagram
how to bypass those controls. So she's able to sneak Instagram on to her
phone, hide it with a third-party app. She's on Instagram for Webkinz, as well as, like,
TV shows that she really likes. And what begins to happen without her driving this is that
Meta, Instagram, starts sending her very skinny models, disordered eating content, what is called, you know, thinspo, and it's pro-anorexia. And before this child, which is what she is, knows it, she's locked into this stuff. She's got this in her head of, I need to be skinny or I won't be popular. She's also got Meta connecting her
into pro-eating disorder groups where she is being put into chat rooms with adult users
who are encouraging her. So again, this is not stuff she ever started looking for. This is what
meta decided. This is what's going to engage this kid. It goes down that trail of self-harm,
body dysmorphia. In her complaint, one of the things that we included that is so meaningful is
this drawing she did of herself. And she goes, and in her journals, you can see she goes from
this happy, easygoing kid to somebody that thinks that she is a terrible person. And she's
hiding it from her parents. She's lying to keep up her Instagram habit. And that is putting a rift
between her and her family. And thankfully, she's still here today. But there was a time when
her parents didn't know if she would be here tomorrow.
So I want to talk about one of the cases against TikTok.
Can you tell me about the case of Chase Nasca?
Sure, and we have another one that's very similar that was recently filed.
Two young men, Chase Nasca and Mason Edens.
Both cases involved a sort of normal teenage breakup situation.
For Chase Nasca, I think it was around October of 2021. He goes through a breakup, we believe, and his TikTok turns dark very quickly. The algorithm shifts, not because he's looking for suicide content.
In fact, you can see it in his searches: he's searching for Spider-Man and workout tips. And
on one occasion, he searches for motivational speeches. And despite these searches, what he starts
getting is just a constant stream of go-kill-yourself content. And I don't use those words
lightly. The content itself is so extreme. It is beyond words.
It is often very violent.
It is yelling at these kids saying things like, you know, nobody loves you.
Your life is meaningless.
You should just go kill yourself. It's advocating violence against others.
And in particular, one of the themes he began getting in early 2022 was suicide by moving train.
One night in February, Chase hugs and kisses his mom and says, hey, I'm going to the gym to work out like he usually does.
He goes to the gym.
He works out, right?
I mean, this is not a kid that you would have thought was about to die.
And he leaves the gym, stops at the unfenced portion of track, stands on the tracks, and
texts his best friend, and says, I'm tired, I can't do it anymore.
Mason Edens, same story.
He died in November of 2022.
About two weeks before his death, he went through a breakup, and his TikTok algorithm turned.
It was advocating suicide by a particular means,
which in fact is what happens in November of 22.
His mother was in the home, tries to resuscitate him with no success.
So I just want to pause here for a second
because obviously this is deeply confronting.
I mean, real people are losing their lives
because of design decisions made by social media companies.
As you're saying these words,
I also think about the content moderators that are working in places like the Philippines
or certain places in Africa where literally their job is to stare face-to-face at this
horrific content that gets flagged for the system as the worst of the worst and then look at
it one by one all day long.
You know, and this goes back to when we talk about this is not content that these kids are
seeking out, right?
I mean, their searches are for uplifting speeches.
And, you know, people think, well, hey, why don't they turn it off or why, you know, they
don't need to look at this stuff. And just to be clear, I always tell people, look, you can post
whatever you want on the internet. We're not suing these companies because of what somebody's posting.
What they don't get to do is take the most harmful, most extreme content and target it at kids.
That's what these companies are doing. These stories that we're talking about, these things that have
happened to real children who are gone as a result. These are not children that were looking for
this content. And in Chase's case and Mason's case, you can see that from their search history.
They didn't want it. They didn't ask for it. Those are choices that TikTok made for them.
And I'm reminded of, I mean, literally, I have been working in the space for such a long time, and it's so frustrating to hear these examples over and over again.
I just go back to dinner table conversations I had in Palo Alto, at restaurants with friends who worked at Facebook, who said, we're giving people what they want.
And the difference between we're giving people what they want and we're showing people what they can't help but look at.
If I think about even someone who has clicked on the photos of a boyfriend or a girlfriend,
and then that person becomes an ex-boyfriend or an ex-girlfriend,
well, there you are.
You've broken up with the person.
And then for months, the algorithm will just show you more and more and more of the person
that you can't help but look at.
And that's a more maybe potentially relatable experience for people who haven't gone
through the most extreme case.
But there's a philosophical mistake in using the word like.
Sometimes I've even said to people that just the fact that it's called the like button,
and then having engineers at Facebook or Snapchat say,
well, this many people liked that content.
It's ontologically framing how we are talking about
and naming what is going on.
And this is all going on in this big abstract virtual universe
of billions and trillions of clicks that are happening
every single day, an hour.
And it's just so sad that such an obvious philosophical mistake is resulting in the end of human lives, of young, vulnerable human lives, and that we are allowing any of this to continue given the obviousness of the problem.
It just makes me furious,
and I wish that by listening to this podcast,
tens of millions of parents and kids sign up for the cases that you are working on,
because as I've said at the beginning of this year,
I think there should be, just like there was a moms
against drunk driving,
I think there should be a moms against media addiction movement.
Yes.
And it needs to be a powerful, ongoing political force
that stops this once and for all.
I mean, this is just the most obvious problem
that has no reason to continue,
except to make money for people who are already billionaires.
Right. And just one other thing for parents that are listening: I tell parents often, look, if you're going to let your kid use TikTok, check it every other day. Watch the feed for five minutes, right? It's the algorithm; when it turns, it turns fast. And if you do that and you're comfortable with the content, so be it. But if you do that and your kid doesn't give you access, you have another issue. And of course, in many cases, there's just no way for parents to control the accounts their kids are opening, how many they're opening, the devices they're using them on, which oftentimes are not even ones their parents are providing.
All right.
So finally, I want to talk about a case that's notably different from the other ones,
which is the Snapchat fentanyl cases.
Can you talk me through some of those?
Yeah, so we have these parents that have been shouting for years
that something is going on on Snapchat.
We have the fentanyl crisis, which many people are familiar with,
but what many people don't realize is what is different about this drug crisis: for the first time in our country's history, it's not just the rate of overdose deaths among adults that has gone up. It's also the rate among kids aged 13 to 18.
And the other issue that people are not aware of by and large is that based on what we've
seen, based on investigation we've done, based on families we've spoken with, we believe that
over 70%, in fact probably closer to 90-plus percent, of those teens aged 13 to 18 who are purchasing counterfeit drugs through social media that turn out to be fentanyl, and dying of fentanyl poisoning, are getting the drugs through Snapchat.
So basically you've got one social media app that is being used more than every other social media app out there
to sell deadly counterfeit drugs to unsuspecting children who then die of fentanyl poisoning.
When you have one app that is doing it, right, there's something different.
There's something different about Snap.
Now, there's the obvious disappearing messages, but I'll tell you, it's not the ephemeral messaging on its face that's a problem. Snap could operate its product so that the messages, like you and I send each other, disappear without affirmatively deleting that data on the back end. So it is Snap's policies and systems which are set up to actually destroy all of that data. One product feature I just learned
about the other day that was shocking is there's actually a way within Snap: if you and I are chatting on Snap and you have saved messages, I can actually go into your account and delete those
out of your account, through my account.
And in two of the cases, actually, that we have pending right now, we have a scenario, and in one of them, the family actually saw the dealer doing just that.
They went in, a sibling said, hey, you just, you killed my sister.
And then as they're watching the screen, saved messages out of this kid's account start disappearing.
Now, they had the foresight to take photos, so they got some of that.
But these are product features that, to our knowledge, no other social media product has,
nor do they do that on the back end.
You've got Snap Maps.
Dealers will verify your identity by checking out where you are on Snap Maps or live video.
You have My Eyes Only, which is a data vault within Snapchat itself, where you have a PIN that nobody can access. So that you're the only one with the PIN; police can't get it, Snap can't get it. If you lose your PIN, or if you're a 12-year-old kid and refuse to tell your parents, that data effectively gets destroyed. Now, it doesn't affirmatively get destroyed, but nobody can access it. Only the person with the PIN can access it.
It's how the product is operating, and many of these most dangerous dealers out there know:
if I don't want to get caught, I go to Snap.
I find these kids on Snapchat.
I sell on Snapchat because the data is deleted on the back end.
Thousands of kids.
Thousands of kids have died from this.
Yep, that's correct.
And in the case of these two survivors that I told you about,
these are kids that not only was it a Quick Add that gave the drug dealer access to the child,
but both of these children have said,
I never even looked for drugs on Snapchat.
That's what my feed started showing me without me asking.
So the whole thing is basically Snapchat grooming these children for engagement purposes.
And if they happen to be exposed to dangerous stuff, as far as Snap's concerned, so be it.
In every one of these cases that we've brought, these are children who thought they were buying.
I mean, we have a kid with a shoulder injury and another one with teeth issues.
And because of COVID, couldn't get into the dentist.
They go onto Snapchat, which is an app they know, they trust. They don't think of it as dangerous. It's Snapchat. I mean, come on. They advertise with cartoons.
And a dealer says, hey, here's a Percocet for 10 bucks. Great. And they buy a Percocet. And the Percocet turns out to be pure fentanyl or laced with fentanyl. The amount differs. We have some
of these kids where toxicology reports have shown there was enough fentanyl in that child's system
to kill eight grown men. And it's not just prescription drugs. Percocet, oxies. We have a young man
who believed he was buying a marijuana edible.
And actually just earlier today,
I was talking with someone about the possibility
now that they are putting fentanyl in vaping pens.
And I will tell you that the sale of vaping supplies
is another common pattern we see with SNAP
to the point where I typically ask
every family whose child has died from fentanyl poisoning,
hey, did your child start smoking marijuana and vaping at some point?
Because vaping advertisements on Snap are pretty common.
And the pattern we see often with the younger kids
is it starts with marijuana and vaping, and it slowly progresses over time to Snap pushing the harder drug content and/or the connections that lead to that.
So how has Snapchat been responding to legal complaints
that its algorithm caused deaths?
What's so fascinating about these cases is in early 2021,
Snap actually meets with a group of parents
who've lost their children to fentanyl poisoning.
I think it was April of 2021.
Snap executives invite them to a meeting
so that they can tell them about all the stuff
that Snap says it's going to do to prevent this going forward.
And according to these parents, the first thing that comes out of Snap executive Jennifer Stout's mouth is, you can't sue us because of Section 230. And this is all detailed in our pleadings, but the level of cover-up that we have seen with Snap on this issue is significant. Just what Snap knew and when it knew it, how its product was being used and why. Police saying that Snap is not being cooperative. It's crazy. Yeah. You know, I think
about all of the retirement accounts that happily are putting some of their funds into all these
tech companies and they're watching their 401ks and their retirement accounts go up, literally
profiting off of the harm to our own children. And it is just this screwed up dynamic where
we would even allow something like Snapchat to exist in the first place. I mean, it is so clear
to me after a decade of meeting people like you who have met all of these moms and heard these stories. I have just met too many people, thousands and thousands of people. I've stared into their eyes like you.
And it's disgusting. These things should be shut down. There was no problem that the world had
before Snapchat existed that Snapchat solved. It is just a parasite on society. And parents who
have this in their 401k, they don't know that Snapchat is doing all this damage. It's the results
of a values-blind economy that has a narrow optimization function of profit and GDP and stock
market going up that does not see externalities that are showing up in the invisible
landscape of their children's minds.
And that landscape is resulting in
dead children.
You know,
we're using a
medieval institution called litigation
to deal with 21st century technology issues
called social media amplification,
virality, Quick Add. There's all these things
that the founding fathers with free speech
did not anticipate and did not mean for free
speech to protect. And
that the 1996 Communications
Decency Act did not foresee fentanyl and personalized drug dealers and Snapchat maps and a hundred things that we would continue
to invent. So I'm just curious when you think about litigation as the tool that it is, what do you
see as the limitations of litigation as an approach and what do you see as what might be needed
to upgrade that to a 21st century model? Yes, the limitation. I mean, I think you've just
pointed it out, right, is with us, we're working with families whose children have been harmed.
And we are focused on harm to children because within the existing jurisprudence, we
have ways to show, look, these are vulnerable individuals, these are children, right? As a society, when our children are dying, the hope would be that everyone will notice and will ask
questions and we'll say, no, now we need answers. And hopefully if we can get to the truth,
that way, if we can find out what these companies are doing, how they're doing it, we can broaden
it. But honestly, I don't know how we address that as a society, simply because of these products and how they're built and the fact that they're acting in real time. Regulation is a good answer.
I mean, I know these companies have been saying for years,
oh, sure, just give us regulation, tell us what you want to do,
which is funny because they are fighting regulation on every front.
So, Laura, you dropped everything to get involved in this
because you fundamentally saw this need out in the world
and you looked at your specific skill sets to contribute.
What advice would you give to others that are frustrated about these issues
and might feel powerless and they don't know where to start?
I dropped everything because I'm an overprotective mom.
And what I would say to others is be an overprotective mom.
Ask questions, push, write letters, hold up signs, put things on social media, protect your kids, tell your school, tell your teachers, tell your school districts, tell your governors, mayors, senators.
You know, you mentioned mothers against drunk driving earlier.
These companies have more influence in D.C., more resources, more technology than anything we've ever seen.
we have got to get loud
and we have got to get people speaking up.
And part of the problem is that
what these companies have done for years
is try to put it back on parents and say,
well, it's your fault if you can't monitor it.
I mean, that's insane.
We see parents all the time
who never even gave their kids a phone.
But it shouldn't matter, right?
They're hurting kids.
We need to speak up.
Laura, thank you so much
for coming on Your Undivided Attention.
Thank you very much, Tristan.
Your Undivided Attention is produced by the Center for Humane Technology,
a non-profit working to catalyze a humane future.
Our senior producer is Julia Scott.
Kirsten McMurray and Sarah McRae are our associate producers.
Sasha Fegan is our managing editor.
Mia Lobel is our consulting producer.
Mixing on this episode by Jeff Sudaken,
original music and sound design by Ryan and Hayes Holiday.
And a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
Do you have questions for us?
You can always drop us a voice note at humanetech.com slash askus, and we just might answer them in an upcoming episode.
A very special thanks to our generous supporters who make this entire podcast possible,
and if you would like to join them, you can visit humanetech.com slash donate.
You can find show notes, transcripts, and much more at humanetech.com.
And if you made it all the way here, let me give one more thank you to you
for giving us your undivided attention.
