Front Burner - Facebook whistleblower on school boards’ social media lawsuits
Episode Date: April 3, 2024

In a Canadian first, four Ontario school boards are taking the companies behind Instagram, Facebook, Snapchat and TikTok to court, alleging the platforms are knowingly harming students and disrupting the ways schools operate. The claims haven't been proven in court, and all three companies say they do their best to keep young people safe online.

Our guest today has been speaking for years about the kinds of issues raised in the Ontario school board lawsuits. In 2021, Frances Haugen quit Facebook, took tens of thousands of internal documents and leaked them. She later testified to the U.S. Congress and alleged the company's products were harming children.

Today, we've got Haugen on the podcast to discuss the Ontario school board lawsuits, the harms she believes these companies are causing to children, and what she thinks should be done about it.

A previous version of this episode included an anecdote about a boy who was bullied and later took his own life after videos of his bullying were posted online. That anecdote has been removed. In fact, the boy was murdered by two other boys, in an attack that investigators say was planned on social media and was triggered by an online conflict in a chat group.
Transcript
This is a CBC Podcast.
Hi, I'm Jamie Poisson.
In a Canadian first, four of the country's largest school boards, three in the Toronto area and one in Ottawa,
are suing the companies behind Instagram, Facebook, Snapchat and TikTok, seeking billions in damages. The four Ontario school boards allege that Meta, Snap, and ByteDance have knowingly created products that are addictive
and are intentionally marketing them to kids. The decline in mental health, the anxiety of students,
these have skyrocketed in recent years. And we now know that this is directly linked to products
that have been knowingly doing things that are harmful to children.
The claims have yet to be proven in court,
and all three companies say they do their best to keep young people safe online.
Frances Haugen has been speaking out for years
about the kinds of issues that are in these Ontario school board lawsuits.
In 2021, she quit her job as a product manager at Facebook and took tens of thousands of internal documents with her,
leaking them first to news outlets and then testifying about them to U.S. Congress.
But I'm here today because I believe Facebook's products harm children, stoke division, and weaken our democracy.
The company's leadership knows how to make Facebook and Instagram safer, but won't make
the necessary changes because they have put their astronomical profits before people.
Haugen is the author of The Power of One, How I Found the Strength to Tell the Truth and Why
I Blew the Whistle on Facebook. And she is also now a senior
fellow in residence at McGill University's Centre for Media, Technology and Democracy.
Today I'm speaking to her about the issues raised by the Ontario lawsuits, the harm she believes
these companies are causing to children, and what she thinks can be done about it.
Frances, hi. Thank you so much for coming on to Front Burner.
Happy to be here.
Before we get into these lawsuits, I do want to spend some time talking about your connection to this specific issue of social media's impacts on young people.
So let's go back to something some of our listeners may remember when you blew the whistle on Facebook back in 2021.
You exposed this large body of internal research about how the company's Instagram app harms teenage girls.
And can you tell me more about that?
Back in 2021, I faced a very hard choice. I was working inside the company in a safety role, and I saw over and over again that the work that was being done inside the company, or the effects of the products, were being mischaracterized.
So, for example, Facebook would come out over and over again and say, we really care about your kids. We're doing
all these things to try to help them, but then they wouldn't be actually, you know, assessing
those interventions based on whether or not kids said they felt helped. You know, it was all marketing messages. And sometimes that had really serious consequences. 44 U.S. states have sued Meta alleging product safety violations, where Facebook knew that they were harming children, for example, by sending them notifications during the school day even though kids said, these make me feel really anxious.
Because when they didn't send them notifications during the school day, usage of the product went down a little bit, right? Like that was the internal culture I was living in.
And I felt that the public didn't have the information it needed to keep itself safe.
And Facebook had an internal culture that was not going to be able to heal itself
unless the public helped. And as a result, I came forward with some of the first hard evidence
that Facebook knew it was harming kids and was misrepresenting its products publicly.
Facebook accused Haugen of mischaracterizing the research and downplayed concerns about young users having a negative social media experience.
These were teens who were already struggling with mental health issues.
And on all of the issues, the majority of boys and girls said that Instagram either makes things better or doesn't have a material impact on their experience.
I want to ask you about some of the information that you exposed.
One of the most alarming findings was a connection some users made between suicidal thoughts and being on Instagram, right?
Can you tell me more about that?
So one of the most dangerous things about these products for kids is that the algorithms
are very finely tuned to try to pick up on even slight interests you have and continue showing
you more content about those interests. If they believe it'll keep you on the product longer,
if it'll keep you engaged with the product, that means like sending other people comments and likes
and those things. Unfortunately, some of the most, we'll call it sticky, content, content where if you have a slight indication for it you can end up developing, say, a preoccupation, is things like self-harm and eating disorder content. And many independent researchers have shown since then that you can have very small interests in things like, you know, healthy eating or slightly I'm-feeling-a-little-blue topics, and the algorithm will begin showing you
things like people committing acts of self-harm. And I think the issue is particularly dire
internationally. So one of the most shocking things that I learned after I came forward
was that when people speak smaller languages, for example, Norwegian and Danish, because the companies don't rebuild these systems language by language, there was an active cluster of young adults glorifying ending their own lives, freely operating on the platform. And even though journalists reported them to the company, likely because it didn't have any staff responsible for self-harm content in Norwegian, those clusters are still operating today. There are freely operating accounts in that cluster, reported two years ago, that are still up there.
Wow. I'll just note here, on this issue, Meta did recently tell The Guardian that they take the issue of suicide and self-harm very seriously and will hide content that discusses suicide and self-harm from teens, even if it's shared by someone that they follow. After you went public with these findings, you've continued this work, right?
You've written a book that you released last year, The Power of One.
And I want to just continue drawing on the research that we've seen since, right?
So I just want to read one point from the statements of claim, quote: "... negatively impacts the student population by causing maladaptive brain development, compulsive use, disrupted sleep patterns, behavioral dysregulation, learning and attention impairment, and other serious issues that impact the school learning and teaching climate." And can we just break down the evidence around some of these allegations, maybe starting with maladaptive brain development and behavioral dysregulation? Do we have evidence that these platforms are causing
those problems in teens? You know, it's interesting. A lot of the problems that we're now discussing in
the context of schools, we've seen these things play out in situations of, say, societal violence for at least a decade.
So I'll give you an example.
One of the things that has been called out by these school lawsuits is that kids will go and pick fights and film them so that they can post them on Instagram.
When I first went back to the high school that I went to a while ago, I'm old in teenage years,
the principal said to me, the number one thing I spend disciplinary time on is Instagram. And I just stared at him. I don't have a high schooler. I don't have the privilege of interacting
with high schoolers or middle schoolers. Shocking fact, most employees at Facebook do not have a
child that is older than
10 years old, right? They don't have experience with these things either. It's a very young field,
you know, a young person's game in terms of who works on these things. But he said, yeah,
the number one thing I spend disciplinary time on is Instagram. And I was like, what do you mean?
He goes, well, there's this anonymous fight club account for our high school and kids
will go and pick fights with other kids and post the fights on Instagram.
And I have to go find who the victims were because they're too embarrassed to come forward.
Like I have to go find the kids who picked the fights.
And I was like, did you tell Facebook about this?
And he's like, I have repeatedly asked them to take this account down and nothing happens.
And so I think a core part of the complaint is that a lot of the things that are happening
on these platforms, they're not new.
You know, we've heard about things like the devious licks trend where kids were destroying
school bathrooms and posting it on TikTok.
I think it's like pretty funny.
That's dumb,
but I guess they, to all their own, I guess. Kids are stealing everything from urinals
to soap dispensers, some even trying to take the entire sink. These kids are apparently stealing
anything, even entire water fountains. And so the question is, you know, what are the obligations that these platforms have to our children?
And I think some of the ways of fixing them are really simple.
You know, it's things like saying these apps should probably not function during the school day.
At a minimum, you shouldn't send notifications during the school day.
But these choices, when left to privately held for-profit companies, they're always going to be tuning and optimizing for more usage and more profit.
Let's talk about sleep too, because that was an interesting one that I saw in the statements of claim.
And so, you know, what are they talking about when they say that there are big issues around sleep?
So for context for your listeners, the American Academy of Pediatrics talks about how sleep deprivation is one of the largest health dangers for kids.
When a child is sleep deprived, it puts them at higher risk for mental health issues, not just depression or anxiety, but more serious things like bipolar disorder or
schizophrenia. It increases the chance they'll abuse substances, downers because they're depressed,
uppers because they're tired. It puts them at higher risk of accidents. It decreases their
ability to learn in school, things that will stay with them for the rest of their lives.
It literally changes how their brains develop.
And yet right now, if these companies were forced to publish the numbers on how many kids are on their applications late at night, I think we'd be shocked.
In the U.S. right now, 30% of high schoolers report being on screens till midnight or later most school nights. If 30% are on till midnight, easily 10% are on till two. When the companies look at their success statistics, a child that's on till 2 a.m. versus a child that's on till midnight is a more successful
user because they look at more content, they click on more ads. And unfortunately, the people
who pay for that extra profit are our children, in the learning deficits they acquire, in the stolen chance to grow and be
healthy that they would have gotten if they'd been able to fall asleep at a reasonable hour each
night. And over the last two years, I've been forced to really confront the fact that I worked on these tools
for years without ever really thinking about the needs of kids. And so if people who are
conscientious can still have blind spots, we need the help of different views. And that's what these
lawsuits are about. This is what the Canadian Online Harms Bill, which was recently introduced, is about.
The idea that we will build better products that we're proud to use if we can get more eyes on the problem, more conversations on how to solve these things, instead of just
relying on a few people sitting in Silicon Valley who are probably under the age of 30
designing the telecommunications infrastructure for the world.
Together, they teach and care for more than half a million students.
Now, four of Canada's biggest school boards are going after the social media giants,
each launching its own lawsuit against Snapchat, TikTok, Instagram, and Facebook to the tune of $4.5 billion. Among their allegations...
The lawsuits really underline this idea that the three companies, the owners of Instagram, Facebook, TikTok and Snapchat, deliberately target schools and students. And so one example that they give is Snap's CEO. Back in 2012, he was celebrating the fact that most early users
were high school students who were using Snapchat as like a way to pass notes in class. So using it
secretly during class time. And just from your experience and your research, how important is
this demographic to these companies? How hard do they work to target them? So one of the kind of
shocking revelations that came out of the state attorneys general lawsuits in the United States was that Meta was tracking the number of under-13-year-old users for a long
period of time. Even though they were saying, we don't have anyone under the age of 13 on our
products, they cared enough about the demographic because it is the future. Whatever platform kids
start using first, it's usually the one where their peer group gets critical mass. So the most important phenomenon for understanding social networks is something called network effects. A network effect is when the more people use a single product or service, the more valuable it is for
everyone else. Think of all the times you've been frustrated with Meta or with another product that your friends use.
But the reason why you keep using it is because getting all your friends to switch at the same time would be too hard.
In the case of children, getting children onto these products early means you can secure the next generation.
And this is one of the reasons why we need public action. Because right now, if you're a company trying to do the right thing, saying things like, we don't operate during school hours, we have a little pause in the middle of the day for school, and another company doesn't do that, kids will migrate onto the other product.
And so it's one of these things where the companies know if they can't get young
users, this is why young kids don't use Facebook, right? They didn't use it as their default
platform when they were 15, 14. And now Facebook is for older people. If you don't have young
users, you don't have the future. Which app do you think is winning that right now of these four?
So TikTok definitely has, I would say, the most intense usage. But it's a question of winning
for what? As Snapchat pointed out, they said, we're not a social media app to the extent that
Instagram is. We are about peer-to-peer
communications. I don't have the data on exactly what fraction of, say, peer-to-peer communications
are Snapchat versus Messenger, but each platform has different uses and is strong in different
ways. And I think one of the things that's really, really interesting about this lawsuit is that
they picked multiple providers, I think probably for that reason,
that they didn't want to go in there and penalize a single platform because in isolation, they would
just shift to other platforms. Many of these products could be alternatives, but kids coalesce
sometimes school by school, community by community on different ones.
And I would imagine it's kind of intuitive to think that maybe the harms are a bit different on different ones. Would you agree with that?
We've definitely seen that different platforms have different levels of conscientiousness when
it comes to harms to kids. So for example, Australia has something called an e-safety commissioner, and they have the responsibility of monitoring whether or not platforms comply with the digital safety laws in Australia.
And one of the things that they did last fall was they put out a report on child exploitative imagery, more like child abuse imagery. And they found that the platforms
radically varied in how fast they could take down an image once it was reported.
And Snapchat, for example, was one of the most conscientious at being able to take down an image
within a couple of minutes once it was reported as being child abuse material. This is not true
for all platforms. Some platforms will take hours
to take down violating content. Another one that is surprisingly good is TikTok, which tries a lot harder
to do things like detect networks of people who are abusing the platform. There's been a lot of
quite shocking reporting coming out of the New York Times and the Wall Street Journal
over the last few months around the presence of people who are seeking images of young children
on Instagram. And what you see over and over again in that reporting is things are happening
that should be easy to find. You have hashtags that are quite graphic and quite obvious for
soliciting images of children. And yet Instagram didn't do the work to try to find those accounts.
Or if an adult follows multiple of those accounts, they didn't intervene to at least dissuade that adult, let alone block them. While TikTok, if you begin poking around
for that kind of inappropriate content, they'll lock you out of your account much, much quicker.
And so we shouldn't equate all these platforms as being exactly the same because some are trying
harder than others. But the important issue here is for all of these platforms,
we're left on the outside kind of trying to figure out what they're doing.
We have this black box. We don't have any rights to data. You know, we have to figure out
tricks to figure out these things instead of having the right to have conversations and get
real answers. And that's why it's important for us to pass laws that give us the right to say,
you know, you have to have a conversation with us about these harms. We need to know that you're
at least trying to address the harms that are known on your platform.
Talking about the harms, we've been talking about some anecdotes and examples today,
but it also stood out to me in the lawsuits that they're asking for $4.5 billion in damages.
Largely, the school boards are saying that the compensation is necessary
because these problems are a huge drain on their already limited resources.
They need it to address additional mental health concerns, cyberbullying, increased risks of child sex abuse.
But the fact that they're saying this, the fact that they're asking for this number,
what does that tell you about the scale of the problem?
So let's take an even simpler thing. Let's say kids are distracted in class because they're on social media. Paying for tutoring just to remediate the lost education opportunities for those kids is extremely expensive.
We know there is significant need amongst students. So if we do win, and that will be for the courts to decide, of course, we would look to put that directly back into students, resources for
students, because that's what this is. I think one of the things that's not necessarily obvious
if you don't have a child in this age range is just how alluring these platforms are and how
hard it is for kids to self-regulate at this age when you
have such powerful incentives keeping them glued to their phones. Teens know full well somehow
those apps keep them hooked. They are pretty addictive. I really couldn't say that they're not.
Yes, very. Super. Yeah, yeah. Super addictive. And so I think it's not an outrageous number.
Paying for more mental health services is extremely expensive.
Paying for more disciplinary time, like having people who handle these incidents, is expensive too. And these costs fall on kids even if they're not on social media.
Because if you're in a school where your peers are on social media, you still end up having to live with the costs of these systems.
This is a thread that you've been pulling on through this whole conversation, but I think it's worth it for me to just ask it straight to you. What about the argument that this is all a lot of pearl clutching about, you know, what the youths are up to these days? Like, in every generation there are adults trying to stop kids from doing something that is largely fun and maybe sometimes dangerous or negative, right? So is this just more of that?
I think one of the things that's different about social media versus, say, rap music or video games is, if you had gone back in time and interviewed kids about how rap music made them feel, most people who listened to rap music were like, I like rap music.
It makes me feel powerful.
It makes me feel strong.
It helps me identify with community.
It's shocking what you hear when you interview kids and ask, hey, does this make you feel anxious? If you could wave a magic wand and keep your younger sibling off social media, would you? Have you received an unwanted sexual communication in the last seven days? That was one of the key pieces of data that came out of the state AG lawsuits: one in eight kids between the ages of 13 and 15 said, I received an unwanted sexual communication in the last seven days. You know, the answers to
all these questions are not positive. You know, kids are saying, I do feel anxious. I do feel
depressed. I feel like I don't get a choice to use these things because if I don't use them,
I'll be ostracized. And I think that's the real
question here is, you know, let's imagine we came in and said, hey, what would these products look like if they were designed for kids, designed such that kids' needs were fulfilled? How would they look different? They'd probably give kids a lot more control. You know, they wouldn't show kids stuff that they didn't ask to see. They wouldn't have adults that they didn't want contacting them.
And the thing that I was most shocked by getting to talk to teenagers was how angry they were that they had been experimented on.
Right?
Like, these are products that were designed for adults.
That are optimized for adults.
Right.
These are platforms that don't give kids a voice, and they weren't designed for kids.
And in the case of social media,
if you're a high school kid today, it's very hard to opt out of it as a system.
And once you start having systems where the kids don't get to choose what they see, you end up having algorithms that direct kids' attention.
One of the things that was most heartbreaking to me was I talked to a child psychologist.
And they said, I sit in my office every week listening to high schoolers who say, you know, I'm trying to make better choices about
my eating disorder, but when I open Instagram, it follows me around, right? If you said to a kid,
hey, like if you could have Instagram, but you could reset the algorithm whenever you wanted to,
right? Like you had a right to have the algorithm reset. Would you do it? I don't think there's any
kids that would say no. So we shouldn't think
about these things as binary, like, you know, can kids have social media? Can they not have social
media? It should be around what are the rights of kids and families and communities to having
systems that are safe and developmentally appropriate, where kids have a voice.
Like, that's the real question here.
You testified in front of U.S. Congress about Facebook's products harming kids and democracy in 2021.
Two and a half years later,
what do you think has really changed? And the question I want to end this interview on is, are you honestly more hopeful or more discouraged?
So I am blessed in that I took a lot more history classes in college than most tech people do. I was a Cold War studies minor way back in the day.
And the history of the 20th century is such a miraculous period in that impossible things happened.
Right. Like the Soviet Union fell.
No one in 1960 thought that was ever going to happen.
The British left India.
Apartheid ended.
You know, like these things that seemed impossible happened.
And when I look at the world, I think on timescales that are five or 10 years long.
And one of the big challenges in the United States right now is like we have a government that struggles to actually pass budgets, let alone pass laws.
And so I spend my time on Canada and Australia and middle power countries, because I think that's who's going to be the next set of players to move. So I used to
spend a lot of time on Europe because I thought the DSA was the most likely next thing to move.
And I'm amazed that we've seen anything. If we went back in time to September 2021, that's when the first article was published in the Wall Street Journal, no one could imagine 44 U.S. states suing Meta. No one could imagine school districts suing Meta. No one could imagine the Digital Services Act passing. They'd been trying to pass that for four or five years, right? And so I am extremely heartened that the world is starting to move.
And I am extremely realistic that the world moves slowly. The world is a very big place.
And so I will continue to keep plugging ahead because the way the world changes is we change it.
And we have no choice but to fight for social media we feel good about.
Because it is our future.
And it will continue to impact our democracies and our societies and our children.
And the way we get nice things is we make them nice things.
And we deserve to have nice things.
Frances Haugen, thank you.
Thank you so much for this.
It was so interesting and at times very surprising, and it was a real pleasure to have you on.
My pleasure.
Just a note to say that a previous version of this episode
included an anecdote about a boy who was bullied and later took his own life after videos of his bullying were posted online.
That anecdote has been removed because, in fact, the boy was murdered by two other boys in an attack that investigators say was planned on social media and was triggered by an online conflict in a chat group.
Before we go today, I just want to note that Meta CEO Mark Zuckerberg has previously disputed Frances Haugen's claims about the company.
In 2021, he wrote this blog post saying that the idea that the company prioritizes profits
over safety and well-being is just not true. In that post, he also wrote that,
quote, I've spent a lot of time reflecting on the kinds of experiences I want my kids and others to
have online. And it's very important to me that everything we build is safe and good for kids.
Okay, I'm Jamie Poisson. Thanks so much for listening. Talk to you tomorrow.