Your Undivided Attention - U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Episode Date: February 13, 2024

Was it political progress, or just political theater? The recent Senate hearing with social media CEOs led to astonishing moments, including Mark Zuckerberg's public apology to families who lost children following social media abuse. Our panel of experts, including Facebook whistleblower Frances Haugen, untangles the explosive hearing and offers a look ahead as well. How will this hearing impact protocol within these social media companies? How will it impact legislation? In short: will anything change?

Clarification: Julie says that shortly after the hearing, Meta's stock price had the biggest increase of any company in the stock market's history. It was the biggest one-day gain by any company in Wall Street history.

Correction: Frances says it takes Snap three or four minutes to take down exploitative content. In Snap's most recent transparency report, they list six minutes as the median turnaround time to remove exploitative content.

RECOMMENDED MEDIA

Get Media Savvy: Founded by Julie Scelfo, Get Media Savvy is a non-profit initiative working to establish a healthy media environment for kids and families.

The Power of One by Frances Haugen: The inside story of Frances's quest to bring transparency and accountability to Big Tech.

RECOMMENDED YUA EPISODES

Real Social Media Solutions, Now with Frances Haugen
A Conversation with Facebook Whistleblower Frances Haugen
Are the Kids Alright?
Social Media Victims Lawyer Up with Laura Marquez-Garrett

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
Hey everyone, this is Tristan.
And this is Aza.
Okay, for this episode of your undivided attention, we're going to be discussing the January 31st
Senate hearing on social media companies and their exploitation of children online.
This was a very important public event with testimonies from the CEOs of Meta, Snap, TikTok,
and actually, this is a thing that's been coming for a long time.
And so to host this discussion,
we're going to be handing off the mic to the brilliant Julie Scelfo.
In addition to her work as a journalist and a former New York Times staff writer,
Julie is the founder and executive director of Get Media Savvy,
which helps to create a healthy media environment for kids.
And she has two very special guests for this episode,
who she'll tell you about next.
Welcome, Julie.
Thanks, Tristan.
In this episode, we're going to unpack the recent U.S. Senate hearing
on social media companies and their exploitation of children online.
The hearing was a significant step towards making tech platforms accountable for the array of harm their products have been causing.
I went to D.C. for the event and found myself sitting directly behind Mark Zuckerberg, the CEO of Meta, which owns Facebook and Instagram.
The experience was, frankly, surreal.
So today, I'll be discussing the hearings and where we go next with two guests who have been directly involved in trying to make policy reforms happen.
First, Camille Carlton, the policy director at the Center for Humane Technology,
where she steers the organization's national and state policy strategy.
And Frances Haugen.
Frances was the memorable Facebook whistleblower who came forward in 2021 with tens of thousands of Facebook's internal documents.
She's the author of The Power of One: How I Found the Strength to Tell the Truth and Why I Blew the Whistle on Facebook.
Welcome, Camille and Frances.
Thanks so much for having us.
Happy to be here.
So, Frances, you and I were both there in the hearing room,
and, you know, I felt so many different emotions that day.
It's taken me some time to sort of process it all.
What did you think about the hearing, or how did it leave you feeling?
So for context, when I showed up at the hearing,
I did not know that the parents who were present had a coordinated kind of theatrical plan
of how they were going to approach the hearing.
So when you come into hearings, usually,
people are not standing on their feet.
They are usually not holding things over their heads.
When I walked in, the witnesses were about to file in.
So I thought the thing that people were holding over their heads
was, you know, tablets for recording or, like, phones for recording.
And then I realized these are people holding up photos of their dead children.
You know, some of these kids died because of eating disorders,
some because of bullying, some because of extortion.
And so the energy with that table setting, right, the idea that you walk in and you're like, oh, my God, like these parents are not angry. They are livid.
So there were a lot of contentious soundbite moments in this hearing. Politicians trying to outdo each other by sounding tough on big tech.
At one point, Meta CEO Mark Zuckerberg stood up and turned to face the families who were holding up placards of their kids who had been severely harmed by these platforms.
Here's what he said.
Your families have suffered, and this is why we invest so much and are going to continue doing industry-leading efforts to make sure that no one has to go through the types of things that your families have had to suffer.
This was kind of an astounding moment for me, and I'm still trying to process what happened and put it into words.
Frances, what did you make of this apology to families?
What you hope for in a Senate hearing like this is that senators use their bully pulpit to demand accountability,
and to make people actually face awkward facts they might otherwise avoid.
Some of the lead-ups to this moment included Senator Butler of California calling out the fact that when she met with, I guess, the CEOs the night before, Mark had said he had never actually met a parent who had lost a child due to social media, due to his product.
The fact that Senator Hawley leaned in so hard on the idea of, if you have never apologized, you need to do it right now, and Mark followed through, it shows you the energy that was in the room that day.
You know, the time for, we'll solve this later, the time for maybe we don't have enough data, the time for it's the parents' fault is over.
And the Senate has woken up and is willing to, like, push on that point now.
I mean, it really was startling, right?
Because up until that moment, we had no idea that was going to happen.
And I think all of the camera crews even were caught off guard because he turned around and they were behind him.
Camille, what did you think of the apology?
You could also see at the end, you know, Mark's training coming through.
He circled back with, and this is why we invest so much in child safety.
And frankly, from our vantage point, I think that's what we actually really need to talk about is how much are these platforms really investing in child safety.
Well, one of the things, though, that was really different about this time,
was like, you know, it's been years since I've seen Mark seem afraid on the stand.
Like you were saying, like, he's so well trained.
Usually he's there because he's like, I've been forced to show up, and he gives that vibe, you know,
or he gives the robot vibe.
And you could see, like, the fear in his eyes.
Like, standing up to a parent who's lost a child is like an emotionally intense experience.
And he had just spent, I don't know, maybe two hours at that point in a room full of people
who wanted him dead.
You know, it's a unique human experience.
So just two days after the hearing, Meta's stock price had the biggest increase of any company in the stock market's history.
Camille, what did you think of that?
Why did that happen?
So one of the things that Mark was really talking about in this quarterly earnings call was Meta's, quote, year of efficiency.
And what we need to understand is that when he says year of efficiency, what that
means is a year of cutting trust and safety teams, a year of cutting the teams who are working
to ensure that your children are safe online, and a year of prioritizing innovation over
safety. And until we have regulation that makes it the opposite, that makes it so that these
companies have to prioritize safety, we're just going to keep seeing these market incentives
drive the same pattern we've been seeing up until today.
I have a very different take on this. Yes, it is true that the stock price surged the day after, or two days after, the
hearing when they did their quarterly earnings call. But there's a couple of very critical details
you have to pay attention to in terms of what drove the mechanics of that price.
So the first is that Meta announced the second biggest stock buyback they've ever done.
The second thing is they announced their first dividend. A dividend, for context: tech companies
try to go as long as they possibly can without paying out dividends. They're like,
we know how to handle money better than you, we're not going to give it back to you, we're
going to keep building and growing. And they had announced their first dividend. But the third thing, and
this is where I really smelled blood in the water, is they announced they were increasing all of the
performance bonuses in the company by 50 percent. Like, take a step back for a moment and think about
that. So the thing that the company did after the Facebook Files started to publish was they said internally,
Frances is lying. She cherry-picked all these things. None of this is true. Oh, by the way, we're going to lock down all of your access to security documents so you can't confirm for yourself. So they didn't want people to quit. This time, they had the senators, like, pounding on them, being like, you lied about this, you lied about that. They had to announce huge compensation increases. So I see it as they knew they were fragile. The only way they could keep the stock price from betraying how fragile they were was to give money
back to investors in a way that they never had before.
This was actually the 10th hearing on teen mental health and child online safety in less than
three years. Camille, what if anything made this hearing different from the rest?
This hearing is happening amidst a confluence of events all talking about this.
Several months ago, we had almost all attorneys general across the United States come forward,
and sue Meta for intentionally designing their products to be addictive and for knowingly having
underage users of their products. We have a series of lawsuits being led by schools and parents.
We have the Surgeon General's social media advisory. So it's an entire ecosystem of people,
policymakers, parents, litigators, youth themselves coming forward and saying enough is enough.
and this hearing took place in the midst of all of this, you know, this inflection point that we're seeing right now.
And I would add there, it's one of these things where it shows you the power of actually having transparent data from these companies.
Right. So like Camille mentioned, the 44 state attorneys general, because they had subpoena powers, the documents that they published, the filings that went out, add up to about 2,000 pages.
And those are summaries, effectively, of thousands of pages of documents about teen mental health.
When Congress has information to work off of, they're very good at asking hard questions and calling out specific facts.
But when they don't have that data, they don't know where to focus.
And that's what's different this time, is they have the data.
Sometimes I find that it's hard for people, maybe some people listening now, to really grasp the depth of this problem for kids and teens.
Many adults, probably most adults,
are using the same social media platforms
as these teenagers who are being harmed.
Why is there such a gap in user experiences?
One of the things that I think is not obvious
is the experience of social media
is very different for different people in this country.
So if you are privileged, if you are in your 60s and 70s,
if most of your friends went to college,
if most of your friends are economically secure,
you see a version of Facebook
that is much more sanitized, like literally they spend more money, content moderating it,
and just the content your friends are providing is less inflammatory on average.
Part of why the situation of kids is so bad is, as the filings from the AGs show,
Facebook underinvested specifically in kids' experiences and in Instagram, because that's where the kids were.
And I want to be fully honest, advocating for kids was not my core issue when I came forward.
It hasn't been for the last two years.
And when I read the AG filings, I felt like I got radicalized because I had no clue how bad it was.
And I'm like the person who's known as like the person who understands Facebook the best.
And I think the thing to remember is that the people who built these products don't have teenagers for children, right?
They're not old enough.
You know, Mark is 39.
His kids are like six and eight or something or five and eight.
I don't interact with teenagers.
All my friends have kids that are younger.
And now that the AGs have gone and, like, ripped open the dirty laundry, now we're like, oh my God, something really stinks,
because we now know how bad the problem is,
because Facebook knew how bad the problem was,
and yet they didn't do anything to fix it.
The AG lawsuit is actually not about
that they made products that were addictive to children.
It's that they lied to the public
that the products were not addictive to children
and that they didn't intend to make them addictive to children.
Because technically, under Section 230,
they can make products that are addictive for children.
They're allowed to do that,
but they are not allowed to lie about it
under our consumer protection laws.
That's what got the tobacco companies:
actively hiding it.
And one of the things that got called out
throughout the hearing
was senators would say,
you told us this on this date,
you told us that on this date,
your other executive,
Antigone Davis,
the head of Instagram,
Adam Mosseri,
they said these things to us.
We now know those are lies.
Do you think you have the right to lie to Congress?
And so one of the things
that I kind of read
into the energy of the hearing
was you had these people
who knew the AGs
were about to get a lot of press coverage
for going really hard
and really seriously
on these tech companies.
And there was no longer space to be seen as going soft.
A week before the hearing, Snap announced that it was endorsing the Kids Online Safety Act,
or COSA.
Microsoft also announced its endorsement of the bill just a day before the hearing.
And Linda Yaccarino of X, formerly Twitter, said at the hearing that X endorses it as well.
Camille, can you talk through what the Kids Online Safety Act does?
Yeah, absolutely. The Kids Online Safety Act is a federal online safety bill. Right now, it has
almost 50 bipartisan co-sponsors. So in terms of the bills that were discussed and are on the
table, it's really a leading candidate. And what it does is it creates a duty of care. So a duty
of care is similar to a fiduciary responsibility that a doctor might have to prioritize the health
of their patients above all else. And what the duty of care means in this bill is that services like
social media platforms, video games, messaging apps, they have to take reasonable measures to prevent
harm to kids who are using their platforms. It requires platforms to put all of the privacy
settings at the highest level by default for users. It gives kids and teens the opportunity
to completely turn off data-driven recommendation algorithms. It gives parents tools to track
screen time, spending, and report emergencies to platforms, which
has been a big issue thus far. And it also requires an annual audit to assess risks to minors.
So all of these together are great interventions that we need. There's not one silver bullet
solution, but where COSA is right now is that it has the most support that we've seen for
a kids' online safety bill at the federal level. And I think these features that you're talking
about are so significant, right? I mean, I'm a parent of three. And to just start by having all
apps have the highest level of privacy is such a game changer, right? Right now, if each of my kids
downloads different apps, I have to figure out how each one works, go in, change the settings,
it's a ton of work. Why shouldn't it be the safest possible setting for all children? And then if
you're a parent, let's say you're Kim Kardashian, you want your kids to be famous and accessible
to everybody, sure, you have the right to go in and make their settings public. But it isn't
the default that way. So it protects kids by default. Frances, why do you think
Snap and X decided it was advantageous to endorse those bills now? And why are the other platforms,
meaning Discord, Meta, and TikTok, not doing the same? I think what Linda Yaccarino said throughout
the hearing, you know, three or four times she said the same line, which was 1% of X's users
in the United States are under the age of 18. You know, it's not a, quote, trendy app for younger
people. It's easy for her to say, yeah, we're fine with that. We can make it a lot safer. We can
go to safer defaults, because that's not really where their business lies.
In the case of Snap, when the Australian eSafety Commissioner goes and asks for data across
applications on things like child exploitative imagery, Snap actually comes back with the best metrics.
You know, in the case of Snap, I can't remember the exact number.
It's like it takes them about three or four minutes to take down a piece of child exploitative
imagery after it's reported.
In the case of Snap, they need to be more transparent.
They need to be audited.
These are all the things that will happen with COSA.
But they have paid down their debt more than others have because they made a lot of really irresponsible decisions for a really long time.
In the hearing itself, they got called out for the fact that they really began as an app for being able to, quote, safely send your nudes to other people.
And so I think it's one of these things where Instagram's going to have to do a lot of work to get up to compliance.
And so we should expect them, and apps like Discord particularly, to drag their heels as long as they can.
There was this one point in the hearing that really startled me.
It was when Mark Zuckerberg said there is no scientific evidence proving that social media harms teen mental health.
What was your take on that, Frances?
This is a great illustration of how Facebook answers with data because he's like,
there's no evidence at a population level that social media is a net harm.
And the challenge for a statement like this is that his own lieutenant appeared before the Senate in 2021,
maybe a week before I did.
So I had not come forward as a whistleblower.
And she said, four out of five kids are fine on social media.
So when you do the math on that, if four out of five are fine, that means one out of five is not fine.
Because the kids that are being harmed are not being harmed a little bit more than the other children.
It's that they're being harmed a lot more.
And it happens that, on a population level,
it, as Mark said, washes out.
Camille, one of the things I know that you were hoping to address,
but it doesn't feel like they addressed very much,
is that these platforms have addictive design features.
Can you talk a little bit more about that?
What had you been hoping would get discussed there?
I think one of the things that is really difficult
about these issues that we find with online harms
is that they're all connected.
You cannot separate child sexual exploitation online from the design of these platforms
and from the investments that these platforms make in their own safety teams and in their products.
And so while the main focus was really to target, yes, child sexual exploitation online,
what was missing was kind of this greater conversation around how are platforms
prioritizing and investing in safety?
Is there a risk mindset first or a safety mindset first?
And how do their active design choices enable this type of harm online?
And you saw this come through a little bit.
So, for instance, when Senator Cruz asked Mark Zuckerberg about a design choice that they had within Instagram to enable and allow people to continue seeing the results of child sexual exploitation online, his question was, why does this button exist?
Why is there even an option on the platform to move forward and see this content?
And then you gave users two choices.
Get resources or see results anyway.
Mr. Zuckerberg, what the hell were you thinking?
And so this is a design choice made by Meta in order to enable users to keep going.
And so the question that I think is really important here is why was this choice made?
and what did this choice prioritize?
And what we've found time and time again
is that these design choices tend to prioritize
keeping people on platforms longer,
making sure that they are getting revenue
from user engagement over protecting people from harms online.
When you think about the financial system in the United States
and how much regulation there is
and the size of the compliance teams at all the banks,
it's astounding that these mass media platforms, because really social media is a form of mass media, are allowed
to be the central purveyors of information for people in our country, and that there aren't really
any regulations that force them to have acceptable numbers of both internal compliance and
trust and safety officers, or that require transparency, making them share their information
and activities with the public, with lawmakers, with an outside body. Camille, what would you say
are the biggest hurdles that are getting in the way of Congress taking action now?
So I think that there are two main things that we need to be tracking.
First, we touched on a little bit, but this is really the power of lobbying in this country.
Many of these platforms, whether it's meta or SNAP or Google, they are a part of lobbying groups
where they don't have to show their name.
And these lobbying groups go forward and kind of do
their dirty work, do their dirty bidding, while these companies say to the public, yes, we support
child safety legislation. We want this type of reform. We want to work with Congress to make a
difference. And then behind the scenes, you'll see TechNet, you'll see NetChoice,
killing bills, basically, you know, doing what they do in order to scare policymakers. So this is
one big issue that we have. I think the other big issue that we have is it has been so, so long
since we've passed a bill that I think we're kind of scared. We're really looking for this
silver bullet solution. And that's just not what we need. And that's just not the reality of how
change happens. There isn't going to be one single bill that fixes all the problems that we're
seeing. And some of the bills need to be focused on tech directly and others need to be focused
on social solutions. But we need to start getting into the habit of actually passing more and more
bills that react and respond to the speed that tech is coming out. So we need both kind of a more
practiced legislative mechanism and we need more transparency around the efforts of lobbying groups.
Frances? Like the number one thing we should fear is fatalism, right? Like if you feel fatalism,
it's a sign that someone's trying to steal your power. You can go and hit your head against the
wall over and over again. It can feel like there's never going to be movement. But the reality is,
this is like a rule in the social sciences:
any trend which cannot continue forever eventually ends.
And you cannot keep stacking the bodies of children up.
Like we put eight-year-olds in car seats, right?
There are some issues where we don't accept any kids being harmed.
You cannot keep accumulating parents of brave children
because these people become like hungry ghosts.
You don't want to keep accumulating too many of them
because they will come for your business.
Well, I'm with you, Frances. I mean, remember the Ford Pinto and all the people who were burning to death because of the way the gas tank was designed? You know, this went on for years and finally there was a memo that revealed that they knew that people were burning to death and they just didn't want to spend the money to make it safer. So finally, after 27 deaths, there was enough pressure on Ford that they recalled these Pintos. How many kids have to die before we put some boundaries on what these
algorithms can do to our children? So I agree with you that we're really, I mean, I'm scared
to say it's a watershed moment, but it really felt like a watershed moment in that hearing
room. And I would add a framing thought there, which is, you know, if we look back on the history
of the 20th century, it is like this story of impossible things happening, right? Like no one thought
the Soviet Union was going to fall. No one thought apartheid was going to end. No one thought
we'd get the Civil Rights Act, right?
Like the riots in the streets.
And the thing really, like, the reason why I'm always trying to cut off
people's fatalism is, like, the way we actually win against seemingly impossible
foes is we believe we can win, right?
We believe we can ask for more.
We deserve to demand more.
And so, you know, if you can keep the seed of hope going, like the seed of hope is the
most catalytic and, like, dangerous thing in the world.
And so that's my only wish to leave you with.
Thank you, Camille and Frances, for joining us today.
And thanks to you for listening.
I've loved being with you as a guest host.
Tristan and Aza will be back for the next episode.
Your undivided attention is produced by the Center for Humane Technology,
a non-profit working to catalyze a humane future.
Our senior producer is Julia Scott.
Kirsten McMurray and Sarah McRae are our associate producers.
Sasha Fegan is our executive producer,
mixing on this episode by Jeff Sudaken,
original music and sound design by Ryan and Hayes Holiday,
and a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
You can find show notes, transcripts,
and much more at HumaneTech.com.
If you liked the podcast,
we'd be grateful if you could rate it on Apple Podcasts,
because it helps other people find the show.
And if you made it all the way here,
let me give one more thank you to you
for giving us your undivided attention.
Thank you.
