Front Burner - Facebook’s bad week
Episode Date: October 8, 2021
After a major outage and stinging whistleblower testimony, NPR tech reporter Bobby Allyn breaks down Facebook's bad week.
Transcript
In the Dragon's Den, a simple pitch can lead to a life-changing connection.
Watch new episodes of Dragon's Den free on CBC Gem. Brought to you in part by National
Angel Capital Organization, empowering Canada's entrepreneurs through angel
investment and industry connections. This is a CBC Podcast.
Hi, I'm Jamie Poisson.
Facebook has had a rough week. On Sunday, 60 Minutes ran this highly critical
interview with Frances Haugen, a whistleblower who worked on Facebook's civic misinformation team.
There were conflicts of interest between what was good for the public and what was good for
Facebook. On Monday, you probably noticed that the site and its apps,
like Instagram and WhatsApp, were totally down.
Total outage, no signs of life. This is huge.
Not just for a few minutes, but for more than five hours.
Then on Tuesday, Haugen testified before the U.S. Congress about harm she believes the company knowingly caused.
Because I believe Facebook's products harm children,
stoke division, and weaken our democracy.
The company's leadership knows how to make Facebook and Instagram safer,
but won't make the necessary changes
because they have put their astronomical profits before people.
All this comes on the heels of the extraordinary cache
of internal Facebook documents leaked by that same whistleblower
to the Wall Street Journal, which served to buttress her testimony. Today, NPR tech reporter
Bobby Allyn on this latest Facebook firestorm and how it differs from past controversies the company
has faced. Hi, Bobby. Thanks so much for being here.
Thanks for having me.
So this whistleblower, Frances Haugen, she has some really strong words for Facebook at this congressional hearing this week, in particular with regards to its effects on children and youth. So let's start there.
Yesterday, we saw Facebook get taken off the internet.
I don't know why it went down. But I know that for more than five hours, Facebook wasn't used to deepen divides, destabilize democracies, and make young girls and women feel bad about their
bodies. When this whistleblower says she believes Facebook products harm children, what is she talking about exactly?
So Frances Haugen is this former Facebook data scientist who had unique access to some internal documents at Facebook. Shortly before she left the company, she secretly copied these confidential
documents and left and shared them with US regulators and the media. And when she accuses Facebook of harming children and sowing division and amplifying misinformation,
she's not just saying this as someone who has an opinion.
She's citing internal research that Facebook conducted,
which shows that its platforms, Facebook and Instagram,
the algorithms that drive them push misinformation and images that make especially teen girls feel worse about themselves.
It pushes it far and wide throughout these platforms.
Facebook's own research about Instagram contains quotes from kids saying,
I feel bad when I use Instagram, but I also feel like I can't stop.
So basically, her point to Congress is Facebook knew about the harms of its platforms,
hid research about it, and then lied to the public about it.
Just one example of that, the research showed that 13.5% of teen girls said Instagram worsens their suicidal thoughts.
And 17% of teen girls said Instagram contributes to their eating disorders.
I feel a lot of pain for those kids.
They say they fear being ostracized if they step away from the platform.
So imagine you're in this situation, in this relationship,
where every time you open the app it makes you feel worse.
But you also fear isolation if you don't.
How exactly is Frances Haugen saying this happens?
Like, what is it about Instagram and the way that it works that seems to have these effects?
Right.
So Instagram and Facebook, their algorithms are driven by engagement.
It's engagement-based ranking. And what that means in plain English is if there's a post that gets a lot of comments,
that gets a lot of likes, that gets a lot of shares, it is basically supercharged on
Instagram.
So if you go on Instagram, the photos that you're seeing are not the photos that were
just posted.
It's not chronological, but it's based on how much engagement posts are getting.
And Facebook's own research has found that the things that tend to get the most attention, especially on Instagram,
are, you know, I mean, we all use Instagram, right? Everyone paints their best life on there.
It's like vacation photos. It's people getting married. It's engagement photos. It's all sorts
of images that you want to present to the world, a sort of fantasy of yourself.
The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment, or reshare.
But when it comes to body image issues, I mean, that could be very harmful. If you're a 17-year-old girl and you're using Instagram and you already have body image issues and you're seeing constant, constant photos that
are making you feel bad about yourself. Facebook knows that its amplification algorithms, things
like engagement-based ranking on Instagram, can lead children from very innocuous topics like
healthy recipes to anorexia-promoting content over a very short
period of time. Yeah, you know, it was really striking to hear Haugen in her testimony talk
about how the company was aware that even when these teens knew that scrolling was
making them feel bad, they also felt that they couldn't stop. They want the next click.
They want the next like.
The dopamine, the little hits all the time.
Facebook responded to this in part by saying that this research is being mischaracterized
and basically that it's focusing only on the research that found negative effects, that it's cherry-picked.
It doesn't take into account other research that shows the positive effects on teen mental health, right? And I
wonder how you would respond to that. That has been Facebook's standard response, as you know,
that most of the attention is trained on the studies that make it look as if teen girls'
thoughts of suicide and body image issues are worsening after using the platform. And yeah, Facebook will say, well, there's other research that shows that people feel more connected when they use social media more, that people actually feel better about themselves when they use social media more. One thing, you know,
putting these two sort of thoughts together: no study has proven that Facebook or Instagram causes people to have suicidal thoughts or causes people to, you know, be anorexic. But what the
studies are showing is that if you have one of those issues already, it is exacerbating it. Like
one of the statistics that I think is very salient is, you know, one in three young women who already
have pre-existing body image issues are finding that Instagram makes it worse. So there's not a
cause and effect here that being on Instagram is creating the problem. But I think that Instagram is exacerbating
it. And yes, Facebook can say, well, here is this other research. And that is fine. There is other
research. But if you have studies pointing to your platform being harmful, I think Haugen and
others would say it's about time that you do something about it.
I just want to read some responses from Facebook.
Mark Zuckerberg himself made a Facebook post after Haugen's testimony.
He wrote, amongst other things, many of her claims don't make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?
And then in an interview with your network, with NPR this week,
Neil Potts, Facebook's VP for Trust and Security, said the company is committed to doing this research and that they're investing billions of dollars on these issues.
For any one person that experiences these, we want to make sure that we try to eliminate that.
But on balance, we are doing the work, investing in the research so we know how to approach these issues, and really,
even sending interventions to people who may be impacted by such harms.
Yeah, sure. I mean, look, I will say that Facebook, all the way up to Mark Zuckerberg,
in the wake of this scandal, has said we're sticking with this kind of research.
There could have been another corporate response from a company as large as Facebook, and it could
have gone, you know what, doing this research that takes a really critical look at the harmful
effects of our platform, it's too much of a vulnerability. It could get leaked like it did
here. It could cause a huge PR nightmare for us. Let's just stop it completely.
And to Facebook's credit, they said, we think this kind of research is important. We're glad
we're doing it and we will continue doing it. So, I mean, that's something that's worth pointing out.
At the same time, the Civic Integrity Division of Facebook, which Frances Haugen, the whistleblower,
was a product manager on, was disbanded shortly after she joined it.
And the employees were sort of scattered to other projects across Facebook.
And when they got rid of Civic Integrity, it was the moment where I was like,
I don't trust that they're willing to actually invest what needs to be invested to keep
Facebook from being dangerous.
On that note, I wonder if you could tell me a little bit more
about who Frances Haugen is. What is her deal and how did she become this whistleblower?
Yeah, so Frances Haugen is a data scientist who has worked at companies like Yelp and Google.
She worked at the dating app Hinge. And her specialty has been studying how algorithms shape what we see on social media.
She's kind of a numbers person, right?
She got a job at Facebook after a friend of hers got radicalized on Facebook, and they had a big falling out over it.
And she was so upset about this that she was like, you know what?
I'm going to try to work there and change it from within.
I never wanted anyone to feel the
pain that I had felt. And I had seen how high the stakes were in terms of making sure there was high
quality information on Facebook. And so she was working on civic misinformation, she was
doing very important work. But then she had a crisis of conscience. She realized that Mark
Zuckerberg and other Facebook executives were publicly saying that Facebook wasn't creating the kind of harms that she knew they knew it was, and she was at a crossroads.
And she thought, do I just sit here in this company, see this research, and see it not reflected in the public statements, or do I do something about it?
And she actually asked her mom, who is an Episcopal priest, what should I do? Well, I don't
want to betray this company. But at the same time, I'm having a crisis of conscience. And her
priest mother told her to follow her heart, right? And so she was emboldened by that,
copied thousands of pages of documents, internal confidential documents from Facebook,
left the company and shared them with the Wall Street Journal and with regulators in Washington. And here we are today. But there have been other ex-employees
who've been critical, but she's a special case. Yeah, she's an incredibly effective communicator.
That seems fair to say. Absolutely. She's a very effective communicator. She speaks with emotion.
She speaks with conviction. She speaks armed with data.
She doesn't just have an opinion.
She's a very sort of like science-driven person.
She has a real science mind.
And she knows what she's talking about.
She was inside the company and had really special access to documents.
And she's definitely a fierce critic.
It's worth noting here that Facebook is saying, like, essentially that she doesn't know
what she's talking about, right? But before, when I was reading that quote from Zuckerberg, he says
that many of her claims don't make any sense. And then a statement from a company spokesperson
said that she worked for the company for less than two years, had no direct reports, never attended a
decision point meeting with C-level executives, and testified more than six times to not working
on the subject matter in question. So I mean, that is certainly how they're responding.
Yeah. So Facebook's response has been to try to discredit Frances Haugen. And I will say
that she admits freely herself that she was never in these C-suite meetings with top executives. She says she worked on civic misinformation. She doesn't try to say that she worked in any other
division of Facebook. But former Facebook executives who were actually above her and
on that team said, hey, I'm raising my hand to say she may not have been in these executive
meetings, but I was and I think the points that she's raising are actually valid.
So in the face of attacks from Facebook, she's getting support from other Facebook employees,
which shows that she's not waging this battle alone.
She has support behind her.
What explanation does she offer for why Facebook isn't thoroughly addressing these things internally?
Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site.
They'll click on less ads.
They'll make less money.
Haugen says it's because Facebook is most interested in delivering strong profits to shareholders, right?
I mean, Facebook's already nearly a trillion-dollar company. It has like 60,000 employees. It's an enormous organization. And they treat people
like data points, right? And they know if you spend more time on Facebook and Instagram,
that that is a data point that they can sell to advertisers. You know, something like 98%
of Facebook and Instagram's revenue comes from
advertising. And they are, according to Haugen, just really interested in juicing profits, just really
interested in growth, growth at all costs. And if that means growth will come just as there's
rampant misinformation and other harmful content flying around these platforms, so be it. So
she says that Facebook is placing profits over
public safety. It is about Facebook choosing to grow at all costs, becoming an almost trillion
dollar company by buying its profits with our safety. Right, right. And just worth noting in
his post, Zuckerberg says that that is not true, the idea that they prioritize profits over people.
Exactly. Zuckerberg's response is, look, we talk to advertisers all the time. They don't want their
ads next to hateful, harmful, or angry content. So it just doesn't make sense. So that's his sort
of rebuttal. I think it's also worth noting here that what we've talked about so far,
it doesn't cover all of the challenges that Facebook is facing right now. For example,
as I mentioned in the intro, Facebook, Instagram, and WhatsApp all went down
for more than five hours on Monday.
And can you give me a sense of some of the other things
on Facebook's plate right now?
I understand the outage was just kind of like a boring thing
about servers not talking to each other properly,
but what else are they going through?
Yeah, I mean, outages are something that happens kind of often. But this one was remarkable because, like you said, it lasted for so long and affected all of its products. And I think it really revealed just how vulnerable these companies are to the internet breaking, even a company that has like a third of the world on its platform. And when the internet breaks, I mean, you know, business people can't use WhatsApp and people can't communicate on Instagram. It creates all sorts of problems for small business owners and for people around the world who rely on these platforms. And beyond the outage, Facebook is now facing some of the most serious bipartisan pressure in Washington that it has ever faced.
There are lawmakers calling for Facebook to be broken up. There's a number of legislative
proposals that are squarely taking aim at how Facebook operates on the internet and trying to
figure out ways to sort of rein them in. This is a company among many companies in Silicon Valley that prospered and became a global titan in a regulatory
environment in which they face almost no regulation. So that is about to change. The FTC,
the Federal Trade Commission in the US, has sued Facebook, calling it an illegal monopoly. And
the Justice Department under President Joe Biden is taking a very,
very hard look at Facebook and other tech companies, which is very different than
the Obama years when tech and the White House were sort of glad handing. And even Obama,
you know, he was all about supporting innovation and cheerleading these companies. And now it's a
very different tone in the White House. So yeah, we have this whistleblower controversy. We have this outage that's a problem. But
politically speaking, and from a legal standpoint, they are really in a vice grip right now.
It was interesting to me that Frances Haugen doesn't see this antitrust angle as a solution to the problems that she's actually flagging. She doesn't think breaking up Facebook will solve the
problems that she sees. That's right. So Frances Haugen says if you break up Facebook from Instagram
and WhatsApp, basically you're not actually addressing the root cause of the problem,
which she says is its algorithm.
Its algorithm needs to be recalibrated,
and there needs to be more transparency
about what exactly that algorithm is amplifying around the world.
And she said if you break up Facebook,
all of the advertising dollars are going to go to Instagram,
and then you'll have this separate entity, Facebook,
that she said will turn into almost like a Frankenstein. It will be this totally separate
monster of a company that has far less advertising revenue than it had, but still has a third of the
world on it and far fewer resources to combat some of the misinformation, some of the hate,
some of the other harmful content that flies around Facebook so rapidly.
So her response to the we should break up Facebook is just, that's actually going to
make the problems worse, not better.
I will say there are other critics who respond to that by saying it's not either or.
You can agree that these companies are illegal monopolies, that they're too big and too
entrenched and ought to be broken up, and also examine the underlying algorithms
that power the companies.
So this really does provide a window
into a very nuanced, divisive policy debate
about how we are going to respond
to these big tech companies.
But yeah, it was super interesting
that she's against breaking up Facebook
as a former Facebook insider and critic.
You know, Facebook has obviously gone
through turbulent times before, But do you see this
moment as different? Do you think that something will actually come out of this?
Yeah, so we have this whistleblower entering a situation in which Facebook is feeling the
walls closing in on it. This is a company that sees its stature under siege. It is an
embattled company that sees its power and influence in the
world declining. It is trying so hard in Washington and around the world to maintain its relevancy and
to keep its grip on power. And I think the whistleblower, you know, raising really serious
existential questions about Facebook at this moment makes the company extremely vulnerable, not just to regulation in
Washington, but to the scrutiny of lawmakers and regulators and users around the world. So I think
if Frances Haugen came out in 2009, 2010, a different era of Facebook, it may have washed
over. But because there is a tsunami of criticism that is sort of supporting her, she just
might be the thing that spurs serious internal change at Facebook.
Okay.
Bobby, thank you so much for this.
Thank you.
Thanks for having me. All right, so before we go today, a different kind of Facebook drama.
On Thursday, it was announced that the tech giant will be the focus of a scripted series titled Doomsday Machine.
Claire Foy, best known for playing Queen Elizabeth in the first two seasons of The Crown,
is set to play Facebook COO Sheryl Sandberg.
That's all for today.
FrontBurner is brought to you by CBC News and CBC Podcasts. The show was produced this week by Imogen Burchard, Simi Bassi, Ali Janes, and Katie Toth.
Our sound design was by Brittany Amadeo, Julia Whitman, and Austin Pomeroy.
Our music is by Joseph Chavison of Boombox Sound.
The executive producer of FrontBurner is Nick McCabe-Locos.
And I'm Jamie Poisson.
We'll be back with a new episode of FrontBurner on Tuesday after Thanksgiving.
For more CBC Podcasts, go to cbc.ca slash podcasts.