The Daily - The Facebook Whistle-Blower Testifies

Episode Date: October 6, 2021

The Senate testimony of Frances Haugen on Tuesday was an eagerly awaited event. Last month, Ms. Haugen, a former Facebook product manager, leaked internal company documents to The Wall Street Journal that exposed the social media giant’s inner workings. How will Ms. Haugen’s insights shape the future of internet regulation? Guest: Sheera Frenkel, a technology reporter for The New York Times. Sign up here to get The Daily in your inbox each morning. And for an exclusive look at how the biggest stories on our show come together, subscribe to our newsletter. Background reading: Ms. Haugen told how Facebook deliberately made efforts to keep users — including children — hooked to its service. Here are other key takeaways from her testimony. For more information on today’s episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.

Transcript
Starting point is 00:00:00 From The New York Times, I'm Astead Herndon, in for Michael Barbaro. This is The Daily. Harmful, dangerous, and toxic. That's how The Wall Street Journal describes Facebook in its new investigative series, The Facebook Files. Last month, The Wall Street Journal released a trove of internal documents from Facebook. The Wall Street Journal had reported an internal document from a Facebook presentation that said this. Our algorithms exploit the human brain's attraction to divisiveness.
Starting point is 00:00:33 If left unchecked, it will feed more and more divisive content. Detailing how much the company knew about the harm it was causing. The Wall Street Journal revealed internal Facebook research found one in three teenage users said Instagram has contributed to their own body image issues, eating disorders, anxiety, and depression. Today. The whistleblower who leaked internal documents to the Wall Street Journal is now prepared to testify before Congress.
Starting point is 00:01:03 The whistleblower behind those documents testifies before Congress. I spoke with my colleague, Sheera Frenkel, about that testimony and why it could be a turning point for the social media giant. It's Wednesday, October 6th. Sheera, can you tell us about this hearing? So since the Wall Street Journal published this series of articles based on internal documents from a Facebook whistleblower, we've been waiting for this hearing. This is the moment where this whistleblower has finally been able to go public and speak
Starting point is 00:01:43 to senators directly about this cache of documents that she unearthed from Facebook. These are a number of research papers and internal documents and memos, which for the first time really give us insight into what's happening at Facebook, from how Instagram affects teenagers to how Facebook makes decisions about hate speech on the platform. And so we've been waiting for the Senate Subcommittee on Commerce to hold this hearing when we finally get to hear this whistleblower speak for herself. The meeting and hearing of the Subcommittee on Consumer Protection of the Commerce Committee will come to order. So from the start, this hearing feels different. Ms. Haugen, we thank you for your address.
Starting point is 00:02:24 It opens with Senator Blackburn and Senator Blumenthal, who are the co-chairs of the committee, basically in agreement. Facebook knew its products were harming teenagers. You have a Republican and a Democrat both saying that they're incredibly alarmed by what the research shows on Instagram and Facebook. They invade the privacy, not only of adults, but of children. They bring up everything from the way that Instagram affects teenage girls. Again and again, Facebook rejected reforms recommended by its own researchers. To the way misinformation has been growing on Facebook's broader platform. And I think that it will be this Congress that is going to lead the way to online privacy, data security, Section 230 reforms.
Starting point is 00:03:08 And they seem to be sort of rallying around this idea that Facebook needs to be regulated. What we have is a bipartisan congressional roadmap for reform. And to have that kind of cohesive opening argument from a Republican and a Democrat senator feels incredibly unusual just right off the bat from the start. So who is this whistleblower and what did she do at Facebook? Thank you for the opportunity to appear before you. My name is Frances Haugen. I used to work at Facebook. So the woman who testified in front of the Senate Commerce Committee today is a woman named Frances Haugen. I have worked as a product manager at large tech companies since 2006. And she opens her testimony by giving her resume, which is impressive.
Starting point is 00:03:51 Including Google, Pinterest, Yelp, and Facebook. She is a person who graduated from Harvard. She worked at a number of companies across Silicon Valley, including Google, before she came to Facebook. My job has largely focused on algorithmic products like Google Plus Search and recommendation systems like the one that powers the Facebook newsfeed. And she worked on a really important team within Facebook on civic integrity.
Starting point is 00:04:13 During my time at Facebook, first working as the lead product manager for civic misinformation, and later on counter-espionage. That means that she worked on issues like misinformation and hate speech that led her to really understand the problems on the platform. And with that firsthand knowledge, what was her message about Facebook?
Starting point is 00:04:31 I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favor of its own profits. Her basic message was that Facebook was making money by keeping people's attention on the platform. And it was making conscious decisions day in, day out to keep your attention by showing you really harmful and problematic content. This is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other. It is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company
Starting point is 00:05:09 by buying its profits with our safety. So her message is that Facebook had chosen profits over people. Right, and that's something we've known about this company from the beginning. I mean, Mark Zuckerberg used to pump his fist in the air and say, company over country at the end of every meeting. I did not know that. That's crazy. Yeah. So they have been making these kinds of decisions, which have made them incredibly wealthy and powerful. This is a trillion dollar company. But they've been doing so,
Starting point is 00:05:38 sitting on a mountain of research, showing how these decisions are harmful, how they amplify things like hate speech and misinformation. Okay, so let's get into that. When the senators had a chance to question this whistleblower about what she saw at Facebook, what specifically did she reveal about how those decisions are made? Well, I think firstly, she showed that Facebook has some pretty incredible researchers. Facebook has one of the top ranked research programs in the tech industry. They've invested more in it than I believe any other social media platform. And that research kind of fell into two buckets in today's hearing.
Starting point is 00:06:11 One looked at algorithms and how they amplify things like hate speech and misinformation. And then another bucket of research really focused on teens, and specifically Instagram. And it looked at how problematic that platform has become for so many teenagers. There is a broad swath of research that supports the idea that usage of social media amplifies the risk for these mental health harms. And Facebook's own research shows that. So there are these number of research papers.
Starting point is 00:06:38 Basically, Instagram data scientists have been looking at this for years. They've been focused on how their platform affects teenagers. And they look at, I think, what is kind of a commonly held premise for anybody that knows a teenager on Instagram, which is that a lot of teens feel bad about themselves after being on Instagram. And they start asking these teenagers these really basic questions like, you know, does Instagram make you feel bad about your body? If you were already having suicidal ideations, does Instagram amplify that? Kids are saying, I am unhappy when I use Instagram and I can't stop. But if I leave,
Starting point is 00:07:10 I'm afraid I'll be ostracized. Right. So they know that. That's what their research shows. They ask these things that parents have been saying for years, but now they're actually gathering data about it. Is it worse? How do we compare what we know about Instagram to other places on the internet? I know that those types of feelings, that type of self-esteem questions, I feel like you can find that in so many places online, or on YouTube. What's unique about Instagram? I think what's unique about Instagram here is just how many hours of the day it's capturing teenagers' attention. This is one of the world's biggest social media platforms. It's incredibly popular with that subset, that 13 to 18-year-old group, and it's visual, right? The whole thing of Instagram is that you're looking at pictures of someone else. And it's fostered this culture where people have professional hair and makeup, and they use all sorts of technology that lets them Photoshop themselves to create an incredibly unrealistic standard of beauty.
Starting point is 00:08:10 Whereas other apps have fostered this idea that you want to look more real, you maybe even want to appear with blemishes on your face or slightly rumpled clothing. Instagram still really has this culture of looking really polished all the time. And some of that research looked specifically at that, and at how it affects teenagers' negative images of their own bodies. Yeah, there's an authenticity piece that's still important on some other apps that Instagram has kind of shied away from.
Starting point is 00:08:34 What else did the research say about how it's impacting young people and specifically young girls? So Facebook's algorithms, I think this was a key part of the hearing, are across all their platforms, from Facebook to Instagram, pushing people into potentially harmful content. My office created an Instagram user identified as a 13-year-old girl. She followed a few easily identifiable accounts on weight loss, dieting, eating disorders, and she was deluged literally within a day
Starting point is 00:09:07 of content pushed to her by algorithms that in effect promoted self-injury and eating disorders. Are you surprised by that fact? I'm not surprised by that fact. Facebook has internal research where they have done even more gentle versions of that experiment, where they have started from things like interest in healthy recipes, so not even extreme dieting. And because of the nature of engagement-based ranking and amplification of interests, that real account was pushed towards extreme dieting and pro-anorexia content very rapidly. So Facebook does this research showing very clearly that its own platform is harmful. What does Instagram do then in response?
Starting point is 00:09:55 What's interesting is what they don't do. They don't take action. They don't come public with this research. They don't bring in professionals on things like eating disorders and self-harm and show them the research and say, how can we change this, right? We don't see any of those steps taken. Instead, despite executives seeing this evidence, we see the current practices continue and Instagram's algorithms continue to push teenage girls towards this type of content that has proven to be harmful to them. Does it make sense that having a younger person get hooked on social media at a young age makes them more profitable over the long term as they have a life ahead of them? Facebook's internal documents talk
Starting point is 00:10:31 about the importance of getting younger users because they know that children bring their parents online and things like that. And so they understand the value of younger users for the long-term success of Facebook. They even decide to go one step further and start developing this thing known as Instagram Kids, which is going to target kids that are under the age of 13. And what we now see from internal research, they had these meetings where they talked about leveraging something as, you know, innocent as a playdate to try and get kids more hooked on Instagram
Starting point is 00:11:00 and to try to get really, really young children onto this platform. So that's the first bucket. What else were the senators interested in? You know, the senators did something unusual. They were really interested in the technical details of how Facebook works. Let me, Ms. Haugen, just ask you, we've learned from the information that you provided
Starting point is 00:11:18 that Facebook conducts what's called engagement-based ranking, which you've described as very dangerous. Could you talk more about why engagement-based ranking is dangerous? And I think this is something in the past that they've been scared of because you say the word algorithm and people think about, you know, long, complex equations and they don't want to ask about that. But one thing this whistleblower did was make it really simple. The way that they pick the content in Instagram for young users, for all users, amplifies preferences. She talked about what an algorithm is, which is Facebook making
Starting point is 00:11:50 decisions for you about what you see. And Facebook does that every day. It says, let's show them more of this type of content instead of that type of content. And so a lot of this research pointed to how Facebook is consistently making decisions that show you hate speech and misinformation despite having options to not do that. So give me some examples of how that played out. We have a few choice documents that contain notes from briefings with Mark Zuckerberg. So let's take a look at 2018.
Starting point is 00:12:18 Where he chose metrics defined by Facebook, like meaningful social interactions. Mark Zuckerberg says that he really wants to focus on meaningful social interactions. That sounds great, right? Like everybody wants meaning in their life. So MSI is meaningful social interaction. But this internal research shows that meaningful social interaction becomes MSI at Facebook. And what it actually is, is showing people more content from their friends and family that is potentially harmful.
Starting point is 00:12:46 Content that elicits an extreme reaction from you is more likely to get a click, a comment or reshare. Instead of seeing legitimate news sources about things like the election or about COVID, people are shown posts by their friends and family. And as anyone knows who's been on Facebook, posts by your friends and family aren't necessarily as factually accurate as what was coming from a news article or from the CDC. So what sounded like a good goal, Zuckerberg saying that they want to focus on meaningful social interactions or MSIs, actually plays out in a way that speaks to Facebook's larger issues, that in pointing people to family members and friends on the platform, it can actually spread more misinformation, particularly if it comes at the expense of seeing a more vetted news source. Exactly. What a lot of these MSIs are that Facebook is promoting is that really emotive content. It's that video that goes viral of somebody spreading a rumor or
Starting point is 00:13:45 conspiracy about COVID. Facebook is now amplifying that instead of the stuff that their own fact checkers are telling them is healthier for our news ecosystem. And so this was supposedly done in an effort to make Facebook less toxic. But it seems as if what you're describing is that it actually created new problems for Facebook. What did they do once that became clear? Mark was presented with these options and chose to not remove downstream MSI in April of 2020, even though... What's interesting here is after that, even after being shown research that it's potentially
Starting point is 00:14:22 harmful, that it's creating more misinformation and hate speech, they don't reverse course. I think that's the key thing in this research and in the testimony. Like we've just read 100 pages on how downstream MSI expands hate speech, misinformation, violence inciting content, graphic violent content. Why won't you get rid of this? So when they are shown the information, Facebook doesn't change. Why? It's good for their bottom line. And I think this goes back to kind of Frances Haugen's original line. There is a pattern of behavior that I saw at Facebook, of Facebook choosing to prioritize its profits over people. That Facebook is a business and they're really good at making money for themselves. And they stress this idea of company over country.
Starting point is 00:15:03 Facebook executives in these documents we see from Frances Haugen are regularly shown alternatives. They're given, let's say, five options of things they can do ahead of the election to stop conspiracies. And they're told five is the most extreme, and if we do option number five, people are going to spend five hours less a day on Facebook. But it's probably the best for our current sort of environment.
Starting point is 00:15:24 And then, you know, let's say option number one is the least intrusive. If we do that, people will continue to spend as much time on Facebook as they currently do. We see executives consistently choosing to take that least intrusive option because they don't want people to leave Facebook. They don't want to risk you spending fewer hours of the day on Facebook. The metrics make the decision. Unfortunately, that itself is a decision. So what this whistleblower is saying is that Facebook did the research, that it had the information, and it knew that its product was causing harm for a lot of vulnerable populations, but they did not roll that back partially because it was a growth opportunity and it keeps people on the platform for longer periods of time. Exactly. They essentially decide to keep doing things the way they have been.
Starting point is 00:16:12 And Haugen kind of looks at all that and summarizes it as these are morally bankrupt decisions. Part of why Facebook needs to come out and say, we did something wrong, we made some choices that we regret, is the only way we can move forward and heal Facebook is we first have to admit the truth. The way we'll have reconciliation and we can move forward is by first being honest and declaring moral bankruptcy. We'll be right back. So what does Haugen say should change about Facebook? Facebook wants you to believe that the problems we're talking about are unsolvable. I mean, she starts off, this is a company insider, so she starts off with things that Facebook can be doing itself to try and change things. They want you to believe that this is just part of the deal. I am here today to tell you that's not true. She talks about this idea of friction, which is a way of saying that Facebook
Starting point is 00:17:21 can create more friction before people share something. So make it harder to share misinformation. You know, like how on Twitter, you have to click through on a link before you reshare it? Small actions like that friction don't require picking good ideas and bad ideas. They just make the platform less twitchy, less reactive. One of those things that she said she raised herself was asking that people read an article before they actually share it. Now, as a journalist, I'm obviously in favor of that. But we now know from research at Twitter and other companies that this has really had an impact on people not sharing as many conspiracy theories and misinformation, if they have to click through and actually read something before they share. And Facebook's internal research says that each one of those small actions dramatically reduces misinformation, hate speech, and violence inciting content on the
Starting point is 00:18:09 platform. That does sound like a kind of achievable Facebook action. What else did she suggest? She suggests going back to something that some people might remember from the very earliest days of Facebook. Let's say, imagine we ordered our feeds by time, like on iMessage. Which is a chronological news feed. You would log into Facebook and you would just see posts in the order in which people made them. It would be a timeline, essentially,
Starting point is 00:18:35 instead of what we have now, which is Facebook's algorithms make decisions on what you see first. We don't want computers deciding what we focus on. We should have software that is human-scaled, where humans have conversations together, not computers facilitating who we get to hear from. And they favor the posts that are most vitriolic, the most emotive, the things that are going to make you the most angry or the most sad. And that's something that they have amplified over the last couple years. They want that kind of content at the top of your newsfeed because it keeps you really engaged.
Starting point is 00:19:07 But there's an argument to be made for going back to something that's purely chronological. But they don't want us to have that conversation because Facebook knows that when they pick out the content that we focus on using computers, we spend more time on their platform, they make more money. But why would Facebook ever make those changes voluntarily? Why would a for-profit company do something that they know would make them less money? I mean, Facebook could make a call that it wants to make those suggestions because it wants to create a healthier ecosystem for everyone. But unfortunately, that's not what most executives think. And so it does seem like pressure is going to have to come in from the outside for them to change. Left alone, Facebook will continue to make choices that go against the common good.
Starting point is 00:19:47 Our common good. When we realized big tobacco was hiding the harms it caused, the government took action. And that's why she was bringing up this analogy to big tobacco. The tobacco industry for years sat on research about how harmful it was to people's health, and they didn't make that public. And she kind of draws this analogy that Facebook is doing the same thing. It's sitting on a body of research
Starting point is 00:20:08 about how some of its products, especially Instagram, are bad for people, and it's not making the changes it needs to make. And so just like tobacco, she's saying that members of Congress have to take action. Even those who don't use Facebook are impacted by the majority who do. A company with such frightening influence
Starting point is 00:20:24 over so many people, over their deepest thoughts, feelings, and behavior needs real oversight. What would be the Facebook equivalent of a Surgeon General warning on cigarettes? Cigarettes are products. Facebook is this behemoth so ingrained in our society, the mode in which we communicate. Right. So this is obviously a lot more complicated than something as, you know, straightforward as tobacco. But we're seeing headway being made in how people talk about it. Ms. Haugen, your whistleblowing shows that Facebook uses harmful features that
Starting point is 00:20:57 quantify popularity, push manipulative influencer marketing. The fact that senators today even asked questions about the algorithm. And that's the algorithm. That's the algorithm. That algorithm could be changed. The algorithm definitely could be changed. Shows that they now understand the basics, the fundamentals of how this company operates.
Starting point is 00:21:18 What that suggests to me is that while they're saying they're not targeting teens with those ads, the algorithm might do some of that work for them. And that's the key difference here. Until now, senators have been fixated on these speech issues. We've heard, I think all of us, senators asking Mark Zuckerberg about their personal Facebook accounts or about why this account was banned versus that account.
Starting point is 00:21:40 Getting into speech issues is so complicated in a place like the United States that values the First Amendment. And instead of focusing on that, they now have a path forward with focusing on what Facebook amplifies. You know, there's a woman named Renee DiResta, who's an academic and misinformation expert at Stanford University. And she coined the phrase, freedom of speech is not freedom of reach. Let's think about the reach. Let's think about what Facebook is promoting to people. You can say whatever you want, but Facebook doesn't necessarily have to promote it. So in some ways, it seems like this hearing marked a turning point for even our own elected officials or government bureaucrats. They have finally caught up to understanding what seems to
Starting point is 00:22:18 be the central issue with Facebook, the entire algorithm business model. Exactly. You know, Frances Haugen really did a great job of putting into layman's terms things like the algorithm. Now, I'm still waiting to see what comes of this, right? Because we all know that there can be really interesting and effective hearings, but whether senators actually propose legislation after that remains to be seen. I do think what this hearing showed us is that Republicans and Democrats appear uniquely united on wanting to do something and on agreeing on a range of issues like the algorithm or, you know, teenagers on Instagram that they can move together on in a bipartisan way.
Starting point is 00:22:54 Well, let's take the unlikely scenario that the government actually does act, that they actually do agree to do something to regulate that algorithm. How much would that affect Facebook's bottom line? Well, it depends what kind of action they promote. If they're telling Facebook to change its basic newsfeed, right? If they're saying, hey, we want you to make this newsfeed chronological instead of favoring certain types of posts over another, that's probably going to have a huge effect. But I can't imagine they go for that kind of extreme option. I think instead what they do is say, you yourself have said that you don't want to promote certain things like COVID misinformation. Facebook has been really public by saying, we don't want as a company to promote COVID misinformation. Now, we as the government are going to hold you accountable and responsible if you continue to promote COVID misinformation. If you suggest to people to follow accounts that are known promoters of COVID misinformation
Starting point is 00:23:45 or amplify and recommend COVID misinformation groups and pages, we're going to hold you responsible for that. That is your recommendation algorithm pushing people towards something that you know they shouldn't be pushed towards. And how was that prospect received at Facebook? Well, if you're a Facebook engineer, and I spoke to one Facebook engineer late last
Starting point is 00:24:05 night who was like, yes, please, you know, this is what I want. This is what I've been asking my managers for. Let's do what we say we're going to do and let them hold us accountable. I think there are actually a lot of people inside Facebook that are hoping that's what happens. I think if you're a Facebook executive and you're thinking about the bottom line and you're thinking about competing against China and TikTok and all these other companies out there, this is awful, right? This is going to slow down your business. This is going to force you to think about something really in the weeds and mundane instead of your huge growth aspirations that you've been able to focus on until now. But couldn't making the platform less toxic be a good business decision? How do we know that
Starting point is 00:24:43 changes to the algorithm, using this research to minimize harm, aren't a good long-term strategy, that people will spend more time on a platform that they find less toxic? Well, that's the argument that Haugen is making, that Facebook can make decisions to make itself a healthier platform and that in the long run,
Starting point is 00:25:00 it'll actually improve their business. But the truth is we don't know because no one's tried it yet. They've decided that they really want short-term gain. They really want more users. And so they've made decisions to amplify hate speech and misinformation because it gets people coming back over and over again. And I think it's a great question, right? Like what happens if they were to decide the opposite? Would they end up with people coming back fewer times a day, but actually staying lifelong customers and wanting to encourage their children when they're old enough to join a place like Instagram or Facebook?
Starting point is 00:25:29 You know, listening to this hearing, I was wondering why Frances Haugen wasn't advocating for Facebook to be broken up. She wasn't saying burn it down. And we know that there have been growing calls from some to regulate Facebook almost into oblivion. Did that surprise you? You know, it didn't because all the Facebook insiders I've spoken to have taken the same line. They don't want to see Facebook broken up because then you create 10 mini problems. Right now you have really one big problem which you can try and solve. And Facebook has the resources and the manpower to try and solve it.
Starting point is 00:26:02 And for a long time, we just didn't have a focus on this, right? For a long time, the FTC, the Senate, really global regulators weren't paying a lot of attention to what Facebook was doing. Now we're in a different time. We're in a different era. Republicans and Democrats are uniquely united in agreeing that something has to be done. And we are seeing
Starting point is 00:26:25 momentum. But, you know, as you know better than anyone, that kind of momentum can move at a snail's pace, especially in Washington. Thank you, Sheera. Thank you. We'll be right back. Here's what else you need to know today. A Texas parole board has recommended a posthumous pardon for George Floyd, whose murder ignited worldwide protests against racial injustice and police brutality. Floyd was convicted in a 2004 drug case in Houston on the word of a Houston police officer who has since been charged with fabricating evidence and with murder. The final decision on the pardon rests with the governor of Texas, Greg Abbott.
Starting point is 00:27:37 And the Times reports that White House officials have acknowledged that their ambitious $3.5 trillion budget will have to be significantly cut to $2 trillion or less in a concession to moderate Democrats. The scaled-back budget will force cuts in key areas of Biden's agenda, including tackling climate change and investments in education. Today's episode was produced by Clare Toeniskoetter, Rob Szypko, Asthaa Chaturvedi, Rachel Quester, and Neena Pathak. It was edited by Paige Cowett and engineered by Chris Wood and contains original music by Dan Powell. Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly. That's it for The Daily.
Starting point is 00:28:24 I'm Astead Herndon. See you tomorrow.
