The NPR Politics Podcast - Hear What A Facebook Insider Told Congress About How Its Apps Hurt Kids

Episode Date: October 5, 2021

Former Facebook product manager Frances Haugen told senators that the company knows its products harm children and stoke division, but that executives have continued to prioritize growth over safety. This episode: White House correspondent Scott Detrow, congressional reporter Claudia Grisales, and tech correspondent Shannon Bond.

Connect:
Subscribe to the NPR Politics Podcast here.
Email the show at nprpolitics@npr.org
Join the NPR Politics Podcast Facebook Group.
Listen to our playlist The NPR Politics Daily Workout.
Subscribe to the NPR Politics Newsletter.
Find and support your local public radio station.
Learn more about sponsor message choices: podcastchoices.com/adchoices
NPR Privacy Policy

Transcript
Starting point is 00:00:00 Hi, this is Kirsten from Grand Canyon, Arizona. Yes, that Grand Canyon. I'm on my lunch break from being a park ranger. This podcast was recorded at... I assume that has better lunch break views than like any other place in the world. It is 2:07 Eastern on Tuesday, October 5th. Things may have changed by the time you hear it, but I will still be asking guests to not feed the squirrels. All right, here's the show. Hey there, it's the NPR Politics Podcast. I'm Scott Detrow. I cover the White House. I'm Claudia Grisales. I cover Congress.
Starting point is 00:00:37 Claudia, I recently learned the White House is actually a national park as well. The ground it is on is a national park. There are squirrels there too. No one is telling me not to feed the squirrels though. I know. I think we have more squirrel freedom here with what we want to do. Yeah. Take that. So today in Congress, a whistleblower who used to work at Facebook is in front of a Senate committee testifying about the way she says the company has misled the public. During my time at Facebook, I came to realize a devastating truth. Almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world. The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the
Starting point is 00:01:25 safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages. I came forward because I believe that every human being deserves the dignity of the truth. That's Frances Haugen, a former product manager for Facebook. She has provided documents backing these claims to federal authorities and to news outlets. NPR's Shannon Bond is also here again. Hey, Shannon. Hey, Scott. And Shannon, you cover Facebook.
Starting point is 00:01:54 We do need to say right off the top that Facebook is among NPR's financial sponsors. But we are going to talk about this testimony today and how this possibly changes the picture when it comes to how Congress regul testimony today and how this possibly changes the picture when it comes to how Congress regulates Facebook and social media. Shannon, can you start by reminding us how Haugen found her way in front of this committee? Because she has made a remarkable series of choices that have been coming public in very different ways over the last few weeks. So Haugen used to work at Facebook. She spent about two years there working on a team called Civic Integrity combating political misinformation. And what's important to know about her is her background is actually in how algorithms are designed. Right. So like how social felt like it was failing to make its platform safer if those changes risked its growth and profits. And it was hiding what it knew about the problems on its
Starting point is 00:02:51 platform from the public and from regulators. So when she left Facebook earlier this year, before she left, she copied tens of thousands of pages of internal documents. These were things like internal research, communications. And a lot of these documents appear to show that the company was very well aware of the ills of its platforms, things like the toxic risks of Instagram to some teenage girls, to mental health, the prevalence of drug cartels and human traffickers, and issues with how its algorithms seem to be actually encouraging a lot of negative, angry, divisive posting. And she says all of that is evidence that the company puts profits ahead of people. And so
Starting point is 00:03:30 these documents, they formed the basis of this big investigative series by the Wall Street Journal. But she also shared them with the SEC and with members of the Senate committee. And so today, the Senate committee wanted to hear from her. One of those moments was Minnesota Senator Amy Klobuchar talking to Haugen about this specific example. And I think it's some of the reporting that's gotten the most attention about how this content, you know, how Facebook employees were bluntly talking about the way that that some of the content on Instagram in particular can lead to eating disorders among teenage girls. Facebook knows that their engagement-based ranking, the way that they pick the content in Instagram for young users, for all users, amplifies preferences. And they have done something called a proactive incident response, where they take things that they've heard, for example, like, can you be led by the algorithms to anorexia content? And they have literally recreated that experiment themselves and confirmed, yes, this happens to people.
Starting point is 00:04:31 So Facebook knows that they are leading young users to anorexia content. Do you think they are deliberately designing their product to be addictive beyond even that content? Facebook has a long history of having a successful and very effective growth division where they take little tiny tweaks and they constantly, constantly, constantly are trying to optimize it to grow. Those kinds of stickiness could be construed as things that facilitate addiction. Shannon, I have several questions for you based on that. First of all, just to help listeners out, can you give us a good definition of the difference between engagement-based ranking and other ways that social media platforms can show you what you're going to be familiar with this. So what Facebook is doing is choosing what to show you, right? They're not going to show you every single post that comes out.
Starting point is 00:05:29 What they're trying to look at is what have you interacted with in the past? You know, what are you most likely to be interested in? What are you most likely to comment on or reshare yourself? And they're using all of that, those measures of engagement to decide what to show you, what they think is going to be most relevant, ultimately to keep you on Facebook, to keep you engaged. Same thing on Instagram. And so what Haugen is saying is that kind of engagement-based algorithms, that kind of ranking, which is what a lot of her expertise is in, you know, is actually having really, really harmful effects because, you know, in the example she was giving about eating disorders, I mean, what, you know, what may happen is, in her words, a teenager is on Instagram and searches for
Starting point is 00:06:10 something about dieting content. Well, that's a signal to Instagram about what she's interested in. And then, you know, that signal can be reinforced. Suddenly the algorithm is like, okay, great, we're going to show you more dieting content, more fitness content, and that can quickly escalate into, youate into eating disorder content and other issues. Claudia, was anyone on the panel defending Facebook or even just saying, hey, maybe this is more nuanced? Or was this a situation where, shockingly on a lot of fronts, Republicans and Democrats came into this with the same worldview?
Starting point is 00:06:43 Yeah, that was another detail that stood out today is kind of the bipartisanship, if you will, in terms of concerns that they had and in some ways are being validated by these documents, that they were worried that these social media products were having a destructive impact on individuals, including children. And they're raising that alarm along with this witness and saying that something needs to be done. Shannon, how has the way that Congress has approached issues like this changed over the years? Are lawmakers more serious about putting actual regulations in place? And more broadly, do lawmakers seem to have a better understanding of how these massive companies
Starting point is 00:07:24 actually work? Because we have seen many examples over the years of lawmakers not quite getting it and almost talking a foreign language when they try to ask serious questions to executives from Facebook and Twitter and places like that. I think they're just taking this a lot more seriously. There's been a huge amount of engagement. I mean, today's hearing was really quite different because it was a real novelty not to just see these senators yelling at a representative of the company, right? Like in that sort of performative way. A lot of the questions really were incisive and I think reflected, you know, a much better grasp of how this company works, how these algorithms work, you know, what are these design decisions that Facebook is making? I think they're really,
Starting point is 00:08:02 you know, getting in much harder at that. All right. A lot more to talk about on this. But first, we're going to take a quick break. We are back. As all of this has come out, as all of these damning reports have come out in this testimony, Facebook has, of course, been responding. A Facebook representative was on NPR's Morning Edition this morning. Shannon, what's the general overview of what Facebook is saying about Frances Haugen's testimony and the documents she's provided? Yeah, I mean, the company is, I think you would expect, pushing back pretty aggressively on this. They've actually just, in my inbox just now, I have a statement from their policy communications director. The issue that
Starting point is 00:08:41 Facebook is hitting on right now in the immediate aftermath of the hearing is basically saying, you know, look, Haugen didn't work on a lot of these issues. She didn't work on Instagram. She didn't work on child safety. They're saying she didn't have any direct reports. She wasn't sort of in a decision-making role at the company. I think they're implying here she's out of her depth. And they've also said they don't agree with her characterization of many of these issues, of the research. They say it's been mischaracterized and that they do care about user safety. And it's not accurate to say that they put profits over safety. I mean, I think what we've seen, though, I mean, Haugen admitted this much in the hearing. There are many times when she was asked about things by the
Starting point is 00:09:21 senators and she said, well, I didn't work directly on that, or that's outside of what I did. But what she was able to do, what's so powerful about her testimony, right, was point to the internal documents, point to the research and say, this is what people at Facebook were saying. This is what the results of their own surveys were saying. And I think that's really compelling. And I think it's harder in a lot of ways for Facebook to push back against that. And that's what I want to ask you about. Because when you think about whistleblowers in the past who have really changed the direction of things, I think there's two camps, right? There's the shocking new revelations that nobody knew anything about camp. And there's the camp where they're talking about something that a lot of people generally had a sense of, right? Like, is tobacco
Starting point is 00:10:02 bad for you? Of course it is. But here are lots of incriminating or really damning details of how that conversation is happening inside the company. For you, you have covered Facebook for a long time. Which camp is this falling into? Like, are you learning things you have never heard before about Facebook? Or is this kind of, oh, that's how they're having these conversations? I mean, I think it is a bit of both. I mean, I think on the latter point, you know, I think what makes Haugen a really remarkable witness here and makes these documents so compelling is that they are coming from inside Facebook. These are criticisms we've seen lawmakers make of the company. We've seen outside researchers ring alarm bells, right, about, you know, how engagement is driving polarization, is driving, you know, really divisiveness and anger?
Starting point is 00:10:46 Or, you know, what are the mental health impacts of, you know, a very photo-focused platform like Instagram on, you know, young users? But what's different here is that we are seeing how the company itself is talking about it. What I've learned and what surprised me in a way, you know, is like seeing in these documents just how concerned Facebook is about its future. Right. I mean, this is something The New York Times, Kevin Roos wrote about today. You know, this this issue of many of their products are not growing as fast as they used to, especially in the U.S. You know, this whole focus on Instagram and teenagers and this idea that the company has now put on hold of creating a version of Instagram for younger users, a lot of that motivation seems to be driven by the fact that Instagram is losing market share among kids to rivals like TikTok. And they're worried. What is the future of the platform if they don't get this next generation of users?
Starting point is 00:11:39 You know, if I could add, this testimony today really resonated with me in two different ways. As a journalist, wondering about the issues they're dealing with and really bringing that out in detail. And also as a parent to two teen girls, Instagram was a huge battle in our house. I had to bribe our girls multiple times to keep them off Instagram for a number of years because I was so worried about the impact. And just hearing all this, you just realize, wow, there really were. These are some of the issues I suspected might be happening, that it could negatively impact teens this way. And it's coming out in living color today. Eating disorders are serious, right? There are going to be women walking around this planet in 60 years with brittle bones because of choices that Facebook made around emphasizing profit today. Or there are going to be women in 20 years who want to have babies who can't because
Starting point is 00:12:30 they're infertile as a result of eating disorders today. They're serious. And I think there's an opportunity here for having public oversight and public involvement, especially in matters that impact children. Claudia, let's end with this, though. You have noted that there is a surprising, I guess it's surprising and not surprising, because on one hand, this is a very clear cut issue. And and these details are really alarming on so many fronts. So of course, there would be
Starting point is 00:12:54 a consensus. But on the other hand, the the situation in Congress right now is so toxic in a whole different way of the two parties not being able to get anything together. Yes. Given both of those very different factors, do you see any path towards some sort of legislative fix here, some sort of effort to really regulate Facebook in a different way? I think that will be very hard to come by. Even though we saw this spirit of bipartisanship, these worries from both sides of the aisle today, they are also very dug in, both parties, on some very big issues right now. And it's not clear that they're going to get past that and get to something like this in terms of this legislation, like we're hearing Blumenthal and others saying, you know, the days of no oversight are over
Starting point is 00:13:43 numbered for Facebook. But getting from there, those statements to actual legislation, that's going to be really tough to do in Congress today with how partisan really it comes to passing any bills on this. All right, we're going to wrap this conversation up for today. Shannon Bond, thank you so much for coming back on the podcast. Always happy to be here. All right. And I'm Scott Detrow. I cover the White House. I'm Claudia Grisales. I cover Congress.
Starting point is 00:14:10 Thank you for listening to the NPR Politics Podcast.
