The NPR Politics Podcast - The Facebook Papers Show How Quickly Radicalization Can Happen Online

Episode Date: October 25, 2021

Thousands of leaked documents from Facebook were reviewed by more news organizations over the weekend, including NPR. The internal documents show the company struggling with how to combat misinformation and researchers worrying about the impact of the platform.

This episode: White House correspondent Ayesha Rascoe, political reporter Miles Parks, and tech correspondent Shannon Bond.

Connect:
Subscribe to the NPR Politics Podcast here.
Email the show at nprpolitics@npr.org
Join the NPR Politics Podcast Facebook Group.
Listen to our playlist The NPR Politics Daily Workout.
Subscribe to the NPR Politics Newsletter.
Find and support your local public radio station.
Learn more about sponsor message choices: podcastchoices.com/adchoices
NPR Privacy Policy

Transcript
Starting point is 00:00:00 Hi, this is Guillermo. And this is Miyuki. And we're both epidemiologists currently in a haunted corn maze looking for something other than COVID to give us a fright. This podcast was recorded at... It is 2:06 p.m. on Monday, October 25th, 2021. Things may have changed by the time you hear this, but hopefully we'll have found our way out of here. Because we are completely lost. Alright, here's the show.
Starting point is 00:00:28 Oh, wow. We're going to get another timestamp from them saying, we're still here. Please help. Oh, no. Hey there. It's the NPR Politics Podcast. I'm Ayesha Rascoe. I cover the White House. I'm Miles Parks. I cover voting and misinformation. And we've got NPR's tech correspondent Shannon Bond here with us. Hi, Shannon. Hey, guys. And Shannon, I'm sure that you have been really busy over the past few days. A Facebook whistleblower leaked a bunch of documents from the company to a number of news organizations, including NPR. And everyone may remember that this whistleblower, Frances Haugen, testified before Congress about these
Starting point is 00:01:14 documents earlier this month. The choices being made inside of Facebook are disastrous for our children, for our public safety, for our privacy, and for our democracy. And that is why we must demand Facebook make changes. When she testified, these documents had only been shared with a few reporters. Now they have been shared much more widely, and there are many thousands of pages that have now been reviewed by a number of outlets, and more is on the way. Shannon, can you start by like just breaking down some of the major takeaways from
Starting point is 00:01:52 these documents? Yeah, I mean, I'd say the sort of overall theme here is that many Facebook employees, as documented in internal research and in conversations on the company's internal message boards (and there is just so much material here in this trove of documents from Haugen), were aware of, and in many cases were ringing alarms about, the destructive effects Facebook can have on its users and on the broader world. And this is really important: it's not just here in the U.S., it's around the globe. But these employees say that, in large part, leadership has often resisted their efforts to make the platform safer. So whether that's slowing down the spread of misinformation, looking at the way people can
Starting point is 00:02:36 become radicalized in groups and, you know, foment these conspiracy theories like QAnon and drive real-world violence in places like India, Ethiopia, Myanmar. It's a really damning portrait from inside the company of just how extensively Facebook has been grappling with these problems, and, at least as it's framed in these documents, of the shortcomings in how the company has actually responded to them. Facebook is this massive entity that is constantly chasing problems,
Starting point is 00:03:08 often after they've already become so big that it's hard to get a real handle on them. I mean, Shannon and Bobby Allyn had a story come out this week. Shannon, you could talk about it a little bit more, but there's a lot of stuff in these documents showing how much of a role Facebook groups played in allowing the growth of the Stop the Steal movement, this idea that, you know,
Starting point is 00:03:32 Joe Biden's victory was fraudulent. People were able to use these groups and basically grow them so quickly that Facebook wasn't able to react before, you know, this idea had such a broad following. There's a whole section of these documents on kind of what the company was doing, both ahead of the election and then after the election, in the run-up to what happened on January 6. And, you know, frankly, Facebook spent years trying to make sure there was not a repeat of 2016, that there wasn't foreign interference. You know, they did all kinds of game planning for worst-case scenarios, you know, whether it was Russian trolls or a big hack. And as Election Day got closer, they were also, you know, I think really very concerned and preparing for the possibility
Starting point is 00:04:14 of a contested outcome and civil unrest or even violence. I mean, we all remember what the summer of 2020 was like, right? So one thing that the company did was put in place this emergency playbook. They call these "break the glass" measures. And the idea is that this is a way of basically kind of slowing the platform down: slowing down the spread of misinformation and of hate speech, giving their reviewers more time to sort of look at what's happening before things really, you know, go viral. And Facebook did roll out many of these measures before Election Day in November. So it stopped recommending groups. It suspended political ads. It declared the U.S. a high-risk
Starting point is 00:04:51 location so it could more aggressively just delete posts it thought might be harmful. And then voting went off pretty much without major incident on Facebook. And employees were feeling relief. But that didn't last long because, you know, as Miles said, suddenly we saw the rise of this: these angry Trump supporters, in the hours after polls closed, started coalescing around this slogan, Stop the Steal, and asserting this baseless claim that the election had been stolen. Late on election night, a group called Stop the Steal was created. And it grew really, really fast; it was adding tens of thousands of people every hour. And, you know, despite the fact that Facebook had some of these measures in place, it just wasn't able to control the growth of this group. And so they were kind of playing whack-a-mole, trying to tamp down these groups, but it obviously didn't work. Like, what does the whistleblower feel like Facebook could have done differently?
Starting point is 00:05:51 Right. Well, I mean, we should be clear. Like, Facebook did shut down this first Stop the Steal group. But as you say, it was a game of whack-a-mole because new groups kept popping up, making some of these same claims, you know, the same with calls for violence, all kinds of issues. And, you know, Facebook was trying to take them down, but it was in a very piecemeal sort of way, right? And it seems that they kind of failed to recognize that this wasn't just sort of these one-off groups violating the rules. There was much more coordination happening here. So one of the things Haugen has spoken about, and also something I've heard from other former employees at Facebook who were there at the time, one thing they say just very directly is that some of the emergency measures that I mentioned that were in place, you know, ahead of the election and on Election Day, Facebook started lifting after Election Day.
Starting point is 00:06:34 And so they think that was just really premature. There was clearly still misinformation spreading. There was clearly still heightened risk. And the company should have kept some of these in place longer. But there also sort of is a broader failure here that these employees, including Haugen, point to, which is that Facebook's own researchers have been warning for years that some of the very core ways that make Facebook Facebook, right, the emphasis on groups that we've seen grow in recent years, the way its recommendations work, right, where it tells you, we think you're
Starting point is 00:07:05 going to be interested in joining this group, we think you're going to be interested in this post. And the way posts can quickly go viral has real risks that the company just hasn't addressed. And they say it's because it's more focused on keeping the platform growing. And that gets at Haugen's broader criticism that Facebook puts its profits ahead of safety. What is Facebook saying about all this? And we should note here that Facebook is a recent funder of NPR. But what is Facebook saying? Well, you know, Facebook's pushback specifically on the stuff about Stop the Steal and January 6th is that, you know, the line it has taken is that responsibility lies with the people who stormed the Capitol, with the people who incited them. And the company has talked about how it's invested a
Starting point is 00:07:50 lot of money in planning for the election. It did a lot of work to take down harmful groups and posts, and it says that just focusing on these "break the glass" measures doesn't capture the totality of what it did, you know, ahead of 2020 and in the weeks after. But more broadly, the company also disputes Haugen's claim that it prioritizes growth over safety. It says that's just a misrepresentation. But I think, you know, it is hard in a way for them to push back against a lot of these stories that are coming out. And so much of this that I think is truly damning is hearing from Facebook employees themselves saying, you know, we feel the company is falling short. All right. We're going to take a quick break here.
Starting point is 00:08:28 And when we get back, we'll talk about what else we've learned from these Facebook papers. And we're back. One of the, like, really interesting parts of these papers was this discussion of how Facebook researchers actually created a fake user account to see what a kind of, quote, regular person's experience might be like. So they created this profile of Carol Smith, you know, a fake person who was interested in politics, Fox News, and Donald Trump. And it seems pretty clear that they were able to see how people might be radicalized, because only two days after the account had been created, Carol Smith was being directed to QAnon groups. So it shows how you can go down the rabbit hole.
Starting point is 00:09:29 And as we've said, this doesn't only happen like in the US, like there was a similar project in India, right? Yeah, that's right. I mean, this is some of the most interesting research, I feel like, that I came across looking at this: you're seeing these Facebook employees kind of trying to grapple with, like, how do our recommendations work? And it's not sort of like, oh, right, I'm kind of getting into, like, you know, conspiracy theory light, and then that leads me down the path. It's like, literally, they start this account, or in the example of the accounts in India, they start a user's account, you know, following a couple of sort of mainstream pages, news outlets.
Starting point is 00:10:07 And then they're sort of following this rule: they're going to start to follow whatever Facebook recommends to follow. Right. So a lot of the way Facebook works is it looks at what you interact with and what you're interested in. And then it says, OK, you might be interested in this. And the results in India were apparently pretty horrifying. I mean, this user, you know, ended up being served up tons of hate speech, misinformation, violence. I mean, the researcher said, following this test user's news feed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total. I do think the other thing about this whole concept of experimenting and creating Carol Smith and
Starting point is 00:10:45 finding out she's into QAnon in three days. This is not a situation where the platform hasn't gone public yet and a researcher is, like, trying to figure out how it works. There are, you know, millions and millions, tens of millions of people already using the platform, and the researchers on the back end are then trying to figure out, oh, how is this affecting people? And I think back to a conversation I had with a researcher a couple months ago when I was doing that story on mental health and Facebook. And he said, this is the largest-scale science experiment in the entire world. Like, basically, it seems from these documents that no one at Facebook has a great understanding of how these platforms are
Starting point is 00:11:25 affecting people, and yet we're all using them and finding out in real time how they're affecting people. So the bigger issue, kind of a broader question here, is: Facebook is a platform, it's a company that makes money from people using it and people sharing content. The users who then went and stormed the Capitol, they were just using the tools of the platform. Like, is there a way for a social media platform to exist where you bring together all these people, and people can get together, and if your interest is, like, overthrowing the government, you can link up with all these other people and also try to get other people on your side? Like, how does it not devolve into violence and mayhem? That is the key question.
Starting point is 00:12:17 This is something you see in these documents Facebook employees really struggling with. You know, it's informing the internal research they're doing into exactly how Facebook is causing real-world harm. You see it in these very, you know, sort of agonizing internal discussions about whether there is a way to do Facebook safely. Now, you know, one person who says that it is possible is Facebook's former head of civic integrity. He was actually Haugen's boss. His name is Samidh Chakrabarti. And he left the company this year, but he's been tweeting a lot about everything that's been happening with these documents. And one thing he said recently is, you know, he thinks it is possible to have sort of a network at this scale, you know, with even this kind of virality, you know, in some cases, but you need far more guardrails, right? So you need to identify where the potential points of harm are and really work to mitigate them. And sort of as Miles was saying, you know,
Starting point is 00:13:20 what we often see is sort of Facebook kind of creates a problem and then has to scramble to solve the problem it's created. You know, I think there's a real feeling among many of these former employees that they've got that backwards. They need to be sort of thinking about how they're building these features in the first place, taking into account how they could be abused. The other thing that Samidh has said is that you need to have a point of view on what is good and what is bad. And, you know, taking that kind of position is something that we've seen Facebook and CEO Mark Zuckerberg really avoid, right? You know, he's famously said that the company doesn't want to be the arbiter of truth. All right. Well, Shannon, thanks so much for reading those thousands of pages.
Starting point is 00:13:58 So we didn't have to. And thanks for joining us. Thanks for having me, guys. It's fun. Good luck, Shannon. I'm Ayesha Rascoe. I cover the White House. I'm Miles Parks. I cover voting and misinformation. And thank you for listening to the NPR Politics Podcast.
