The Daily - Facebook’s Plan to Police the Truth

Episode Date: July 20, 2018

The last time Facebook came under such intense scrutiny was when Mark Zuckerberg, the company’s chief executive, defended himself before Congress in April. But his latest policy on false news has turned the spotlight back to the social media giant. Guest: Kevin Roose, who covers technology for The New York Times. For more information on today’s episode, visit nytimes.com/thedaily.

Transcript
Starting point is 00:00:00 From The New York Times, I'm Michael Barbaro. This is The Daily. Today, the last time Facebook was under intense scrutiny was when Mark Zuckerberg had to defend himself in front of Congress. Now, he's bringing scrutiny back on Facebook with his plan to police the truth. It's Friday, July 20th.
Starting point is 00:00:47 We welcome everyone to today's hearing on Facebook's social media privacy and the use and abuse of data. So in April, we saw Mark Zuckerberg testifying on Capitol Hill about Cambridge Analytica, this data firm that had gotten illicit user data and was using it to help, among other people, the Trump campaign. Kevin Roose covers technology for The Times. We didn't take a broad enough view of our responsibility, and that was a big mistake. And it was my mistake, and I'm sorry. And what resulted from those hearings was sort of a parade of grievances. You had all kinds of issues coming up.
Starting point is 00:01:21 Mr. Zuckerberg, I will say there are a great many Americans who I think are deeply concerned that Facebook and other tech companies are engaged in a pervasive pattern of bias and political censorship. I mean, you have people who think Facebook is biased against one political party or the other. We've seen how foreign actors are abusing social media platforms. You have people who think that Facebook is too easily exploited by Russian information operations and other foreign influence campaigns.
Starting point is 00:01:55 One of the key issues here is, is Facebook too powerful? Are you too powerful? And then you have people who just don't think that Facebook is capable of regulating itself, who think that this company has been allowed to grow to this enormous size with very little oversight, and that now they should have a little more scrutiny. So this concludes today's hearing. Thanks to all the witnesses for attending. The hearing is adjourned.
Starting point is 00:02:21 So after these hearings, Facebook embarks on this kind of corporate makeover. It meets with lawmakers. It expands its fact-checking programs. It opens an independent investigation led by a former senator into whether Facebook is biased against conservatives. And most importantly and most visibly, it takes out a very large ad campaign. We came here for the friends. And we got to know the friends of our friends.
Starting point is 00:03:07 Then our old friends from middle school, our mom, our ex, and our boss joined in to wish us happy birthday. And we discovered our uncle used to play in a band and realized he was young once too. And we found others just like us. And just like that, we felt a little less alone. But then something happened. We had to deal with spam, clickbait, fake news, and data misuse. That's going to change. From now on, Facebook will do more to keep you safe and protect your privacy, so we can all get back to what made Facebook good in the first place. Friends. Because when this place does what it was built for, then we all get a little closer.
Starting point is 00:03:54 So the overall message is, something bad happened. We're going to fix it for you. Right. And then, in the last few weeks, they've started inviting reporters into their office to tell them how they're going to do this. Mm-hmm. Did you go? I did.
Starting point is 00:04:11 Last week, there was a group of reporters that went to Facebook's office. There are little, you know, appetizers and drinks, and then we sit down around a big conference room table, and they play a video. Facebook and other social media sites are being criticized for not doing enough to stop bogus stories that seem to dominate the election cycle.
Starting point is 00:04:35 This 12-minute documentary that they've made called Facing Facts. It's basically their version of a chronicle of how they approach false news and misinformation on Facebook. For a time, we felt our responsibility was mostly around just trying to help organize that information that you in some sense had asked to see. And it has interviews with people on their teams. One of the challenges in misinformation is that there is no one consensus or source for truth. It shows them thinking through and sort of trying to figure out what is or isn't false news.
Starting point is 00:05:13 The truth has this unfortunate aspect to it that sometimes it is not aligned with your desires. It is not aligned with what you have invested in, what you would like. Facebook has had rules forever about what you can and can't put on its platform. They've had rules against nudity. They've had rules against graphic violence. They've had rules against harassment. But the one category that they have resisted wading into until pretty recently is this idea of what's true and what's false
Starting point is 00:05:44 and what's deliberate misinformation and what's people just being wrong on the internet. And why have they resisted that? Why is that the last holdout? Well, because how the hell do you build something that can accurately and completely rid a platform of misinformation that has 2 billion people on it. And frankly, it's also where they face the most political danger because as we know, our political climate is not dominated by people who agree on a single set of facts. And it's basically their attempt to say, look, this is really hard. We're dedicated to finding our way through it, but it might take us a while and it's going to involve some really important and hard trade-offs.
Starting point is 00:06:27 One famous example of this is the photo of the Seattle Seahawks supposedly burning an American flag in a locker room. So they had a story about a Photoshopped picture of the Seattle Seahawks burning an American flag, which never happened, but the Photoshopped picture went viral on Facebook. One of the cases of ideologically motivated misinformation was the story of an undocumented immigrant.
Starting point is 00:06:51 They had a report that was about an illegal immigrant who allegedly committed arson, which was not true. These are things that were explicitly designed and architected to be viral. These are the hoaxes of the world. These are things like Pizzagate. This is just false news. I mean, all this anger and frustration, they're basically saying, look at what we actually have to deal with every single day. Right. I mean, I think it's pretty remarkable how little we know about how Facebook makes decisions,
Starting point is 00:07:19 given that they sort of govern the speech and media consumption of a big chunk of the world. This was really one of the first times that we've actually been able to see how they make these decisions. Once we were able to define the tactic or the problem area, we're able to make more progress on it. And it was a very controlled and sort of sanitized version of that. But it did show some of the thinking behind what they eventually put into policies. Bad behavior is using tactics like spamming to try to spread a message. So this documentary is meant to say it's going to be very hard to do this, but we're finally going to develop a system for misinformation
Starting point is 00:08:02 that gets at this incredibly thorny question of the nature of truth. We have to get this right, not just for our platform, but for the community of people that we serve around the world. And what do you make of this presentation inside Facebook's offices? How do the reporters in the room respond to it? It got pretty contentious pretty fast. I mean, I think after the video ended... Hi, Kevin Roos from The New York Times. I asked a question about, well, in the video, I thought that was a very interesting video. Thank you for showing it. Some of the examples
Starting point is 00:08:49 that were used in the video, like the Pizzagate example, the Wine County arsonist story, the fake photo of the Seahawks burning, those came from, in some cases, organizations with millions of Facebook followers. One was InfoWars, one was Breitbart, one was a group called Vets for Trump. Have you taken any action against those specific organizations that you called out in your video? And what is the response from the Facebook officials standing near you? Well, they said basically that pages that repeatedly put out false news, pages that repeatedly violate their guidelines, pages that are threatening or harassing people,
Starting point is 00:09:30 will be penalized. They will be demoted, essentially made less visible to their followers. So kind of buried. Yeah, buried. And, you know, their distribution will be cut in Facebook's words. And then a CNN reporter raised his hand and said, Buried and, you know, their distribution will be cut in Facebook's words. All regards from CNN.
Starting point is 00:09:46 And then a CNN reporter raised his hand and said, well... I'm kind of curious, like, Facebook is really devoted to fighting fake news and false news. How does InfoWars have an account on your website? What about InfoWars? I mean, if you are so dedicated to combating misinformation on Facebook, you are so dedicated to combating misinformation on Facebook, why do you let this page exist that has repeatedly spread conspiracy theories, things about Pizzagate, things about the Parkland shooting, about Sandy Hook before that? Why are they even allowed on the platform at all? And what was the response?
Starting point is 00:10:19 The guy who runs News Feed basically says, look, just being false doesn't violate our standards and that InfoWars has not violated a rule that would require taking them down. And he said, you know, basically we created Facebook to be a place where different people have different points of view and different publishers think about these things differently. And we want to respect the diversity of opinion on Facebook. Thank you for taking the time now. Have a good rest of your day. You too. Thank you.
Starting point is 00:10:50 So the meeting wraps up and we all go back to write our stories. And that InfoWars question sort of becomes the story of this meeting. Then it generated a sort of groundswell of controversy on both sides that persisted and led up to Monday when... Good morning. We welcome everyone to this morning's hearing on examining the content filtering practices of social media giants. Representatives from Facebook, Twitter, and YouTube went to Congress to talk about censorship and moderation on social media.
Starting point is 00:11:28 There were lots of questions from lawmakers. Lawmakers on the Democratic side brought up InfoWars. So just to explain, what's happened with InfoWars? Because they've made a cottage industry out of this. What they do is they deny that these events have happened. Why are they still on Facebook? We have removed content from the InfoWars page to the extent that it's violated our policies. They have not reached the threshold at which they are entitled.
Starting point is 00:11:49 What's the threshold? It depends, Congressman, on the nature of the violation. So there are sometimes more severe violations. Lawmakers on the right said, why are you censoring sites like Gateway Pundit, which is another sort of right-wing site that traffics in conspiracy theories. It's a matter of congressional record that Gateway Pundit, which is another sort of right-wing site that traffics in conspiracy theories. It's a matter of congressional record that Gateway Pundit, Mr. Jim Hoft, has introduced information into their record that in the span of time between 2016 and 2018, he saw his Facebook traffic cut by 54 percent. And could you render an explanation to that for him and for me, Ms. Bickert? Thank you, Congressman.
Starting point is 00:12:27 I can't speak to any one individual's decline in reach or popularity on Facebook. I can say that we do change the way that our newsfeed algorithm works. The algorithm, basically, it's individualized, and it gives people, it sorts or ranks content for each individual user based on people that they follow, pages that they follow, groups that they belong to. So it's an inventory of content that they have chosen. This seems to be a pretty clear sign that Facebook really can't win. It comes up with a system to demote misinformation, to kind of bury it online, and they think it's a good solution. And Democrats say it's not enough.
Starting point is 00:13:10 Republicans say it's too much. They're improving Facebook's own point that coming up with a system to divine truth is really hard. It's hard unless you're willing to make some politically dangerous and unpopular choices. What do you mean? Well, it would be very easy for Facebook to say,
Starting point is 00:13:32 we're not going to allow people to say that the earth is flat on Facebook and we're going to take down any content that says that the earth is flat. I'm a member of a mid-sized flat earth Facebook group that is fascinating. You are? I am. You actually think the earth is round, Michael? I'm a member of a mid-sized flat-earth Facebook group that is fascinating. But— You are? I am. You actually think the earth is round, Michael? No, it's a fascinating corner of Facebook.
Starting point is 00:13:54 I'm there for journalistic purposes, I promise. But I think it would be very easy for them to say, that's obviously false. We're going to take that down. Right. And then you demand and your pals in that group, gone. Exactly. But when it comes to more sort of politically salient issues, no matter what you do, you're going to make someone angry. It makes me wonder how much of Facebook's decision making in all of this, do you think is built around a fear of the politics of this moment and the
Starting point is 00:14:22 intensity of the political divisions? It seems like we now know that Facebook helped in many ways bring us to this moment because of the way social media amplifies divisions. And that may have been inadvertent, but it seems like on some level, they're now making decisions based on fear of making it worse by seeming to pick a side. Absolutely. They don't want to be accused of bias. That has been one of their guiding principles for the last several years. And what do you make of that? I think this is a symptom of something larger, which is that Mark Zuckerberg appears to be
Starting point is 00:14:58 profoundly uncomfortable with the power that he has. He built a company that swallowed communication and media for much of the world. And now I think you're seeing him sort of back away from that, saying, I don't want all this power. I mean, the problem with ruling the world is then you have to govern the world. And that's not, it seems like, what he wants to do. Some people feel you are a nation state in a lot of ways.
Starting point is 00:15:29 We're not. We're a company. Okay. You know that. You know people think of you in a much more powerful manner, I guess. I think we have a lot of responsibility. Well, that's kind of a beautiful segue to the real reason why we're talking to you, Kevin, which was this interview that Mark Zuckerberg did with our future colleague, Kara Swisher, about Facebook and about misinformation. Yeah, so he has this interview with Kara Swisher at Recode. What is that response?
Starting point is 00:15:57 What do you feel like? Do you think you have understood it? Because there's a lot of ways someone was saying to me, you can't just pass power along. You have an enormous amount of power. Do you understand that or do you think about that or you don't think you have? No, I think we have a big responsibility, but I'm not sure what you mean by pass power along, but I actually think one of the things that we should be trying to do
Starting point is 00:16:24 is figure out how to empower and build other institutions around us that are important and can help figure out these new issues on the internet. And he keeps saying, I think about it as a responsibility. He's very uncomfortable with the idea that he is the most powerful, one of the most powerful people in the world. And he keeps insisting, you know, I want to empower other people. I don't want to be the emperor on the throne. I want to give other people the ability to govern their own communities and lives. Let's talk about Infowars. Let's use them as the example. Sure. Make the case for keeping them and make the case for not keeping, not allowing them to be distributed
Starting point is 00:17:07 by you. Kara and Mark start talking about InfoWars. And we feel like our responsibility is to prevent hoaxes from going viral and being widely distributed. So the approach that we've taken to false news is not that, not to say you can't say something wrong on the internet. I think that that would be too extreme. Everyone gets things wrong. And if we were taking down people's accounts
Starting point is 00:17:34 when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that. But at the same time, I think that we have a responsibility to when you look at the top 100 things that are going viral or getting distribution on Facebook within any given day, I do think we have a responsibility to make sure that those aren't hoaxes and blatant misinformation.
Starting point is 00:17:55 Kara asked Zuckerberg about Sandy Hook and the massacre that happened there. Okay, Sandy Hook didn't happen is not a debate. It is false. You can't just take that down not a debate. It is false. You can't just take that down. I agree that it is false. And this was really sort of InfoWars' flagship issue for years. What they were known for was their promoting of a conspiracy theory
Starting point is 00:18:17 that Sandy Hook had been a hoax. And I also think that going to someone who is a victim of Sandy Hook and telling them hey no you're a liar that is harassment and we actually will take that down but overall you know I mean let's let's take this a little closer to home right so I'm Jewish and then he unprompted or brings up this example of the holocaust and there's a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don't believe that our platform should take that down because I think that there are things
Starting point is 00:18:55 that different people get wrong. I don't think that they're intentionally getting it wrong. In the case of the Holocaust deniers, they might be. It's hard to impugn intent and to understand the intent. But I just think for as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I'm sure you do. I'm sure a lot of leaders and public figures we respect do too.
Starting point is 00:19:25 And I just don't think that it is the right thing to say we are going to take someone off the platform if they get things wrong, even multiple times. So he's essentially appealing to ideas that we've developed in our democracy over hundreds of years. I mean, he's saying that's the way we think about free speech is the way the U.S. government thinks about free speech. There's no rule that that has to be the case. There's no reason he has to say we're going to draw the line on free speech here as opposed to somewhere else. That's a choice that he's making. And I think that's sort of a way of avoiding the choice altogether is just to say, well, free speech is protected in the public sphere, so we're going to protect it in largely
Starting point is 00:20:10 the same way on Facebook. But Facebook can do whatever it wants. I mean, this is one thing that I think is broadly misunderstood about Facebook. We speak about it as if it is a democracy, something that has to have a consistent set of values, that it applies equally to all of its users, and that there is some principle of fairness and equality that it must apply. It's a company. Facebook has no obligation to protect any particular kind of speech or ban any other kind of speech. If they wanted to, they could say, well, if you deny that the Holocaust happened, we're going to lock you out and make you write an essay about the Holocaust until we find it acceptable, and then we're going to let you back
Starting point is 00:20:53 on. To be clear, I don't think they should do those things necessarily, but they could. I mean, the point here is that Facebook has a ton of agency, but it is acting as if it's constrained by these American notions of free speech unless there's a reason to depart from them. I mean, there are certain things that Facebook is legally required to take down in certain countries. So in countries like Germany, there are strict hate speech laws where you actually have to take down denials of the Holocaust and other types of hate speech, or you're legally liable for that as a platform. Right. Facebook could be more like Germany if it wanted to. But doing that would force Zuckerberg to make his power more known and more seen. Right. It would force him to be proactive rather than reactive. Mark Zuckerberg
Starting point is 00:21:42 wants to give power to other people, but he wants to make sure that they use that power responsibly. And that's a really hard thing to do. We've seen what happens when Facebook builds a thing and gives people the power to use it in whatever ways they want. And it is really destructive. And so now we're sort of asking him to accept the power that he has and to use it wisely. And I still think he's reluctant to do that. He still seems like he has accidentally put himself in the throne. And now he is trying to give that back. And we're saying, no, you can't do that. Like, you broke it, you buy it, you know? And, like, that's... You make it, you own it. Yeah, you make it, you own it. And, like, this is the fundamental tension that I think they're working through
Starting point is 00:22:32 is whether he wants to or not, Mark Zuckerberg has to solve this problem now that he's created. And if he can't solve it, and if he's not willing to solve it, maybe he shouldn't have all this power. Maybe he shouldn't be in the position of getting to determine what is free and acceptable speech. Thank you, Kevin.
Starting point is 00:22:54 Thank you for having me. We'll be right back. Here's what else you need to know today. On Thursday, White House Press Secretary Sarah Huckabee Sanders announced that President Trump plans to invite President Putin to Washington for another meeting this fall. We have some breaking news. The White House has announced on Twitter
Starting point is 00:23:23 that Vladimir Putin is coming to the White House in the fall. Trump's director of national intelligence, Dan Coats, found out about the invitation from NBC News anchor Andrea Mitchell while on stage at a conference. Say that again? Vladimir Putin coming to the... Did I hear you? Yeah, yeah.
Starting point is 00:23:47 Okay. Yeah. That's going to be special. During the interview, Coats acknowledged his frustration with the first meeting between Trump and Putin, especially the president's decision to speak with Putin without any aides present for nearly two hours.
Starting point is 00:24:10 If you had asked me how that ought to be conducted, I would have suggested a different way. But that's not my role. That's not my job. So it is what it is. Is there a risk that Vladimir Putin could have recorded it? That risk is always there. The Daily is produced by Theo Balcomb, Lynsea Garrison, Rachel Quester, Annie Brown, Andy Mills, Ike Sriskandarajah, Clare Toeniskoetter, Paige Cowett, Michael Simon-Johnson, and Jessica Cheung, with editing help from Larissa Anderson. Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly. Special thanks to Sam Dolnick,
Starting point is 00:25:06 Michaela Bouchard, Lee Mangistu, David Boddy, and Stella Tan. That's it for The Daily. I'm Michael Barbaro. See you on Monday.
