Consider This from NPR - As Social Media Giants Plan For Disinformation, Critics Say It's Not Enough

Episode Date: October 1, 2020

Facebook and Twitter have plans for an election season rife with disinformation on their platforms. Facebook Chief Operating Officer Sheryl Sandberg explains what lessons the company learned from 2016... and what they're doing differently this time. She spoke to NPR's Audie Cornish about that, and about the burden of work falling on women during the pandemic. Hear more of their conversation here.

Critics say the social media giants are too large to realistically enforce their own policies.

NPR's Life Kit has a guide to voting by mail or in person this election season. In participating regions, you'll also hear a local news segment that will help you make sense of what's going on in your community.

We're working on an upcoming episode about pandemic precautions and we want to hear from you. Fill out the form on this page and we may follow up on your response. Email us at considerthis@npr.org.

Transcript
Starting point is 00:00:00 Disinformation is already widespread this election season. In our last episode, we looked at how much of it is coming from President Trump and his allies. The radical left are laying the groundwork to steal this election from my father, President Donald Trump. Like this Trump campaign video featuring Donald Trump Jr. calling for an army of poll watchers. We need every able-bodied man, woman to join Army for Trump's election security operation at... President Trump is going to win. Don't let them steal it.
Starting point is 00:00:40 Facebook Chief Operating Officer Sheryl Sandberg told me this week that if you see this video on Facebook, you'll see just below it a message from Facebook itself that says voting by mail has a long history of trustworthiness in the U.S. and the same is predicted this election year. And there's a link to more information from a bipartisan nonprofit organization. So what we did with this ad is it links very clearly to the Bipartisan Policy Center. I think what people are worried about in this ad is that he says army of supporters. We believe the language army of supporters is not really calling for an army, but is calling on people who are normal campaign volunteers. But there have been other instances where people very senior have called on real violence. Those come down immediately.
Starting point is 00:01:22 And that is a judgment call that critics say Facebook, in a lot of cases, is making too late in the game, or not at all. Consider this. With barely a month until November 3rd and more votes being cast every day, what are the most powerful social media companies doing about disinformation? Sheryl Sandberg answers our questions. From NPR, I'm Audie Cornish. It's Thursday, October 1st.
Starting point is 00:01:57 There are these networks of staunchly pro-gun groups on Facebook, and one of them is run by these three brothers, the Dorr brothers. It turns out they don't just do guns. The Dorr family name has been attached to other causes. Their goal is to eliminate public education and to replace it with Christian schooling. The roots of the Dorr family on the No Compromise podcast from NPR. It's Consider This from NPR. As the head of site integrity at Twitter, Yoel Roth says a lot of things keep him up at night. Having a vivid imagination is key. None of the threats are off limits. Roth, who spoke to NPR's Shannon Bond, said the threats range from a large-scale spam or bot attack to the risks of foreign interference. And we really undertook a process to try and predict what the worst case scenarios were based on what we had seen previously in 2016, 2018. Aside from preparing for those hypotheticals,
Starting point is 00:03:06 last month, Twitter announced it would add fact-check labels to tweets or remove tweets altogether if they create confusion, undermine the elections process, or call for interference. Now, you don't have to spend more than a couple of seconds on the president's Twitter page to realize Twitter rarely applies labels to his tweets that contain false information about voting. The major gap now is not in the design of the policies on the platforms. The major gap is in enforcement of those policies. Chloe Colliver studies online extremism at the Institute for Strategic Dialogue in London.
Starting point is 00:03:41 And that is something that is an existential issue. There is no realistic way for them to actually implement and enforce the guidelines that they've laid out. So let's be clear, we are taking a lot of things down. We obviously have to let candidates speak. People need to know what candidates think and what candidates are saying. Facebook COO Sheryl Sandberg points to the information labels they apply to misleading posts and the company's voter registration effort as counterweights. Facebook said it helped 2 million people register to vote in 2016. This year, they've doubled that goal. We're getting accurate information. We're not just saying, oh, this looks bad.
Starting point is 00:04:22 We're saying, here's how you vote. Here's your state's link. Here's how you can make sure your ballot's counted. One difference between Twitter and Facebook is that Twitter banned all political advertising last year. Facebook announced last month it would block new political and issue ads during the final week of the campaign. But neither of those efforts will stop false information or even calls for violence from non-advertisers, meaning they won't stop regular people from writing and sharing those kinds of messages. You know, we've seen Facebook groups literally utilized this summer to organize violent actions on the streets and undermine the Black Lives Matter protests. Jonathan Greenblatt is the CEO of the Anti-Defamation League and also a board member of a group called the Real Facebook Oversight Board.
Starting point is 00:05:10 It's a collective of academics, journalists, and others who say their goal is holding Facebook accountable. Greenblatt points to what happened in Kenosha, Wisconsin this summer. Reporting by BuzzFeed revealed that armed militants, including Kyle Rittenhouse, organized their plans to show up in the city, heavily armed, in a Facebook group. And despite calls for violence in that group and reports to Facebook about the group from other users, the company left it up. Mark Zuckerberg later called it an operational mistake. Any other company, any other media business would have shut this down long ago.
Starting point is 00:05:47 The fact that Facebook doesn't enforce their own terms of service, at best, it's inexplicable. Sheryl Sandberg and I spoke about whether the bad parts of Facebook outweigh the good, and how the company is answering tough questions about the spread of disinformation and hate. Here's some of that conversation. So look, fighting hate on our platform is one of the most important things we do, something we take very seriously, and it is an ongoing battle. We have a platform that billions of people are using. I think we are making progress and I think we've proven that. So in 2018, when we started this process, 24% of the hate we took down, we found before people pointed it out to us. We just put out our latest report. We're at 95%. That's real progress. That's billions of dollars
Starting point is 00:06:38 of investment, so much work. But 5%, we're still not taking down until it's reported. And that is the work we are doing. So we have made and demonstrated real progress, but there's still more to do. I guess when you say there's still more to do, what do you see as the problem? Because I think for a lot of these activists, what they say is we hear about policies, but we don't feel like Facebook can effectively enforce, monitor, and take down bad content in a timely manner? Well, look, this is a challenge, right? Our question is how do we find the stuff and how do we take it down? And let's talk about the enforcement because it is complicated. Our systems are, what our enforcement report shows is if you have nudity, we get it right 99% of the time. That's because you can
Starting point is 00:07:26 train a computer to recognize a nipple and you can take it down and you get it right. Let's say you put up a swastika. You might be putting up a swastika to celebrate neo-Nazism or celebrate Nazi Germany. That comes down immediately. That's it. But what if you put it up because you're talking about the history of Germany? Or what if you're putting it up and saying we stand against this? Because hate is contextual, it is more difficult. So are you saying it's a technical issue? It's a technical issue. It's an enforcement issue.
Starting point is 00:07:54 Our enforcement is not perfect, but I think it's actually the best of our industry, not the worst, and it will continue to improve. And we share the goals. We want platforms where people can express honest views, ethical views. We want a platform where people can speak, but we do not want any form of hate or calls for violence. One of the things that people have concerns about is not that there's not enough speech, not that there's not enough information, but that we're seeing a growing trend where there are a great many people who have
Starting point is 00:08:25 trouble sorting the truth from falsehoods, sorting through that information. And it seems like this stance that you guys have as one of the world's biggest distributing and sharing networks of information doesn't help with that problem. So we are a platform where people can speak, but it does have this challenge that things are not checked. We do not want to be the arbiter of truth. We don't think that's the right thing to do. So we refer things to third party fact checkers and they mark them as true or false. And then we take it from there. But it's not that you're arbiters, it's that in an era where Facebook is designed to increase engagement, it amplifies things that are viral. In an era when a lot of viral behavior is conspiracy theories or false information, it seems like your business is at odds with the problem? So what you're saying is a very common belief, but I don't believe it's accurate at all. And I really want to explain it. We've taken real steps to have third party
Starting point is 00:09:30 fact checkers in place. And when something is labeled as false or partially false, we dramatically dial down distribution. It gets about 20% of the distribution it would otherwise get. And we try to make sure it doesn't go viral. The idea that Facebook is designed to amplify things that are polarizing is not true. Facebook is designed to let you share with your friends and family. That's the great majority of what happens on our service. When that amplification happens, we've taken very strong steps to dial down that distribution so that things don't go viral in bad ways. That's not something we had in place for the 2016 election, but elections have changed and so have we,
Starting point is 00:10:09 and we are working really hard on that. You've always been out there in the public saying, we have a long way to go and we're working hard. Can you tell me what you would like to see? Like when you say there's a long way to go, what is that extra yardage you would like to cover? So this election is a major test for us. We know that. We've been incredibly focused on protecting the election, and we want to do more and better. So I'll give you an example. In 2016, we didn't know what coordinated inauthentic behavior was, and we didn't find
Starting point is 00:10:43 it on our platform in advance of the election. In 2017, we took down one network. In the last 12 months, we've taken down more than 50. We now know what these things are, and we find them, and we continue to take them down. Now, the other thing we're very focused on here is making sure the good happens. Our Voter Information Center is new for this election. And I think we're probably the largest place people are getting those messages. It's interesting because you have a climate information center, you have this voter information center, there was a COVID information center. Does it feel like whack-a-mole trying to keep up? It feels like essentially you guys have to keep
Starting point is 00:11:21 coming up with good sources of information, so to speak, because so much false information spreads so quickly. Yes, it is whack-a-mole. And that's why we're learning. We learned from coronavirus to set up the voter information center, to set up the climate information center. Audie, I don't know what's next. I don't think you do either. I don't think any of us do. But we know that... But at your scale, can you do it? Like, is this actually reflecting the fact that you're so big? Is Facebook truly governable in that way? Well, when people get to say what they want without being edited first, when they're not publishing for a newspaper but they're publishing in their own voice, there's going to be good and there's going to be bad. And we need to enable the
Starting point is 00:12:02 good, enable the good people want to say to each other. And there is an awful lot of good that happens on these services. I'll give you one of my favorite examples. In the UK, there was a moms' running club. It was set up on Facebook. It was moms who run. When coronavirus hit, that running club had women, strangers, running to pharmacies to fill prescriptions and running those prescriptions
Starting point is 00:12:25 to elderly people. That's what happens when people use their voice. And our goal is to take down the bad and continue to enable the good. And that's what we're going to keep doing. I don't think anyone disputes that good can come from Facebook, right? I think people are starting to question whether the bad is starting to outweigh it, especially when they are feeling like democracy is fragile here in the U.S. And I know that Mark Zuckerberg has said that he believes democracy is strong enough to withstand this challenge and deliver a free and fair election. But a lot of people are seeing strain at the edges of our institutions and look at Facebook and say, it's not helpful when it comes to truth and trust. So to them, I would say that we worry about that too. And we have taken aggressive steps to take down the bad. But I think if you look at what happens, go to your newsfeeds, go to your Instagram, you do see an awful lot of good that's enabled by people having voice. And
Starting point is 00:13:25 again, any technology that's ever been used enables both. We have to, and I think we have, taken very aggressive steps to get the bad down and very aggressive steps to put out the good. Facebook Chief Operating Officer Sheryl Sandberg. Now, you can hear more of our conversation where we spoke about how the pandemic is affecting women who are juggling work and way more than their fair share of parenting. That's at the link in our episode notes. It's Consider This from NPR. I'm Audie Cornish.
