Today, Explained - Is Facebook ready for the election?
Episode Date: September 1, 2020
Facebook CEO Mark Zuckerberg says the company made an “operational mistake” in its handling of Kenosha militia groups. The Verge’s Casey Newton explains whether that has implications for November’s election. Transcript at vox.com/todayexplained
Transcript
Kenosha, Wisconsin isn't going away.
President Trump visited today,
even though the state's governor wrote him a letter asking him not to.
Vice President Biden kicked off the week
talking about how President Trump can't stop the scenes we saw there last week.
And even Facebook is going through this internal reckoning over what role the social network may have played in fomenting violence there.
In particular, the shooting of three protesters by Kyle Rittenhouse, a 17-year-old who apparently went to Kenosha to support the police.
Casey Newton has been covering what happened at Facebook for The Verge.
So it begins with a Facebook page for something called the Kenosha Guard,
which had amassed about 3,000 members.
And in the day or so leading up to the big protest at which the shooting occurred, the page advertised an event on Facebook encouraging an armed response to the unrest.
So the event was called Armed Citizens to Protect Our Lives and Property.
And it said, quote, Any patriots willing to take up arms and defend our city tonight from the evil thugs? No doubt they are currently planning on the next part of the city to burn tonight.
Do we know how many people saw this Facebook post and then went to Kenosha?
Do we have any idea?
Was Kyle Rittenhouse one of those people?
So I have been asking around about this, and no one at Facebook
can tell me whether Rittenhouse saw the post. They have said that he did not follow the
page and that he had not been invited to the event. Look, there are other places where he
could have learned that these protests were taking place, right? But regardless of that, I think there is still some appropriate concern, because "they are currently planning on the next part of the city to burn tonight" is obviously a dangerous,
potentially violent thing to post on Facebook.
Is that something you're allowed to post on Facebook?
So the answer turned out to be no, but the interesting question is why. So on August 19th, Facebook introduced a new policy that said you're
not allowed to organize a militia on Facebook, which, you know, maybe you didn't realize that
up until a couple weeks ago, you could have put together a whole militia on Facebook if you had
thought to do so, but not anymore.
Right, and we actually covered why last week on the show in our QAnon episode.
Today, Facebook banned about 2,000 groups pushing the QAnon conspiracy theory, accusing them of promoting violence and leading to crime.
So one of the questions that I had for folks at Facebook was like, would this post have been taken down if the policy hadn't been
updated? And the message I got back is maybe, because you're not supposed to incite violence
on Facebook. And so it seems possible, maybe even likely that eventually, given what happened,
this event would have been removed for inciting violence. But the policy under which it was
eventually removed was, hey, you're not allowed to form a militia on Facebook.
So the post goes up. Though it's violating a policy, it stays up for a while. What happens while it's up?
Well, it gets reported to Facebook
455 times by people who saw it saying this is inciting violence. And we know that because
Ryan Mac at BuzzFeed got a hold of the internal report inside Facebook showing just how many times
this had been raised to the attention of moderators.
Is 400, or in this case over 400, complaints a lot?
Yes, it's highly unusual for a single post to get even a dozen reports.
So once you get into the hundreds, that's a real outlier.
So we know that moderators saw it.
We know that at least four of them decided to leave the post up regardless.
And the open question is why they left it up for as long as they did.
And that question remains open today?
It is currently being investigated.
And I'm hoping that once Facebook is able to determine why it was left up, they will let us know.
So this post violates Facebook's policies around forming a militia.
It's flagged over 400 times.
It stays up anyway.
We don't yet know why.
But we've spoken to you about Facebook moderation before, Casey.
What do you know about this process?
Things get reported.
They're sent to human beings to look at them.
The human beings will pull up the relevant policy.
They'll take a look at the post and they'll try to make their best determination as to
whether the post violates the policy or not.
But of course, things are never that clear.
There's always room for judgment calls for gray areas.
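To make the flow Casey describes concrete, here is a minimal, purely illustrative Python sketch of one report-review step. Everything in it, the types, the keyword matching, the policy names, is invented for illustration; Facebook's actual tooling is not public.

```python
# Illustrative sketch of report triage: a reported post is checked against
# relevant policies, and gray areas default to "leave up."
# These names and keyword checks are hypothetical, not Facebook's systems.
from dataclasses import dataclass

@dataclass
class Policy:
    name: str
    keywords: list  # crude stand-in for real policy criteria

@dataclass
class Report:
    post_text: str
    reason: str
    report_count: int = 1

def review_report(report: Report, policies: list) -> str:
    """A moderator checks the post against each relevant policy."""
    for policy in policies:
        if any(kw in report.post_text.lower() for kw in policy.keywords):
            return f"remove (violates: {policy.name})"
    # Gray area: no clear match, so the historical default is to leave it up.
    return "leave up (no clear violation)"

policies = [
    Policy("incitement to violence", ["burn tonight", "shoot the protesters"]),
    Policy("militia organizing", ["armed citizens", "militia"]),
]

report = Report(
    post_text="Armed Citizens to Protect Our Lives and Property",
    reason="inciting violence",
    report_count=455,
)
print(review_report(report, policies))
# -> remove (violates: militia organizing)
```

Note how, in this toy version, the event gets removed under the militia rule rather than the incitement rule, which mirrors the sequence Casey describes: the policy that finally caught it was the new one.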
I talked to a former moderator who said that often when it came to posts about protests where people would say things like they should shoot the protesters, the moderators would be told, well, leave that up because it's not clear that you're trying to kill them.
Maybe you're just firing your gun in the air to scare them, that sort of thing. And so historically, moderators I've spoken with have been encouraged
to leave up posts like this, even though you or I might look at them and say, well, gosh,
that seems like a pretty clear incitement to violence. And it looks like that may be what
happened here. Yeah. Until finally enough noise got made about it. And some of the adults at
Facebook took another look at it and said, it's got to come down.
And of course, it happened after this horrible violence that transpired.
How does this back and forth play out within the social network?
So there's just kind of increasing frustration with Facebook. Like, hey, we reported this 455 times. Why didn't you do anything about it?
Mark Zuckerberg held his weekly Q&A late last week, and he kicked off by talking about this particular incident. There have been a bunch of media reports asking why this page and
event weren't removed sooner, especially when, in this case, a bunch of people did report the page. And
the reason for this, it was largely an operational mistake. He said they were investigating,
but Ryan Mac reported at BuzzFeed that there was just a lot of anger inside Facebook. And the
reason is because they keep getting in trouble over enforcement
decisions. Like clockwork, every few weeks, it seems like something either stays up that we say,
well, that should have come down, or they take something down that we say they should have
left up. And so it's just kind of a never-ending cycle. And I mean, keep in mind, this has been
a summer where Facebook employees have been quite attuned to issues of
civil rights. There was a two-year audit of Facebook's effect on civil rights that was
completed just in July. Well, the main takeaway from the report is that Facebook has a lot of
work to do. At the heart of the big problems that we're facing is the way in which the platform is weaponized in our elections.
And there had been a virtual walkout at Facebook in the aftermath of the company's decision to leave up a controversial post by the president in June.
Employees are angry and frustrated with CEO Mark Zuckerberg's handling of some of President Trump's posts. Zuck's workforce was so angry at him that hundreds of employees
simply walked off the job yesterday in virtual protest of company policies.
One Facebook employee said he disagrees with Zuckerberg's decision
to do nothing about the post and says there isn't a neutral position on racism.
And that post said, quote, when the looting starts, the shooting starts, end quote, in what was widely
interpreted to be an incitement to violence. And indeed, after the president posted that both on
Facebook and Twitter, there was violence at protests the following week. So Facebook employees
have been asking themselves a lot of questions about the degree to which their work is contributing
to racial injustice and other civil rights issues. Is this sort of like a uniform call inside Facebook
to do better on these issues
or is there internal opposition as well?
Some people think that Facebook
has generally made the right calls here.
Facebook is not a monolith, right?
It has 50,000 people who work there.
They are people of every political stripe.
There are some very conservative people who work at Facebook.
There is a person who is best friends with Brett Kavanaugh and sat behind him during
his confirmation hearings, and he's the head of policy at Facebook, right?
So there is a whole range of viewpoints inside that company.
And one of the interesting things about it is that,
you know, while it is effectively a monarchy, right, Mark Zuckerberg has total control over
the company, he can't be removed from the board, it has adopted a lot of democratic aspects,
starting with the fact that employees can ask questions of Zuckerberg every week.
What I think is different this time is that the people who are critical of Facebook,
instead of sort of keeping all of the discussion inside the company,
which is what it has historically done,
you're now seeing more and more of it spill out through leaks
so that reporters like me hear about it.
Facebook made a mistake with Kenosha last week,
but there's an election in two months.
Is Facebook ready?
More with Casey after the break.

Casey, another reason it feels like this
really matters right now is because we're just about two months out from a presidential election. Is Facebook ready for this election?
Well, that's the open question.
It's definitely the case that they have invested a ton
in making sure that they don't repeat the mistakes of 2016.
Facebook is receiving criticism for how it handled evidence of Russian election interference
on the social network.
The Times reports security experts at Facebook flagged attempts by Russian hackers to probe
accounts as early as the spring of 2016.
That's months before Zuckerberg downplayed the effect of Russian propaganda on the 2016
election.
So it seems less likely in 2020 that we're going to
find out that some foreign state, you know, successfully hacked into Facebook and distorted
public opinion on a bunch of issues, although that fight is ongoing and we shouldn't consider that
battle won for good. The dicier question is, is Facebook equipped to handle all of the unique issues that 2020 is presenting, such as the fact that we will be voting during a global pandemic?
Many of us will be relying on mail-in voting.
And the president of the United States is currently conducting a disinformation campaign about the safety and efficacy of mail-in voting.
Mail-in voting.
Mail-in ballots are a very dangerous thing.
They're subject to massive fraud.
So the really crazy thing about this year is that Facebook has to contend with a bad actor in the White House who is actively making their job more difficult.
And it's still kind of an open question of how much they can do on that front.
What kind of political speech is performing the best on Facebook right now? Do we know?
Yeah, so my friend Kevin Roose is a reporter at the New York Times and he does this gimmick on Twitter where he posts every day, now through a separate account, the top 10 pages that get the
most engagement on Facebook. So this isn't the most
viewed content on Facebook, but of all of the public pages, you know, like the New York Times,
CNN, conservative commentators, who gets the most attention? And almost every single day,
what he finds is it's the conservatives, the right wing, sometimes the far right wing,
that is getting by far the most
clicks of anyone on Facebook.
Why is it important to look at most engaged versus most viewed?
So in other words, these aren't just all of the links that you're seeing in the newsfeed that
your friends are posting themselves. It's the pages that are then being re-shared into the
newsfeed, or it's just engagement on the page itself, right?
Maybe you're a big CNN person,
so you go to the CNN Facebook page
and leave a comment there.
That's the engagement that we're able to measure
because Facebook has a tool called CrowdTangle,
which allows us to view it in real time.
It's not public,
but a lot of journalists and researchers have access to it.
And so we're able to get this data.
And what it shows pretty consistently is that conservatives are doing fantastic on Facebook.
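The kind of daily top-10 tally Kevin Roose publishes can be approximated from an engagement export like the ones CrowdTangle produces. A minimal sketch, assuming a CSV of page-level interaction counts; the file name and column names here are hypothetical:

```python
# Sketch of a daily "top 10 pages by engagement" tally from a CSV export.
# Assumed columns (hypothetical): page_name, interactions
# where interactions = likes + comments + shares on a post.
import csv
from collections import Counter

def top_pages(csv_path: str, n: int = 10):
    totals = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Sum interactions across all of a page's posts for the day.
            totals[row["page_name"]] += int(row["interactions"])
    return totals.most_common(n)

for rank, (page, total) in enumerate(top_pages("engagement_export.csv"), start=1):
    print(f"{rank}. {page}: {total:,} interactions")
```

This measures engagement on public pages, not total views, which is exactly the distinction Casey draws: it tells you which pages people are interacting with, not necessarily what most people see.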
Okay, so instead of CNN or the New York Times, people are engaging with Ben Shapiro, Breitbart, or commentators like the Hodge twins way more.
The cops tried to arrest him, take him down to the ground.
That didn't work.
They tried to tase him. That didn't work. They tried to tase him.
That didn't work.
Jacob got up, pulled out a knife.
So let me ask you woke NBA players, you woke African-Americans, what else is there to do?
What else could you do?
Those two cops, what?
Pull out a damn magic wand?
And what does that mean for the veracity of the posts that people are engaging with most?
The New York Times, CNN, they fact check.
They have reporters with decades of experience in journalism.
Right-wing and even far right-wing outlets like Breitbart, less so.
They do make stuff up.
Facebook is okay with that?
Generally speaking, it's fine to lie on Facebook. But if you tell a big lie, it might get sent to a fact checker for review.
A big lie would basically be a lie that goes viral.
And so one of these independent fact checkers that Facebook does not control directly will
review it.
And if they find that it's false, they will rate it as false.
And then basically fewer people will see it on
Facebook.
Okay, so a big lie being told by some right-wing outlet might have some chance of
getting fact-checked if it's really big and a lot of people are flagging it, and then it might get
taken down, but it still has a better chance of succeeding than some legitimate fact-checked news.
Yeah, the news outlets are losing.
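Casey's point, that a post rated false is demoted rather than deleted, can be illustrated with a toy calculation. The scores and the demotion factor below are invented for illustration; Facebook's actual ranking system is not public.

```python
# Toy illustration of fact-check demotion: a post rated "false" isn't removed,
# its distribution is just scaled down so fewer feeds surface it.
# The base scores and the 0.2 factor are made up for this example.
DEMOTION_FACTOR = 0.2

def distribution_score(base_engagement, fact_check_rating=None):
    score = base_engagement
    if fact_check_rating == "false":
        score *= DEMOTION_FACTOR  # demoted, not deleted
    return score

viral_lie = distribution_score(base_engagement=10_000, fact_check_rating="false")
checked_news = distribution_score(base_engagement=1_500)
print(viral_lie, checked_news)  # 2000.0 vs 1500.0
```

Even after demotion, the viral lie in this toy example still outranks the accurate story, which is the dynamic the host is pointing at: the lie can lose its multiplier and still win.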
Great. Okay. So what kind of dangers does that present for the election?
Well, you know, some of us believe that it's important to live in a democracy where people
have a shared sense of reality and basic understanding of the facts as they are.
And there is some fear that if, oh, say half of the country were to sort of cleave off into a
separate information ecosystem where up is down and left is right, then we could find ourselves
in some pretty scary times. So the more I look at the world around me, the more it seems like
those are the times that we're living in, Sean. How about you?
How has this fact-checking process Facebook has instituted played out in recent weeks or months where they've been saying that they're going to do a lot better job with information in the lead-up to an election?
They've taken down many things that Trump and his campaign have posted. I mean, I understand why that's not a
bigger story, but I mean, it is pretty wild, right, that this company is having to say,
you know, that the president of the United States is so far outside the bounds of acceptable
discourse. But among the things that he did say was that children were, quote, almost immune,
end quote, to COVID-19, which they are not. And of course, that's a very dangerous thing to say for many reasons. And so Facebook removed that. There was another case where the campaign had posted
some imagery that drew on Nazi symbols. Those are banned on Facebook. So Facebook removed those.
So, you know, the company does work to get rid of a lot of stuff. And I think a lot of the debate
that we're having is around like, are they taking down enough? Or like, is the system just so fundamentally
broken that we want to replace it with something else?
And is this election going to be a test of that?
It is. But it's like, you know, I go back and forth on how big a role I think Facebook will play
in this election. I think the information ecosystem is bigger than Facebook. It seems
wrong to me that we talk so much more about Facebook than we talk about Fox News.
Like, yes, Facebook's audience is exponentially bigger.
But the academic research shows that Fox News does a much better job of convincing people of terrible, radical ideas.
And so I think it's super important to talk about Facebook a lot.
There's a reason why I do it all the time and have made it my job and my business. But I also don't think you should
talk about them in a vacuum. Like a lot of times the problem that some people want Facebook to
solve is that Donald Trump is president and that's not a problem that Facebook can solve.
And that seems to sort of be in the vein of an argument that Mark Zuckerberg himself would make, that it's not Facebook's job
to tell you the president is a liar, but maybe more so it's Facebook's job to let the president
lie so you can go like, hey, look, the president's a liar.
We don't do this to help politicians, but because we think people should be able to see for themselves what politicians are saying.
Yeah, and historically, I think that's been a pretty good argument.
But I think that the introduction of platforms like Facebook have complicated that question
because they enable a lie to get around the world instantaneously.
They allow it to spread virally.
They use recommendation algorithms that can pull people into groups like the anti-vax
movement, the Boogaloo Boys, the QAnon groups. And so these platforms are not neutral actors
on that subject. They are playing a role in changing up that information ecosystem.
So what do you make of what's going on at Facebook right now? I mean, we've spoken to
you about content moderation before. Does this incident regarding Kenosha feel different? Does the sort of reckoning that
Mark Zuckerberg had late last week in his weekly Q&A feel different?
At this point, I sort of feel like a cop who's been on the content moderation beat for like 20
years. And so when I hear there was a bad post on the internet and it didn't get taken down quickly, I think, well, yeah, that happens sometimes.
It's bad and it's bad if you know
that there were real world consequences because of it.
I still haven't seen any kind of causal chain
that suggests some innocent boy saw a Facebook event
which radicalized him and led him to go buy an AR-15
and go shoot up a crowd of protesters.
Like I don't think anyone is saying that that's what happened. But I do think that Facebook should be
a place that is free of militias, right? And I think that they should live up to the standards
that they have set for themselves. These are sort of private company matters that are actually
matters of public concern, right? It matters whether this guy saw that event page. It matters why the moderators didn't take it down. And it
matters how they're going to try to do better next time. So my advice to the company is,
after you investigate, issue a public report. Maybe it's a page long, but I think you owe it
to the democracy to let us know how you are investigating these issues and how you are
regulating speech on the internet.
Casey Newton, thank you for your time. I really appreciate it.
My pleasure.