The Daily - How Facebook Is Undermining Black Lives Matter
Episode Date: June 22, 2020. Companies like Facebook, Twitter and YouTube have come out in support of Black Lives Matter and its mission. But are their platforms undermining the movement for racial justice? Guest: Kevin Roose, who covers technology, business and culture for The Times. For more information on today’s episode, visit nytimes.com/thedaily Background reading: Kevin Roose explains why shows of support for Black Lives Matter from Facebook, Twitter and YouTube don’t address the way racists and partisan provocateurs have weaponized the platforms.
Transcript
In the summer of 2014, Michael Brown, an unarmed Black teenager, was shot and killed by Darren Wilson, a white police officer in Ferguson, Missouri.
And for weeks...
Indict! Convict! Send that killer cop to jail!
...there were protests in Ferguson and across the country. And social media really became the primary organizing tool
for the Black Lives Matter movement.
And back in 2014, if you remember,
like you would go on Twitter or Facebook
and almost every post in your feed
or most of the posts in your feed
would be some sort of call to action.
Black Lives Matter! Black Lives Matter!
People posting the Black Lives Matter hashtag, people sort of organizing support and supplies
and financial donations for this emerging civil rights movement.
Black Lives Matter!
Right. And it feels like we're in a very similar moment right now.
So what do things look like on a Facebook, for instance, at this moment?
Well, they look a little different.
All lives matter.
It's overwhelmingly posts that oppose Black Lives Matter.
You got to explain this to people.
All lives matter.
Calling it a racist movement.
The Black Lives Matter Foundation is apparently preparing for a, quote, war on police.
Black Lives Matter is the con of the century and a sham.
We're going to find out how many liberals have contributed.
Some could have possibly orchestrated and manufactured a lot of stuff that we see today.
About how Black Lives Matter is not a social justice movement.
It is something far darker and more dangerous.
I mean, Kevin, how do you explain that? Because how did hashtag Black Lives Matter go from being a very powerful organizing tool that helped give birth to a movement to something that's now being used against it?
Well, I don't think it's that fewer Americans support Black Lives Matter. You know, the data we have suggests that Black Lives Matter is actually much more popular among Americans than it was in 2014.
I think the thing that has changed is social media.
From The New York Times, I'm Michael Barbaro.
This is The Daily.
Today.
Facebook and Twitter are publicly voicing support for Black Lives Matter and its mission in this moment. But my colleague, Kevin Roose,
took a close look at what's actually happened on their platforms.
It's Monday, June 22nd. Kevin, you just spent the past few months making a series about the rise of extremism on the internet, Rabbit Hole.
Right.
But the last time you and I talked here on The Daily, it was back in the fall.
And the two biggest names in social media, Facebook and Twitter, were under fire for the huge amount of misinformation on their platforms. And Congress brought the heads of those companies, Mark Zuckerberg from Facebook,
Jack Dorsey from Twitter, out to testify. And it felt like a pretty defining moment for them when
it comes to this question of what is the role of a social media company in our civic discourse?
So what has happened since then?
Well, I think these social networks have continued to grapple with these questions
of responsibility and free speech
and what's allowed and what's not allowed
on their platforms.
We face a number of important issues
around privacy, safety, and democracy.
And I think every social media company
has realized that they dropped the ball in 2016.
But it's clear now that we didn't do enough to prevent these tools from being used for harm as well.
They allowed their platforms to be hijacked not just by Russia, but by dishonest actors inside America, by partisan media outlets and clickbait factories.
We didn't take a broad enough view of our responsibility,
and that was a big mistake.
So they're basically in this grappling moment.
They realize that what happens in November
is pretty existential for them.
Overall, I would say that we're going through
a broader philosophical shift in how we...
They will be either seen as helping the democratic process and free and fair elections in the U.S.,
or they will be seen, as they were in 2016, as having, in some ways, undermined it.
I started Facebook. I run it. And I'm responsible for what happens here.
And they have committed to getting it right or more right in 2020.
And do we see these companies start to take action eventually after this period of grappling?
Yeah. So the first hint that these companies were going to be somewhat different from one another when it came to enforcing these new rules happened last year.
Twitter is going to stop accepting political ads on the platform.
Jack Dorsey has just tweeted it out.
When Twitter decided to ban all political advertisements.
And he says, while Internet advertising is an incredibly powerful
and very effective tool for commercial advertisers,
the power brings significant risk to politics
where it can be used to influence votes that affect the lives of millions.
So if you're a politician or a super PAC or a lobbying organization, you cannot take out
a political ad on Twitter. They said these sort of targeted ads distort democracy. We don't feel like we can responsibly administer them.
And so they just said, we don't want any part of it when it comes to political ads.
And what about Facebook?
Facebook took a much different tack.
They basically said, we will continue to allow political advertisements. Campaigns and outside groups will be able to spend millions of dollars on political ads. And even more so, we will not fact-check politicians' ads. But outside of this political
advertising decision, the policies that Facebook and Twitter have with respect to like what is and
isn't allowed on their platforms, they're very similar, except when it comes to one particular user, President Trump.
What do you mean?
So Twitter in recent weeks has become much more aggressive in confronting President Trump over some of his tweets that they say violate the rules.
Twitter, for the very first time, is fact-checking the president.
The first one was this tweet from President Trump about mail-in ballots that had some misleading and false information about voting.
He claimed, without evidence, that voting by mail leads to fraud.
Twitter says that the tweets are potentially misleading.
And that, Twitter said, was a clear violation of its policies around voting misinformation.
So they put a sort of fact-checking label on those tweets, and Facebook did not.
And Kevin, this fact-checking label essentially says to anyone who looks at this,
hey, there are some issues here, and you shouldn't take this at face value.
Right. So Twitter applies this fact-checking label to President Trump's tweets.
And then the very next day...
We've been sharing with you President Donald Trump's recent tweets about the unrest in Minnesota.
President Trump put out this statement, which he posted to Twitter and also to Facebook.
Here's a quote.
These thugs are dishonoring the memory of George Floyd, and I won't let that happen.
Just spoke to Governor Tim Walz and told him that the military is with him all the way.
Any difficulty and we will assume control.
But when the looting starts, the shooting starts.
When the looting starts, the shooting starts.
What does that mean?
It's a phrase with a very long history.
It was first used by a police chief in Miami in the late 1960s,
and it was referring to violence against protesters.
And so in this context, it was clear that the president was threatening violence against protesters.
Which would make this a real test
of Facebook and Twitter's social media policy.
Right.
Both these networks have policies
prohibiting violent incitement,
posts that could create imminent harm to people.
And this clearly fell under that category.
And so what did the companies decide to do?
So Twitter decided to put basically a warning label on this tweet
saying that it had violated their rules about glorifying violence,
but that it would stay up because it was in the public's interest to know about it.
And they also made it so that you couldn't retweet it.
And how big of a deal is it that Twitter took those steps?
Well, I think
practically it didn't do much. I mean, I think people still heard about this tweet. They still
saw it. But I think symbolically it was a very big deal. It was the first time that Twitter had
ever applied this label to President Trump. And
it was sort of a moment where they were saying, like, this is a line that even President Trump
cannot cross.
And how does Facebook handle this very same post from President Trump?
So Facebook sort of waits a while. And then they announce that they're just going to leave it up.
They're not going to label it. They're not going to take it down.
This post will just stay up.
And Kevin, why is the reaction by these two giant social media companies so completely different?
Well, part of it, I think, has to do with who's running these companies.
The founder of San Francisco-based Twitter is getting involved in the protests in Ferguson.
Jack Dorsey is from St. Louis, just 12 miles from Ferguson.
Jack Dorsey, the CEO of Twitter, he's been very vocal about his support for Black Lives Matter
and for sort of Twitter's role in helping these movements be heard.
The billionaire was on the streets of Ferguson last week and has been marching,
posting vines and tweets about all that's going on there.
But he also knows his users pretty well. I mean, Twitter is a place where a lot of
activists and journalists hang out. It's also got a robust and vibrant Black community.
And I think he just understands that many of his users are going to feel pretty good about this decision to take on the president.
Now, I'm not going to sit here and tell you that we're going to catch all bad content in our system.
But Mark Zuckerberg, on the other hand, he doesn't want to be the referee.
We don't check what people say before they say it.
And frankly, I don't think society should want us to.
He doesn't want to be an arbiter of civil conversation.
Freedom means you don't have to ask for permission first.
And that by default, you can say what you want.
When it comes to politicians, he wants to do as little moderation as possible.
And in some ways, that lines up better with what Facebook's users want.
They tend to be a little older, potentially more conservative.
Right. So therefore, Mark Zuckerberg would be very reluctant to
label the president's tweet as factually inaccurate or label it as
potentially inciting violence, because that might very quickly alienate Facebook users.
Right. But I think an even bigger risk is that it could alienate President Trump and
Republican lawmakers in Washington.
The giant tech companies right now are eating up little tiny businesses, startups,
and competing unfairly.
I mean, in this election cycle, Facebook has become sort of a target for Democrats.
So what I'm saying is we've got to break these guys apart.
You want to run a platform? That's fine.
You don't get to run a whole bunch of the businesses as well.
But Facebook has been criticized by both Democrats and Republicans
who have said that they should be broken up or more heavily
regulated. So quit harvesting people's data, sell off these companies that you're using to create a
monopoly and do a third party audit. I think Facebook knows that Democrats in some ways have
turned against them. And if Republicans also turn against them, it will be that much harder for them
to stop new regulation, for them to avoid being broken up.
And I think that it's in their interest to keep President Trump and his allies happy.
That's really interesting because I feel like the dominant conservative view, at least as I
understand it and have read about it, is that Facebook has not been hospitable to
conservatives, that it has taken steps to censor them or limit what they say. And from what you're
describing, Facebook very much needs conservatives and is showing a lot of restraint when it comes
to conservatives. Yeah, I mean, their outreach to Republicans is in some ways an attempt to sort of correct this impression that conservatives have that they are biased against the right, which is not reflected in any of the data. past few weeks, there's this tool called CrowdTangle that you can basically use to like pull up the most popular and talked about Facebook posts from across all of Facebook.
So yeah, just looking at the most engaged posts from the last 24 hours on Facebook.
The first one is from Trump. It's the video of his rally in Tulsa, Oklahoma. Number two,
also by Trump, another picture from his rally. And then you've
got Franklin Graham, this right-wing evangelist and activist taking issue with Dr. Fauci. You've
got Hugh Jackman wishing his dad a happy Father's Day. That one's not political. And you've got
Terrence Williams, who's a pro-Trump activist. Breitbart has a video of Trump's rally. The vast
majority of these top 10 stories are usually from right-wing media outlets and right-wing politicians.
Is there anything that might be characterized as Democratic, liberal, or progressive in that list
of the top 10 or so? Almost every day, there are one or two posts in the top 10 from more
liberal outlets or politicians, but it is predominated by Fox News, by Breitbart, by right-wing
news outlets, and by President Trump himself.
So a list like what you just went through makes clear that Facebook's users and maybe its most active users are conservatives, right? It's not a question of an algorithm that puts those things at the top.
It's what's being most discussed, engaged, commented on.
Yeah, there's clearly a large audience for conservative media,
conservative posts, conservative activism on Facebook.
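(For readers curious how a list like the one Kevin reads here gets pulled, below is a minimal sketch, assuming a CrowdTangle API token and the /posts endpoint that CrowdTangle publicly documented at the time; the parameter and response field names are best-effort assumptions, not a reproduction of Kevin's exact workflow.)

```python
# Hypothetical sketch: fetch the most-engaged public Facebook posts of the last
# 24 hours via CrowdTangle, roughly as described in the episode.
# Endpoint, parameters, and response fields are assumptions based on
# CrowdTangle's public API documentation and may differ in practice.
import requests
from datetime import datetime, timedelta, timezone

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder; requires CrowdTangle dashboard access

end = datetime.now(timezone.utc)
start = end - timedelta(days=1)  # last 24 hours, like the list read on air

resp = requests.get(
    "https://api.crowdtangle.com/posts",
    params={
        "token": API_TOKEN,
        "sortBy": "total_interactions",  # assumed sort: likes + comments + shares
        "startDate": start.strftime("%Y-%m-%dT%H:%M:%S"),
        "endDate": end.strftime("%Y-%m-%dT%H:%M:%S"),
        "count": 10,  # top 10 posts
    },
    timeout=30,
)
resp.raise_for_status()

# Response shape (result.posts, account.name, statistics.actual) is assumed.
for post in resp.json().get("result", {}).get("posts", []):
    account = post.get("account", {}).get("name", "unknown")
    stats = post.get("statistics", {}).get("actual", {})
    interactions = sum(v for v in stats.values() if isinstance(v, int))
    print(f"{account}: ~{interactions:,} interactions")
```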
We'll be right back.
So, Kevin, what you're describing is a Facebook that is afraid of alienating conservatives because it has made itself a very hospitable place for conservatives. But that fear means it has no incentive to do anything to crack down on conservative viewpoints,
no matter how extreme or factually inaccurate.
In some ways, that's right, except there's one complicating factor,
which is that Facebook's own employees are increasingly speaking out.
We are following a developing story out of Facebook.
Employees staging a virtual walkout today in protest of the company's policies
towards some of President Trump's posts.
Let's get to Julia Boyle.
There was a virtual walkout recently
where a bunch of employees sort of protested these decisions
by sort of logging off for the day.
Meantime, you've got Facebook employees resigning today,
publicly resigning because of this decision.
There have even been employees who have resigned
over Mark Zuckerberg's decisions
to essentially let Trump get away with things
that other users wouldn't be able to.
Facebook's employees, you know, many of them,
they live in Silicon
Valley. They tend to be more liberal. They don't like the idea that the product that they spend
their days and nights building is being used to put out propaganda and misinformation.
Employees like Andrew Crow, head of design for Facebook's Portal product,
have taken to Twitter, writing comments such as, quote,
giving a platform to incite violence and spread disinformation is unacceptable. And they have started vocally
objecting to Mark Zuckerberg's decisions. While Jason Toff, identified as director of product
management, wrote, quote, I work at Facebook and I am not proud of how we're showing up.
And that's a real pressure because Silicon Valley is a very interesting place in that
the employees of these companies wield a ton of power. It's a very competitive labor market.
And if you're Facebook, you can't afford to have your employees be walking out and resigning.
It's not good for your business long term.
Kevin, from everything you've just said, Twitter is being proactive here, Facebook much less so.
But I just want to make sure that we're not oversimplifying this.
Because, yes, we know Twitter has chosen to label certain tweets, at least by the president, as violating their terms and to fact-check them.
We know that they have taken a pass on political ads and that Facebook has not done any of those.
But in reality, is what Twitter is doing meaningfully different and better than what
Facebook is doing?
I mean, Twitter still has a lot of work to do to ensure that people are not,
you know, threatening and harassing and using hate speech against each other. But the thing
that I think is really starting to differentiate these companies from one another is how they're
responding to this moment, you know, this historic movement for civil rights, this historic election where in some ways, you know, a lot of people feel like the future of not just the next four years, but of democracy is on the line. And I think these companies are understanding that they will be judged by history in some ways for the decisions that they make right now.
And what could they be doing that neither of them are doing?
Well, there's plenty of things that they could do
to kind of take down the temperature
of the overall conversation on their platforms.
They could change the way that their systems rank information.
So instead of showing you the stuff that is most engaging,
they could more heavily curate the information that comes onto people's feeds. They could put a cap on how viral posts can go, to sort of keep bad actors from hijacking these conversations and undermining these movements.
There's a lot that they could tweak about the basic structures of their platforms,
but that could cut pretty deeply
into their business models.
And ultimately, like,
I think the bigger thing that they're realizing
is that they have to pick a side.
There is no such thing as a neutral platform,
and all these decisions that they make about how their tools are designed, how they're used, what policies they have, they are all pushing in one
direction or the other. And in this moment, this national moment of reckoning, of activism, of
people speaking out against injustice, they have to decide
whether that's something that they want to support or whether they want to stand on the
sidelines.
Right, because if you don't pick a side as a social media platform, then a side will
be picked by your users.
Exactly.
I mean, I think a pretty vivid illustration of this happened the other day when Mark Zuckerberg came out with this long, heartfelt Facebook post about how he supported Black Lives Matter, how Facebook was going to donate millions of dollars to racial justice causes, how he stood with Facebook's black employees and with the movement for racial justice.
And that same day...
Hello, Facebook family.
On Facebook.
I have decided to do this video.
The post that was going the most viral...
It has been weighing very heavily on my heart.
...was this video by this conservative activist, Candace Owens.
So in my opinion, George Floyd was a criminal. He was a criminal. And just because he was a
criminal doesn't mean he deserved to die at the knee of a police officer. But it does mean that
I am not going to play a part of the broken Black culture that always wants to martyr criminals.
Who made an entire video essentially saying that Black Lives Matter was a liberal hoax.
It is certainly no excuse to accept a Democrat narrative, okay,
that Black people are being disproportionately hunted down by police officers
because of the color of their skin.
That George Floyd was a horrible person, that his death should not be mourned as a tragedy,
and essentially saying that this whole
movement is based on a false premise. Police brutality, racially motivated police brutality
is a myth. That there is no bias in policing in America. I have, you know, I have no apologies
here to make. George Floyd is not my martyr. He can be yours. That's all I have to say to black America.
So Mark Zuckerberg is out there saying,
I believe in Black Lives Matter.
I want to support this movement.
And on his platform, the thing that he built,
the thing that he oversees and controls every day,
a very different message was winning.
Kevin, thank you very much.
We appreciate it.
Thank you for having me.
We'll be right back.
Here's what else you need to know today.
The Times reports that over the past week,
the number of new daily infections of the coronavirus
has hit a record high in 12 states,
most in the South, West, and Midwest,
but that the death rate is falling dramatically. Overall, fatalities from
the virus have dropped 42 percent over the past two weeks, in part, it is believed, because the
virus is infecting younger and healthier Americans. And President Trump has fired the United States
Attorney for the Southern District of New York,
who has overseen multiple investigations into conduct by the president, the president's lawyers, and the president's allies.
The U.S. Attorney, Geoffrey Berman, initially rejected a recommendation from the Attorney General, Bill Barr, to resign,
prompting Barr to announce that Berman was stepping down and Berman to publicly deny
that he was resigning. Berman eventually agreed to step aside after learning that one of his deputies
would fill his role for the foreseeable future. That's it for The Daily.
I'm Michael Barbaro.
See you tomorrow.