Today, Explained - Should Facebook let Trump lie?
Episode Date: October 17, 2019. President Trump lied in a Facebook ad and Facebook didn't take it down, so Elizabeth Warren lied, too. The Verge's Casey Newton explains. Learn more about your ad choices. Visit podcastchoices.com/...adchoices
Transcript
Hello, friends. I'm Sean Rameswaram. This is Today Explained. Just a warning that if
you're listening with people over the age of 18, today's episode features a political
ad from President Donald Trump's re-election campaign. And this ad we're going to play
is just teeming with lies. Joe Biden promised Ukraine a
billion dollars if they fired the prosecutor investigating his son's company. Sean again.
Just a reminder: there is no evidence that ever happened. Now, here's the rest of the ad.
If the prosecutor's not fired, you're not getting the money. Oh, son of a bitch.
Got fired. But when President Trump
asks Ukraine to investigate corruption, the Democrats want to impeach him and their media
lapdogs fall in line. They lost the election. Now they want to steal this one. Don't let them.
I'm Donald Trump and I approve this message. Okay, question. Should Facebook run an ad like this?
One it knows contains just garbage information,
information that might change the way people vote.
In the past, Facebook has said no.
So Facebook's advertising policies have said
you're not allowed to put misinformation in your ad
if you're trying to say something deceptive. Facebook can
and has removed that kind of ad from its systems. But recently, Facebook actually changed its mind.
Yeah. So a journalist named Judd Legum took the Biden ad to Facebook and said,
hey, Trump is lying about this situation. Why have you allowed it, given that your policy bans misinformation?
And Facebook said, well, actually,
our policy now says that we are not going to fact check
the contents of political ads.
We're just going to essentially let the people decide.
So is Facebook saying now that it's essentially okay
for politicians to lie in ads that it runs on its site?
Is that what's going on here?
Yes, although they're not phrasing it that way.
What they've essentially said is if you look at the big broadcast networks, they do not fact-check political speech.
They simply allow politicians to say essentially whatever they want to, and Facebook wants to take a similar approach.
The context for that is that after the 2016 election,
one of the major concerns that people had about Facebook
now that it is so big and so powerful
is that decisions that it made around speech
could have an outsized effect on the outcome of an election. So Facebook
has been under a lot of pressure to not influence elections. And I think their solution has been to
say, well, we're just going to stay out of the debate over what is true and false when it comes
to politics. And we will let essentially the news media and the citizens of the democracy
sort of work together to sort out fact from fiction.
What would it look like if we trusted Facebook to police political lies?
Think about some really hot button issue, like, say, abortion. Something where people have
incredibly strong feelings, where there are
disagreements about when life can be said to begin. And then imagine some politician in a
very closely watched national race takes out an ad where they say something about abortion
in which there is not total clarity on whether it's true or false. And so then Facebook has to
send this ad to a fact checker. The fact checker has to make a determination about whether it's true or false, even if there's
some nuance. And then Facebook has to decide whether they're going to keep it up or take it
down. So then they make that decision. And one group has won. And the losing group is going to
make a great deal of hay about how Facebook is biased against them. And all of a sudden, Facebook's self-image as a place
that is designed to be for everyone to talk with each other is under assault because now it has
been painted as a partisan outlet, right? And you can just imagine how in all the elections to come,
there are going to be those kinds of ads and those kinds of controversies. And if Facebook
were to maintain that old policy,
it would just face an unending series of headaches over them.
Is there sort of a double standard here between what people expect from Facebook and from traditional broadcast media, TV, radio?
There's a distinction between broadcast and cable TV, just to name one. Some cable networks
actually do fact check political
ads before they decide to run them. And if somebody is saying something that is erroneous,
they just won't run it. CNN, for example, took down the Biden ad that has been at the center
of this controversy. Fox News demurred. So different cable networks make different decisions.
Broadcast networks generally do not fact check. So for different
kinds of media outlets, we apply different standards and Facebook is in the middle of
figuring out, you know, what it wants to be in that regard. What did you think when you first
heard about this policy change from Facebook? Initially, I thought Facebook had the policy
right. You know, my rationale was, look, lying has a long and proud tradition in American politics,
but we also have this really robust system of checking it.
We have a media that spends a lot of time and energy every day
fact-checking political speech and then reporting what it finds.
And I think there is something healthy about a world
in which politicians are free to lie,
if only because it then reveals that the politicians are liars.
That's good information to know heading into an election.
And if you're worried that Facebook is too big and too powerful, which I am,
then having them sort of put their thumb on the scales of political speech,
it makes me uncomfortable.
And I heard from a lot of people that they felt like I had gotten it wrong.
Right. You defended this policy change in your newsletter,
The Interface. How did your readers respond?
Just to give you some flavor of the response that I got, one person wrote,
I am canceling this crap feces book fanboy newsletter. You are obviously willfully
ignorant to who and what you write about where Feces Book is concerned.
Your giddy little exclamations exhorting Feces Book are transparently false.
Oh, wow.
He was very excited that he'd come up with Feces Book.
He was definitely going to use it three times in a three-sentence email.
Did you get anything more constructive than that?
I've gotten two really constructive pieces of feedback from people.
One thing that people seem comfortable with is
they would be okay with Facebook allowing political ads
if they didn't allow micro-targeting.
And the thought there is it'll be harder to go after,
I don't know, like really vulnerable or low information voters
with some sort of obvious lie
if you have to address the entire nation at once.
And then the other piece of feedback that I've gotten
that's very popular is
Facebook just shouldn't accept any political ads whatsoever.
Pinterest does that.
TikTok does that.
You can argue whether that is a truly neutral position
and whether it might not just help to preserve the status quo. But among people who want to see a different situation than we have
now, those are two of the most popular ideas. The last time we spoke, you told us about how
Mark Zuckerberg and Elizabeth Warren had been butting heads lately, calling each other out in
meetings and online because she wants to break the company up. I imagine she has some thoughts
on this policy change at Facebook.
This has been catnip for Elizabeth Warren, right?
She has been tweeting about it.
She has launched a fundraising campaign around it.
And the cleverest thing that she did
was she called Facebook's bluff.
She made an ad with a lie in it
in which she posted a picture of Zuckerberg meeting Trump and said,
Breaking news. Mark Zuckerberg and Facebook just endorsed Donald Trump for re-election.
You're probably shocked and you might be thinking, how could this possibly be true?
Well, it's not. Sorry, it continues.
But what Zuckerberg has done is given Donald Trump free rein to lie on his platform
and then to pay Facebook gobs of money to push out their lies to American voters. If Trump tries to lie in a TV
ad, most networks will refuse to air it. But Facebook just cashes Trump's checks. Facebook
already helped elect Donald Trump once. Now they're deliberately allowing a candidate to
intentionally lie to the American people. It's time to hold Mark Zuckerberg accountable. Add your name if you agree.
Was Elizabeth Warren right? Will most TV networks refuse to run the ad that Facebook was allowing
here? So I'm not an election law expert, but my understanding is that big broadcast networks like
ABC, CBS, NBC, generally speaking, do not fact check these kinds of ads.
And that was basically Facebook's point when they tweeted back at her.
Looks like broadcast stations across the country have aired this ad nearly a thousand times as required by law.
FCC doesn't want broadcast companies censoring candidates' speech.
We agree it's better to let voters, not companies, decide.
And then they added the hashtags
FCC and candidate use,
which I don't know why they would add those hashtags,
but they did that.
Wow.
Did Elizabeth Warren have anything to say back to that tweet?
Oh, yeah.
I mean, she is, uh, yeah.
Let me see if I can find her response here,
because it was good.
Quote, You're making my point here.
It's up to you whether you take money to promote lies.
You can be in the disinformation for profit business or you can hold yourself to some standards.
In fact, those standards were in your policy.
Why the change?
Is that the question everyone's asking here?
Why this change right now?
It's the perfect framing of the question for her. She is making this a moral issue,
whereas I think Facebook wanted to make it a kind of election law issue. And she is trying to take
that move away from them by saying, no, no, no, we are going to talk about whether what you're doing
is ethically correct. And, you know, that's not an issue that the company can speak to because
ultimately it's one person making that decision and it's Mark Zuckerberg. And I think he is
very reticent to wade in because he is already somewhat unwillingly in the middle of a big feud
with the potential future president of the United States. So after all the back and forth, do you, Casey, still think it was the right policy change for
Facebook? I definitely think about it differently than I did the day that the policy was announced.
I still think there is a good case for Facebook trying to stay out of elections. I think there's a good case for a democracy's citizens to take
the primary responsibility for sorting fact from fiction with the help of a free press
and not outsource that responsibility to a for-profit corporation. Where I think I've
changed though is that I now believe that if Facebook allows this policy to continue
through the election, it is likely that it will still play a huge role in the outcome of that
election. Facebook has historically spread really inflammatory speech. And so if you're giving
politicians license to lie, it's quite possible that those lies are going to spread very far and very fast.
And in fact, this Biden Ukraine ad that's at the center of the controversy has already been seen five million times, even though it's a lie.
So this is not a theoretical fear, right?
These lies can spread very far.
And so this policy of neutrality is probably going to still wind up having a big effect on the election.
And that's not even taking into account, right,
the profit Facebook makes from these sorts of dubious ads.
And this is just where I think it gets very dangerous for them.
Because now, every time a politician tells a lie,
people in the media and their users are going to turn around and say,
not only are you making the world dumber by spreading misinformation, but you're making a profit from it.
Because of all the blowback that's going to generate every time another one of these ads goes viral,
I think it's just going to create an endless series of PR headaches for the company.
And it's the sort of PR headaches that the company hates and wants to avoid. So while I understand where the company was coming from, and I appreciate that
they were trying not to put their thumb on the scale, I think with this policy, the more that
I've thought about it, I think they're putting their thumb on the scale anyway. And I think that
ultimately, it's not tenable, and they're going to have to change it. It reminds me of that question
in the internal Facebook Q&A that you leaked that we featured
a few weeks ago on the show where someone was asking like, what are we supposed to tell
our friends who don't like Facebook anymore?
This policy change about political speech doesn't seem like it's going to help.
Exactly.
And that's why I think Facebook may eventually change this policy. If it cannot
attract and recruit and retain the best employees, then the company doesn't do as well and all sorts
of bad things start to happen. So you can just imagine being a software engineer in the Bay Area,
and you go get drinks with your friends this weekend, and your friend says, hey, I heard you
guys are letting politicians lie and you're taking money for it. Like, what's the deal with that? If Facebook can't
give a good answer to its employees, I suspect we might see a lot of employees get very frustrated
by that. And so that's ultimately the point of leverage that I think will cause this policy to
change. It just feels like at best,
Facebook is sort of in the middle of some kind of identity crisis.
And maybe at worst,
it's just like trial by fire over there,
which is kind of wild to think about
because these are very smart people
who are very well paid
and our elections hang in the balance.
I mean, what are we supposed to take away
from all of this?
I mean, one of the things I take
away is that too much power has been concentrated into the hands of too few giant technology
companies. Another takeaway that I have is that questions about what belongs on the internet and
what has to come down are some of the trickiest problems that we face. There are always trade-offs involved.
There is not usually a super clear answer.
And frankly, I think it's the sort of thing
that we should be having a big national discussion about.
For better or for worse,
a great deal of our political speech
now does take place on a handful of large social platforms. And I think
every American should have an informed view about essentially what do you think should be the rules
of engagement here? What should be allowed? What shouldn't be allowed? In so many ways,
we are having a reckoning over free speech in the modern age. And this story is right at the heart of it.
Casey Newton, thank you so much for making time for us today. Appreciate it.
It's my pleasure. Thank you, Sean.
Casey Newton is the author of so much more than a feces book fanboy newsletter.
He is the author of The Interface newsletter.
You can find it over at The Verge where Casey reports on technology and democracy.
This afternoon, just a few hours ago, Mark Zuckerberg spoke at Georgetown University about free speech on Facebook.
We don't fact check political ads. We don't do this to help politicians,
but because we think people should be able
to see for themselves what politicians
are saying. He acknowledged
how unpopular this new policy
of allowing lies in
political ads might be, and
he said Facebook has considered
banning all political ads
from the site. But
political ads can be an important part of voice,
especially for local candidates and up-and-coming challengers and advocacy groups
the media might not otherwise cover.
It's that way they can get their voice into the debate.
Banning political ads favors incumbents and whoever the media chooses to cover.
But for now, Facebook is sticking with its decision to allow lies in political ads.
Even if we wanted to ban political ads,
it's not even clear where you'd draw the line.
There are many more ads about issues
than there are directly about elections.
Would we ban ads about healthcare or immigration
or women's empowerment?
And if you're not gonna ban those,
does it really make sense to give everyone else a voice
in political debates except for the candidates themselves?
So there are gonna be issues any way you cut this.
And I believe that when it's not absolutely clear
what to do, that we should err on the side of greater expression.
It's going to be one hell of an election, folks.
But we're here for you.
This is Today Explained. Thank you.