Pivot - Land of the Giants: The Facebook Election
Episode Date: September 6, 2022
Today, we're sharing a special episode from Land of the Giants: The Facebook/Meta Disruption. To listen to the full series from episode one, follow Land of the Giants to get new episodes every Wednesday. https://podcasts.voxmedia.com/show/land-of-the-giants
Facebook used to brag about how its tools helped politicians swing elections. Now, the platform’s relationship to politics is much more complicated. Today: the story of how one politician, again and again, forced Zuckerberg to confront his own role in democracy.
Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Support for Pivot comes from Virgin Atlantic.
Too many of us are so focused on getting to our destination that we forget to embrace the journey.
Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in.
On board, you'll find everything you need to relax, recharge, or carry on working.
Lie-flat private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you.
Check out virginatlantic.com for your next trip to London and beyond,
and see for yourself how traveling for business can always be a pleasure.
Hi, everyone. I'm Kara Swisher, and today we're excited to share a special episode of Land of the Giants from our friends over at Recode and The Verge.
This season, The Facebook/Meta Disruption, is hosted by Recode senior correspondent Shirin Ghaffary and The Verge deputy editor Alex Heath. Land of
the Giants is a podcast that explores how the biggest tech companies rose to power and what
they're doing with that power. Past seasons have covered Amazon, Google, and Apple, but this season
they're covering Facebook, also known as Meta, the company that defines social connection, community,
and identity on the web.
In this episode, you're about to hear the hosts explore Facebook's effect on American elections.
Facebook used to brag about how its tools helped politicians swing elections,
but one politician again and again forced Zuckerberg to confront Facebook's role in democracy.
If you want to know more about how Meta is affecting our lives, culture, and government,
follow Land of the Giants wherever you find your podcasts.
New episodes are out every Wednesday. Enjoy the episode.
A student at Harvard builds a website for ranking classmates on campus by hotness.
Then he builds a college directory. Facebook's origin story is well known. That's why it was curious when, in October of 2019,
Mark Zuckerberg tried to recast it
around a much loftier idea.
Back when I was in college,
our country had just gone to war in Iraq.
And I remember feeling that if more people had a voice
to share their experiences,
then maybe it could have gone differently.
This is Zuckerberg framed by a giant American flag
speaking to a packed room at Georgetown University.
And those early years shaped my belief
that giving more people a voice
gives power to the powerless and pushes
society to get better over time.
The timing of this speech was pretty revealing.
Next on his D.C. itinerary, in just a few days, Zuckerberg would testify before Congress.
It would be his second time getting grilled on Capitol Hill; he had appeared the year before to explain how Russia weaponized Facebook ahead
of the 2016 U.S. election. There was growing concern that Facebook, with its more than 2 billion users,
had become too powerful. Concerns politicians had campaigned on.
Yes, Mark Zuckerberg, I'm looking at you.
And here was Zuckerberg giving a speech in the nation's capital,
trying to position his company as something that was much bigger than a company.
People having the power to express themselves at scale is a new kind of force in the world.
It is a fifth estate alongside the other power structures in our society.
A new pillar of society.
I understand the concerns that people have about how tech platforms have centralized power.
But I actually believe that the much bigger story is how much these platforms have decentralized
power by putting it directly into people's hands.
With pressure mounting on Facebook to take a harder line on what speech it allows, here
is Zuckerberg doing something else entirely.
Distancing himself from that responsibility
and then branding it as good for the world.
But as it turns out,
reality is much messier than high-minded philosophy
because one politician would again and again
force Zuckerberg to confront his own power
to curtail political speech and even sway elections.
This is Land of the Giants. I'm Alex Heath. Today, the story of how Facebook comes to reckon with its own role in democracy.
Back in the early 2000s, Katie Harbath was working for the Republican Party on Senate election campaigns in Washington, D.C.
Then Facebook called.
The company actually approached me following the 2010 election cycle because the goal of the company was still to try to get more influencers, celebrities and politicians using the platform more.
Believe it or not, at the time, Facebook's
role in politics felt full of potential. The Arab Spring kicked off in December 2010, becoming a
symbol for how social media could fuel democratic revolutions. Harbath joined Facebook as one of
its first political strategists on the policy team. They knew that they were going to have,
you know, more Republicans running for president in 2012 and President Obama would be running for reelection.
Obama's 2012 campaign ended up being a wild success story for the company.
His team used Facebook as a tool to find potential voters and tailor messages to key demographics.
Facebook and the Obama administration were pretty cozy basically right up to the 2016
election.
Here's President Obama in June of 2016 hosting a panel at
Stanford with none other than Mark Zuckerberg.
There's a good-looking group and I could not wear a t-shirt like Mark for at
least another six months but I will take off my jacket so that I don't look too formal.
Soon. Soon.
Soon.
It's going to happen soon.
Given how well Facebook had worked for Obama's campaign,
Harbath says the company was looking ahead to the 2016 election with optimism.
The goal: for history to mark 2016 as the Facebook election.
We really wanted Facebook to be seen as this place where major conversations of things that were happening in the world were happening on the platform.
And that it was seen as a crucial tool that candidates needed to be using to engage with voters in something that if they weren't on, it would be much harder for them to win.
And if 2016 was going to be the Facebook election,
then Donald Trump would be the Facebook candidate.
We need somebody that literally will take this country and make it great again.
While Trump was known for his incessant tweeting,
Facebook was actually the platform he relied on the most for his campaign.
Here's Brad Parscale, the architect of Trump's digital campaign,
talking to 60 Minutes in 2018.
Twitter is how he talked to the people.
Facebook was going to be how he won.
The campaign's secret: taking full advantage of Facebook's advertising system.
Parscale says the campaign would test thousands of variations of ads every day,
with each ad hyper-targeting specific messages.
Andrew Bosworth, who ran Facebook's ad platform at the time and is now the CTO of Meta,
has said Trump ran the most effective Facebook campaign in history.
But of course, it wasn't just ads.
Trump also used Facebook to talk directly to his voters.
We really didn't start to see some of these really difficult conversations around candidate and politician speech until December of 2015.
That was when then-candidate Trump posted something that would push Facebook
into an uncomfortable and new kind of political position.
Donald J. Trump is calling for a total and complete shutdown
of Muslims entering the United States.
I do think that was a turning point,
just in that, one, we hadn't had somebody post something like that
that was in that kind of position before
that we would have had to take punitive action.
That's Crystal Patterson, a former Democratic lobbyist for Facebook who
joined in 2014. We'd have members of Congress who would block people from their pages. And it was
like, is that legal? Those are the kinds of questions we were grappling with. Not what do
you do when the leading candidate for president posts an attack on an entire, the biggest religion
in the world? Like, you know, what do you do with that? Exactly.
This Trump post was a big deal inside Facebook.
Many employees had never grappled with this possibility,
that their platform could be used to stoke division like this by a presidential candidate.
And Trump's video targeting a religious group seemed to directly violate Facebook's rules.
Internally, Facebook employees pushed Zuckerberg to take the post down.
This was something that would be removed if it came from a regular user.
Shouldn't the rules be the same for an especially powerful person?
But Zuckerberg refused. Instead, he wrote on his public Facebook page, quote,
If you're a Muslim in this community, as the leader of Facebook, I want you to know that you are always welcome here.
He was no fan of Trump, but pulling down speech by a major political candidate was not something to be taken lightly.
The episode brought up a bigger question about what Facebook's role was in the democratic process.
Here's Katie Harbath. There are a lot of questions about what role should a tech company have overall in moderating any of this sort of
political debate. And so any decision around taking anything down would have been incredibly precedent-setting. The decision not to touch Trump's post, that was also precedent-setting.
Crystal Patterson. I'll just speak for myself. Like it was the first time I felt like
we were being, what's the word, pliable with how we were going to apply the community standards.
I used to be able to tell you chapter and verse what would be OK and what wouldn't on the platform.
And now all of a sudden this felt like the target was moving a little bit.
And the rub was that we were doing it for someone who had had real impact.
This Trump post would have a real impact on Facebook's policies, too.
The company started carving out an exception to its rules called newsworthiness.
The gist.
If a piece of content broke its rules but was deemed historically significant or of public interest, Facebook would leave it up.
So funny, I sound probably like more of an apologist
than I am, but I think there's a real effort
to try to not be a factor in these debates.
Like, they'd rather keep hands off as much as possible.
But that hands-off approach,
it wouldn't be easy for Zuckerberg to maintain.
Because of course, what happened next was,
Donald Trump became more than just a presidential candidate.
We sat here 12 hours ago and we looked at an electoral map that seemed impossible.
CNN can report that Hillary Clinton has called Donald Trump to say that she will not be president.
The shockwaves are already being felt around the globe.
On November 10th, 2016, Zuckerberg sat down for an interview at the
Techonomy conference in California. It was on this stage that Zuckerberg would say something
that would haunt him for years. Personally, I think the idea that, you know, fake news
on Facebook, of which, you know, it's a very small amount of the content,
influenced the election in any way, I think is a pretty crazy idea.
Even before the full scope of Russia's meddling on Facebook was known,
questions were being raised about the company's role in the victory of Donald Trump.
One part of this that I think is important is
we really believe in people, right?
And that they can, like, you don't generally go wrong
when you trust that people understand what they care about
and what's important to them.
And you build systems that reflect that.
Things had in fact gone very wrong.
How much revenue did Facebook earn from the user engagement that resulted from foreign
propaganda?
So wrong that two years later, Zuckerberg would be hauled before Congress for the first
time.
One of my greatest regrets in running the company is that we were slow in identifying
the Russian information operations in 2016.
Russian-backed operatives had used Facebook
to the fullest extent to stoke division
in the United States leading up to the 2016 election.
They made Facebook pages with names like Army of Jesus
and paid for ads comparing Hillary Clinton to Satan.
They even set up real-life dueling protests
using Facebook events.
Facebook admitted it sold at least $100,000 worth of Russian ads.
And the spread of regular content was much bigger.
In total, Russian-backed Facebook posts reached 126 million people ahead of the election.
Any goodwill left for Facebook in D.C. was gone.
And Democrats, who had cozied up to Facebook in the Obama years, were especially furious.
Safe to say, the Facebook election hadn't gone the way the company thought it would.
I just felt culturally the organization was in quite a shell-shocked state, quite a defensive sort of crouch.
This is Nick Clegg. He joined in late 2018 to lead Facebook's policy and communications teams.
He's now the company's president of global affairs, reporting directly to Zuckerberg. Clegg isn't your typical Silicon Valley executive.
He was the deputy prime minister of the UK from 2010 to 2015. Then he lost his seat in parliament
entirely in 2017. And Facebook came calling. My first conversation with Sheryl Sandberg,
I remember I was sort of halfway up a mountain in the Alps on a hiking holiday. And I mean, it was so sort of
left field from my point of view that I just, I said, no, there's no way I'm going to go work
at Facebook and move to California. But Sheryl is persuasive and persistent.
Sheryl Sandberg was once the second most powerful person at Facebook. In 2008,
she became the chief operating officer.
And in early 2022, she announced she was leaving.
While it wasn't clear to the outside world at the time,
her hiring of Clegg was a key step in setting up her departure.
Before Clegg, Sandberg had been the top political figure at the company.
But with Facebook's reputation in shambles,
it was time for someone else to step in and reset things.
I came to this company partly because I said to Mark,
you know, in all our endless conversations before I joined,
your fundamental problem is the people's perception of your power.
The perception that Zuckerberg has too much of it, that is.
Not an unreasonable take.
Let's talk about why Mark Zuckerberg is uniquely powerful for a second. It's because he essentially can't be fired.
Zuckerberg owns a special class of company stock that effectively gives him majority
voting power over all company decisions, including acquisitions or who sits on the board of directors.
It's a strategy he borrowed from the co-founders and former leaders of Google, Larry Page and
Sergey Brin.
Zuckerberg has shown no desire
to give away his voting control or his CEO title.
But around the time that Clegg joined,
he was looking for a way to offload something else.
I understand that people are concerned
about how much control we have
over how people communicate on our services.
And frankly, I don't think that we should be making
so many important decisions about speech on our own either.
So that's why we're establishing
an independent oversight board,
for people to appeal our content decisions.
Enter the oversight board.
That's how Zuckerberg described it at his Georgetown speech.
The idea was the oversight board would function
like a Supreme Court for Facebook.
It would be staffed by people who weren't Facebook employees and be able to make binding
judgments on certain kinds of content decisions.
If Zuckerberg was Facebook's executive branch, this would be his version of checks and balances.
Facebook would fund the board, but would do it through a trust so that once it was set
up, the company couldn't meddle.
Building this institution is important to me personally
because I'm not always gonna be here.
And I want to ensure that these values
of voice and free expression are enshrined deeply
into how this company is governed.
For someone who gave himself outsized control
of the governance of his company,
this was a bold statement.
And one met with skepticism.
I cannot tell you how many people told me that this was just this incredibly stupid PR stunt exercise and that it was going to be a waste of time.
Kate Klonick is a professor of law at St. John's University.
She spent over a year embedded in the making of the Oversight Board and wrote about it for The New Yorker.
It's no good if it's never going to have independence or any buy-in if it just comes,
kind of, as someone told me, like fully sprung from the head of Mark Zuckerberg, like Athena from Zeus.
According to Klonick, there were early questions about just how powerful Facebook would let the oversight board be.
There was some fear that they could change or kill News Feed.
Yeah, that was something that particularly I think the product managers were concerned about.
As we explained in our first episode, the News Feed has been at the center of Facebook's controversies since the very beginning.
But, at least to start, that wouldn't be the Oversight Board's focus.
After over a year of talking to human rights and policy experts around the world,
the Oversight Board's initial remit was narrowed to this.
It would offer the final word on cases where Facebook users appealed the company's decision to remove their content.
Facebook itself could also refer content decisions to the board for advice,
though the board's recommendations wouldn't be binding.
Not exactly a Supreme Court,
but still more power than any other tech company had handed over to an outside body,
which also gave insiders pause.
There is something about not opening the door
that means that you can control the message
and that you can control the image of your product.
And the second you start a dialogue, you open yourself up to more liability and you open yourself to disappointing
people and people having expectations of you and you not meeting them.
So why open yourself up to this kind of scrutiny? Nick Clegg has some thoughts.
Keep in mind, this is his boss he's talking about.
He is immensely reluctant to deploy the very considerable power he has to curtail, marshal,
limit, curate human speech. I think he's almost intuitively, and it's very smart, he understands
that the very act of doing what he's often pressed to do will only exacerbate people's
concerns about unaccountable power.
It's not necessarily altruistic.
It's also a giant shit sandwich he has to eat every day.
Like, who wants to eat that every day?
No one wants to eat that.
Zuckerberg was no longer just a tech CEO.
Being the leader of his so-called fifth estate made him a political lightning rod.
And all of this, according to Clegg, was distracting. I think we'd found through the Trump years that it had a somewhat paralyzing
effect on the whole of the kind of senior leadership of the company every time these
blowups happened. And look, you know, Mark has to run a company. His great passion and expertise
is on the sort of product side of things. The oversight board would help make some of the most
controversial content decisions at Facebook.
But the board would take another two years to set up.
Two more years of Trump in the highest office.
After the break, Zuckerberg is drawn back
into the responsibility he wanted out of.
George Floyd!
George Floyd!
In May 2020, George Floyd was murdered in Minneapolis by police.
Protesters gathered
around the country
to express their grief and outrage.
President Trump, of course, weighed in online.
Speaking about protesters in Minneapolis,
Trump posted a statement on Facebook and Twitter
that included a phrase with a racist, violent history.
Quote, when the looting starts, the shooting starts.
Another moment that was just,
the post itself was shocking. And also inside the company was just a turning point, I think,
in terms of a lot of simmering frustrations coming to the surface on a lot of fronts.
Crystal Patterson again, then still working on Facebook's policy team.
And also at that point, we hadn't hit January 6th yet, but we knew Trump's supporters are
empowered when he makes comments like that to take action.
And maybe it wasn't shooting, but, you know, they just feel kind of the moral authority
to go out and push the language that he uses, too.
If you hear it from the leader of the free world, it must be OK to say and say it with your whole chest.
It felt to many inside Facebook that Trump's statement put the lives of Americans, particularly black Americans, in danger.
Once again, Facebook had to decide what to do about Trump.
But this time, he was the president.
Nick Clegg was in the room for the debate.
Here's how he described the arguments for and against taking the post down.
The argument for was that it was a post that could be interpreted as sort of crossing a line in terms of inciting violence.
The argument against was
heads of state have a right to say they are thinking of deploying the force of the state,
and that would equally be a very big thing, a very big thing for a private company to say no.
Again, Zuckerberg decided to leave the post up.
I think Mark, in the end, felt as the ultimate decision maker
that the argument that, you know, you're crossing a Rubicon if you're going to start
saying that heads of government can't threaten to deploy a force to restore order, that that
sets a really quite a worrisome precedent. The blowback was intense. Hundreds of Facebook
employees staged a virtual walkout to protest the decision, posting messages about it on Twitter. To this day, it's the most public display of dissent from within the company.
They were livid and pressed Zuckerberg.
He said he didn't read Trump's comments as a dog whistle for vigilante violence,
despite the history of the phrase.
It had been used by segregationist cops while violently cracking down on civil rights protesters in the 1960s.
Crystal Patterson found the whole episode disillusioning.
She left Facebook the next year, in 2021.
When you're hands-off when someone's hurting another person,
you know, you're part of the problem.
You'll never take back our country with weakness.
You have to show strength and you have to be strong.
We have a breach of the Capitol.
Breach of the Capitol.
They broke the glass in the United States Capitol
and now they are climbing through the window.
This happened a moment ago.
I know you're hurt.
We had an election that was stolen from us.
The next time Trump forced Facebook into a corner, Zuckerberg had become less involved in content policy.
This time, what to do about Trump and the January 6th insurrection would fall to Nick Clegg.
So by that stage, Mark had anyway sort of decided that he was just keen to let me kind of take those decisions more fully on his behalf. Oddly enough, it wasn't a hugely difficult decision. It was
pretty clear that what had been
shared on Facebook in the run-up to the insurrection was clearly aimed at seeking to
interrupt the peaceful transfer of power, was clearly contributing to the violent kind of mood
of the time. And so it wasn't actually a very complicated question: this was just simply not something that we want on our platform.
Trump crossed a line the company believed he had not crossed before,
past the point of newsworthy and into dangerous.
And in that sense was a classic example of exactly the limits to the kind of ethos that
Mark set out in the Georgetown speech. Of course there are limits to speech.
In retrospect, Clegg makes the decision to ban Trump
sound simple and definitive.
But it was actually a series of escalating decisions.
First, during the insurrection on Wednesday,
Facebook removed the video Trump posted
in which he told rioters to go home,
but also said, quote,
we had an election that was stolen from us.
Less than an hour after that,
Facebook took down another post where Trump wrote,
remember this day forever.
Then, based on those two violations,
Facebook went a step further
by putting a 24-hour block on Trump's ability to post.
It was the next day, on January 7th,
when Zuckerberg announced that the company
was banning Trump indefinitely.
But Clegg wasn't the only one who'd be on the hook for this decision.
Of course, we wanted to see whether our approach made sense,
so we referred it to the Oversight Board.
Which had just started taking cases in October, a few months before.
Yes, there was a fear that we would be, I would be used as a PR object.
That's Julie Owono, a member of the Oversight Board.
We left off with the board when it was just an idea.
Now it was real.
Owono is the director of a nonprofit called Internet Without Borders.
Her colleagues on the board include the former prime minister of Denmark
and a Nobel Peace Prize laureate.
And they were all wading into the most controversial,
high-profile content decision in Facebook's history.
It's former President Trump, you know? There was an election and a significant part of the
American electorate voted for him. So who are we to come into this conversation? But what I
appreciate is that the conversation was not about the election, was not about our role.
We focused on the rules.
What are the rules on Facebook and also beyond Facebook and international human rights law that could justify or not what has happened?
The board ultimately agreed that Trump had violated Facebook's rules, but it took issue with Facebook's ban being indefinite.
It just didn't make sense.
Like a judge sentencing someone to jail for an undetermined amount of time.
Oversight Board member Julie Owono.
We told them that the indefinite aspect of the sanction, we didn't find that anywhere in the text, in the community standards, in the terms of service of the company.
And so we said, this is totally arbitrary.
You cannot come up with indefinitely, so give us a deadline. The board kicked the decision back to Facebook, pushing it to clarify whether Trump would be permanently banned or not. I mean,
they quite sensibly sort of slightly ducked the issue, but they equally made valid criticisms
that the way in which we had announced and explained our decision making wasn't entirely
sort of precise enough and in conformity enough with published rules. We arrived at this decision
to suspend him for two years and he will come back onto the platform in January.
January 2023. Depending on one thing.
They will do an assessment, an impact assessment of the potential impact on human rights and potential
violation of the community standards that this return could represent.
According to Facebook, this report will measure whether the risk to public safety has receded.
If Facebook determines that Trump's rhetoric continues to be a threat, it could re-extend
its ban.
Two things are true here.
Yes, the Oversight Board pushed Facebook
to more clearly define how it handled Trump.
But this Trump case also showed
just how far the Oversight Board has to go
before it's anything like a Supreme Court.
Because something else came up
when the board got involved.
When we were working on the
Trump case, we did ask the company whether or not, you know, that there was exemptions for
this particular profile. The board's question to Facebook was, did the company have some internal
rule about how it treated high profile people differently than everyone else? A list, maybe,
of users evaluated with extra care?
Facebook said yes.
They just told us, you know, it's just a few, just a few people.
It's not just a few people, though.
No, it's not.
The program was called cross-check. In a published report, the board said that Facebook first told them this program
had only applied to a, quote, small number of decisions.
But then, a former Facebook product manager
named Frances Haugen leaked documents
revealing that in 2020,
the program actually covered 5.8 million accounts,
most of which were celebrities and politicians.
If you were on the list,
you weren't subject to the same rules as everyone else.
After Haugen revealed the full scope of cross-check, the board said Facebook acknowledged that
the phrase small number was misleading.
This all drives home a truth about the Oversight Board.
For now, Facebook is still ultimately the one with the power.
Here's Nick Clegg again.
So look, it is not a Supreme Court.
In a sense, it's unfortunate that people keep using
that analogy. It is the Facebook oversight board. It is what it says on the tin. I think what it has
done within the parameters that were set at the outset has way exceeded my expectations.
The thing is, even though the board is limited in what it can do now,
both Facebook and the board want its power to expand over time. The company just gave the
board another $150 million. I asked Julie Owono what specific power the board might push for next.
So we haven't in the past shied away from talking about things that we were not supposed to talk
about initially. That includes algorithms.
But wanting that power and getting it from Facebook,
those are different things.
In the meantime, this topic of algorithms,
how Facebook and Instagram decide what we see in our feeds,
that's what we want to turn to next.
Because right now, Zuckerberg is making a fundamental pivot to a new kind of social media,
a feed that places even more power in his company to determine what we see.
In our fifth episode, Facebook defined an era of social media built on our connections, our social lives.
We're watching that era come to a close.
So what is next?
News clips of Elizabeth Warren from CBS 8 San Diego.
Trump election clips from Morning Joe, CNN, and CBS This Morning.
George Floyd protest clip from the Associated Press.
January 6th news clips from CNN and CBS News.
Congressional clips from C-SPAN.
Land of the Giants: The Facebook/Meta Disruption
is a production of Recode, The Verge, and the
Vox Media Podcast Network.
Megan Cunane is our senior producer.
Oluwakemi Aladesuyi is our producer.
Production support for this episode
from Cynthia Betubizo. Jolie Myers
is our editor. Richard Sima
is our fact checker. Brandon McFarlane
composed the show's theme and engineered this
episode. Samantha Oltman is Recode's editor-in-chief. Jake Kastrenakes...