Big Technology Podcast - The Motivations Of Facebook Reporters, And Their Sources — With Ugly Truth Author Sheera Frenkel
Episode Date: August 11, 2021
Sheera Frenkel is a New York Times reporter and author of the best-selling book An Ugly Truth. She joins Big Technology Podcast to discuss her hit book and her reporting process. We address critics' claims that Facebook reporters are harsh to the company because they're mad Trump won, and because they're upset that social media is eroding their gatekeeping power. Frenkel listens to these critiques and shares her perspective.
Transcript
Hello and welcome to the Big Technology Podcast, a show for cool-headed, nuanced conversations of the tech world and beyond.
Today we're joined by Sheera Frenkel. She's the author of the best-selling book An Ugly Truth: Inside Facebook's Battle for Domination and a former colleague of mine from the BuzzFeed News days.
Sheera, welcome to the show.
Hey, Alex. Thanks for having me.
Hey, I've enjoyed watching your reporting journey after we've both left BuzzFeed.
And I remember sitting next to you in the BuzzFeed newsroom when you wrote the tweet,
everything's been hacked.
We just don't know about it yet.
And it turned out that was pretty prescient, especially in the case of Facebook.
So it was nice reading a full-length book about how that was exactly what happened in the case of Facebook.
Yeah, I think I was just getting started in cybersecurity reporting and just sort of coming to terms myself with how much of the internet had been hacked, and we just didn't know that yet.
Yeah.
So let's get right into your book.
I want to start a bit meta because to many people,
the criticism from journalists about Facebook boils down to two things.
One thing they say is that journalists are just mad that Trump won and they're taking it out on Facebook.
The second thing they say is that journalists are mad that they're losing their gatekeeping powers.
And now everyone has a voice and they take that out on Facebook.
So I guess I'd like to start there.
So first question to you is, do you think Trump would have been elected without Facebook?
And do you think some of the reporting that's been done about the company, I'm not talking about yours necessarily, but some has been motivated by anger over his victory?
You know, I don't think myself, or really any political analyst who's much better qualified than I am to talk about Trump's campaign, would be able to tell you whether or not Trump would have won if it wasn't for Facebook.
I think we know that he was able to reach a lot of people through Facebook,
that he was an incredibly effective user of that platform, and that he, or his campaign, really understood how powerful Facebook could be as a megaphone for people like Trump, who had quite, you know, newsworthy things to say.
What he said often was very emotive, and it was the kind of content that was built to go viral on Facebook.
But yes, sorry, to answer your first question: I don't know.
I don't think any of us do know what would have happened if social media companies,
including Facebook and Twitter, were not around for Trump's campaign.
And then in terms of your second question, I can just say that, like, as a journalist,
I've never been motivated by that.
I don't know any journalists that have.
I've heard this among some people in the tech world, and I can see how for tech companies it's a good talking point, because in their minds maybe we need a reason. You know, the reason we write about tech companies is that they're incredibly powerful. Facebook reaches over 3 billion people through its family of apps, and it has less oversight than any government I've covered in my career, and I was a foreign correspondent for over a decade. I think it's our job as journalists to cover powerful companies, and, you know, that's why we write about them.
Yeah, and there was a rosy view of Facebook after, you know, Obama used digital tools to ride to his historic victory. But I guess the difference is that with his campaign, there wasn't any foreign election meddling used to get him into office. And that did happen in some cases with Trump and Russia. Do you think that's the core difference in the tone?
I think there was some critical coverage. It's funny, there was a version of this book that was twice as long. And we had an early chapter, I still remember writing it, where we looked at the Obama campaign and how they used Facebook. And sorry, I'm talking about his re-election campaign, specifically his second run for office. And how he actually ran one of the first really big data-gathering operations on Facebook and tried to sweep up as much data as he could about the people who joined the campaign.
Yeah, during those campaigns, they were always talking about the Democrats' data advantage over Republicans.
And, you know, Obama understood that. His campaign understood how powerful that data could be. And I think he actually, in some ways,
set a precedent that, interestingly, Hillary Clinton's campaign did not choose to follow, but Trump did. And so I think, you know, the roots of using social media as a tool to really understand voter bases and reach voters were laid long before Trump. But Trump was the one to use it more effectively than anyone had before. Also, you know, remember, when Obama was running for office, Facebook wasn't as fully fleshed out as a network, right? Like, a lot of its capabilities, and, as we know, the incredibly effective targeted advertisements, really only ramped up in that interlude of time. And so Trump was able to fully take advantage of Facebook in a way that Obama couldn't. I would love to ask people on Obama's campaign: if you could have done what Trump did, would you have run those kinds of incredibly specific targeted advertisements? Would you have A/B tested the way he did?
I imagine their answer would be yes.
Oh, they would for sure.
Yeah.
Yeah.
So we talked a little bit about the reporter motivations. Let's think about the reader now. I don't think there's any question that readers have seized onto this story, and some have done so for the very reasons that others use to impugn the credibility of journalists, right? So I think a lot of liberal readers do blame Facebook for Trump's election, and, you know, that powers a lot of the sharing of the stories about Facebook and its failings, and the Russia stuff, and Cambridge Analytica. How do you deal with the reaction from readers, seeing some of these stories blow up, and seeing them interpret straightforward reporting of events and use it for these political ends?
I mean, as a reporter, what's out of your control is out of your control. And that includes what people tweet about your articles. And I'll say that I've seen a lot of tweets from people that are right-leaning, you know, admittedly, I think they say in their bios that they're Republican Trump supporters, who've tweeted about the book and have been excited about it because they think it reinforces some of their ideas about Facebook. And yes, I've seen people on the left doing the same. And each of them is reading our book through a political prism and finding in it what they want.
Ultimately, our book is neither, I would say.
Our book is a look inside the company and why they made the decisions they made.
And if there's anything I think, or I hope, people will walk away with, it's this, which is what we covered: you know, a lot of Facebook's mistakes were really ad hoc, right? Like, a lot of those decisions they make, which are hugely consequential down the line, are just a group of, like, 12 people in a room, hammering it out, trying to figure it out as they go along. There isn't this, like, grand plan motivating Facebook's actions that some people, Republican and, you know, Democrat, tend to put on that company.
Yeah, no, I agree.
It's very apparent through the book.
I think that it's worth going through these questions because, yeah, I think it's a topic of discussion that gets brought up a lot when it comes to Facebook reporting. And if this is going to be the definitive book about what happened in 2016 to 2021, you know, we should run through it so people do have a sense of how the authors themselves think about this stuff. The last question I'll ask on this topic, and then we'll move on: how do you as a reporter, and I know we're, like, doing inside baseball, but let's just go for it, how do you as a reporter think about the fact that a lot of these leaks that came out of Facebook have probably been motivated by, you know, anti-Trump Facebook employees who are trying to shape the narrative around Facebook? Like, it's very clear that Facebook employees themselves were mad about the role that Facebook played in the Trump campaign.
There was, in fact, I think you put it out there in the book, this question that got voted up to, like, the top of the list of questions for Mark Zuckerberg, asking what Facebook could do to stop a Trump election in 2016. And I imagine, you know, because Facebook didn't do that, there were a lot of left-leaning employees that were ashamed, angry, eager to see change, and ended up breaking this cone of silence that had always been around the company and starting to share some of these details with the press. So how do you sort, I mean, it's always tricky when you have a source that wants to leak, because you have to think about their motivations. But how do you sort those motivations? And particularly, when you end up going to write the story, write the book, do you say, have I heard from a majority of people who feel this way and are talking to the press because of these reasons, and I can't let that influence the direction of the story, however much they might want to? I mean, I think Facebook itself would tell you that the majority of people that work for it probably vote Democrat.
No doubt. No doubt.
Yeah. But, you know, I can tell you very honestly
that not a single person who spoke to us for the book did so because they were, you know, a Democrat-voting American who was angry about Trump winning. The people who spoke to us, every single one of them, I would say (and just to back up for a minute for those who haven't read the book, we spoke to over 400 people, and the vast majority of those are people that are still employed at Facebook), spoke to us because they're angry about what's happened at Facebook. They want to hold their own company responsible for mistakes that they feel it made. It had nothing to do with who won the elections, but rather the bigger questions of: well, we missed Russian election interference. We messed up on Cambridge Analytica. We let ourselves go down a rabbit hole in terms of political speech and allowed Donald Trump this carve-out of being able to say what he wanted, which down the road led us to allowing him to spread COVID misinformation on the platform.
Now, I actually am remembering right now, vividly, a conversation I had with one Facebook employee who was a Republican, who was like, yeah, I mean, I voted for Trump myself, but, you know, I'm still mad about how we handled things. Like, we didn't have big policy decisions going into these elections. We didn't have the foresight to imagine where Trump would take it. Like, there are just so many employees at that company who feel like their own managers, their C-suite, haven't been held responsible, and that the truth has been really tightly controlled by a PR narrative of what Mark Zuckerberg and Sheryl Sandberg think. And that's why they want to come out and speak to journalists.
Well, you know, having lived out here, just a few miles away from Facebook headquarters, for a long time, it does seem like this place is about as angry about Trump's election as almost anywhere in the country. And that's obviously the pool where a lot of Facebook employees live, the pool a lot of them are drawn from. Do you really not think that anger over the Trump election was their motivation at all for speaking? I mean, maybe the things that they were upset about were policy and stuff like that. That's probably what they came to discuss. But, you know, part of me has to imagine that that anger over Trump's victory had to at least have something to do with it. You don't think so?
I really don't.
I mean, honestly, we had so many of these conversations with people, because, as you said, you want to understand what's motivating a person to come speak to you. You want to know where a person's biases are. And yeah, it really wasn't the thing. I mean, a lot of them don't spend that much time thinking about it. They think about Trump being in the White House, and they think about Facebook's role in that. But really, like any other employee, day to day, they think about their managers, they think about their bosses. They think about what they themselves did in their role to enable misinformation to spread on Facebook, to ignore what happened in Myanmar. I mean, if you want to talk about things that motivated Facebook employees, I was shocked at how many Facebook employees came to me and were like, I stay up at night because of what happened in Myanmar. I feel really, I'm sorry, I'm not going to curse here, but, you know, really fucking responsible for that. And we sat back and all of us knew it. And why didn't we bring it up at company Q&As? And why didn't we, at all-hands meetings with Mark, say, why aren't we doing something about this? Like, there are things that haunt them, real-world deaths of people, and a lot of them, you know, they go out to bars and they drink about it. So, yeah, that came up spontaneously a lot more than, you know, I'm really mad about Trump winning. Like, that's not something anyone said.
Okay. Interesting.
I mean, yeah, you know, they're both political things. Obviously, the Myanmar situation was awful, and remains awful. It's also a really important section of your book.
So I encourage everyone to go out and get it.
Let's move on because the second thing that people say when it comes to this type of reporting is,
oh, the press is upset because, you know, it used to be the gatekeeper and now user-generated content is the way people share information.
And they want to grab that power back.
What do you think about that argument?
Yeah.
I mean, you kind of hinted at this before. I just don't think that's what motivates reporters. You know, I can say it's not what motivates me as a reporter. In no way have I ever thought of myself, or any of the journalists I know, as a gatekeeper of information. If anything, reporters think it's their responsibility to hold powerful people accountable. You know, it's cheesy, but a saying among many reporters is that sunlight is the best disinfectant. And I think for much of Silicon Valley's history, they haven't had that much sunlight. It's been a very tightly controlled message. As you know from covering tech companies, the army of PR people that work at these companies, including Facebook, their whole goal is to see Facebook and its narrative shaped in a certain way. I mean, more broadly, you know, Facebook relies on public goodwill. They rely on their public image so much. If people all over the world start to think of Facebook as really what it is, which is a company that surveils you for data because they need that for their advertising model, you're not that inclined to share, like, cute photos of your baby or your puppy. You have to think of it as, like, oh, a nice place connecting the world and making us open and connected, because that's what's going to motivate you to share things with people and join groups and join pages and do all the things they need you to do. So yes, sorry, Facebook has totally controlled its messaging, like many other Silicon Valley companies. And as journalists, it's about figuring out what's actually happening at the company. They're incredibly powerful, and that's really the reason why we report on them. And I just think anyone who thinks otherwise has to learn a little bit more
about why journalists do what we do.
Right. But I think the argument isn't that, you know, journalists don't want Facebook to control its narrative. It's not about reporting on Facebook. It's more that it used to be that newspapers and TV networks were the only places people could go for information. And they made decisions about the type of stuff they would cover and the type of topics they would discuss, and they obviously missed things. And sometimes they were too credulous when it came to what the official line was from government. And social media gives anybody an opportunity to talk. And, you know, this is the argument, I'm not saying I agree with it, but the argument is that the reporters who are covering Facebook, and not only Facebook but all social media, and talking about the need for content moderation, are trying to claw back the monopoly they had on the flow of information, which they don't have anymore.
Yeah, I can just repeat: that's never been something that's motivated me. Like, you know, I write about misinformation on Facebook and conspiracy theories on Facebook because it's a problem. And because, as someone who was physically on the ground in Myanmar, who was a foreign correspondent for 10 years, I saw it. Like, I saw firsthand what hate speech does in much of the world, what misinformation does in the rest of the world, and here in the United States, I should add, when it goes unchecked.
And so these are important things to cover because only Facebook can get a handle on it.
Only these Silicon Valley companies with their, you know, massive amounts of resources
can begin to get a handle on the problem that they themselves introduced.
So it's important to hold them to account.
And I'll note that, Alex, you know, Facebook says itself that it often takes this stuff down only when journalists bring it to them. For instance, today I was looking at a number of extremely anti-Semitic posts on Facebook. And I sent them to the company, because I'm reporting a story about this, and I said, hey, I'm reporting on this, and I found these posts. Do these violate your rules? And they were like, yeah, of course these violate our rules, we'll take them right down. I will note that those same posts were reported by multiple other people, and the company didn't take them down. So I'm one person. I'm not getting paid anywhere near as much as even, like, a project manager at Facebook gets paid. I'm not doing it because I think it should only be newspapers. Yeah, the idea that it should only be newspapers and radio isn't true. But ultimately, newspapers and radio and television stations, to some extent, have assumed some responsibility for the content that they share, right? Like, if we get something wrong, we correct it, we issue a correction, we fact-check things, we take responsibility. Facebook has said that it's not a media company. It's not doing that itself, but it needs to find a middle ground, because ultimately they themselves have said they feel badly about what's spreading on the platform. And so, yeah, right now it feels a little bit like they're trying to have it both ways, and they're trying to rely on journalists to bring to their attention everything that's wrong with the platform. And that's just not sustainable for anybody.
Yeah. I'm kind of curious what you think. What is the right balance between user-generated content and content moderation? Like, where would you draw the line? Do you think Facebook's policies are sufficient?
No, I mean, I think Facebook itself says its policies are not sufficient.
Its policies are haphazard, and they're figuring them out, and they know that.
I mean, any Facebook executive will tell you that they're still figuring it out.
So let me just ask it this way.
Where do you draw the line on content moderation?
Like, obviously, people need to be able to post stuff that's factually incorrect.
You can't take down everything that's wrong.
So where do you draw the line?
Like, there's no way that the Facebook feed can be held to the same standard as, you know, what's written down in An Ugly Truth or on the pages of the New York Times.
Right.
I mean, the first thing you do is you admit there's no line. You say, like, you know, there are a lot of gray areas. There's a lot of content from people promoting things that fall within the realm of free speech, but which are damaging. I mean, I think one thing Facebook could do is seek to work directly with government agencies, because right now there's a lot of, like, the ball being passed back and forth on the court, and sorry, that's the closest I get to a sports analogy. But it just feels like, you know, Facebook is asking the CDC or other health officials to come up with rules, and those health officials are like, no, you have to come up with rules. Like, we're almost two years into the pandemic at this point. It just feels as though someone should sit down and write rules that say, okay, it's a global health crisis, it's a pandemic, here are the things that we think are really important not to have spread. So whether that's posts that claim that vaccines are poison, or posts that tell people to burn their masks in effigy. Like, I am not, you know, a member of the CDC, nor am I a medical professional, but there are people who are those things who should be working with Facebook directly to say, here are the lines with content moderation, specifically on this issue. I think more broadly, to answer your question, you know, Facebook knows it doesn't have enough content moderators. It knows that it has a problem, especially in the rest of the world, in terms of how many people it has to moderate content in countries like India and the Philippines and, you know, many others. It knows this problem is ongoing. And despite its tremendous resources, we still haven't seen them really build up the resources in much of the rest of the world that they know they need. And therefore, it's just a matter of time before we see yet another case of hate speech leading to real-world violence in other parts of the world.
So, yeah, I would say what they're doing right now isn't enough.
They admit that themselves.
And I think the bigger question is, like, how they grew to be as big as they are
without getting a hold of these issues and figuring out some of these issues earlier on.
All right.
Why don't we talk about that a little bit more after the break?
We'll be back right after this on the Big Technology Podcast, here with Sheera Frenkel.
Hey, everyone.
Let me tell you about the Hustle Daily Show, a podcast filled with business, tech news,
and original stories to keep you in the loop on what's trending.
More than 2 million professionals read The Hustle's daily email for its irreverent and informative takes on business and tech news.
Now, they have a daily podcast called The Hustle Daily Show, where their team of writers break down the biggest business headlines in 15 minutes or less and explain why you should care about them.
So, search for The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
And we're back on the Big Technology Podcast, here with Sheera Frenkel, New York Times journalist, former colleague of mine, and author of An Ugly Truth: Inside Facebook's Battle for Domination.
So, Sheera, we were talking a little bit about content moderation beforehand. I think there's this debate that's going to just keep going, where liberals say, take down more stuff, Facebook, and conservatives say, Facebook, you're censoring us. I'm curious what you think about the debate, especially in light of the fact that I've had some discussions with conservatives here on the show recently. And they say that, you know, social media is kind of like an alternative media for them. And they don't see themselves represented in the pages of the New York Times, or on networks like CNN, or I guess even in mainstream magazines. And this is sort of the place where alternative media has formed for them. So I'm curious what you think about that read on the situation.
Sorry, tell me again, what is their read exactly?
Well, their read is that, you know, basically conservative viewpoints don't get a lot of airtime in the mainstream press. And so, of course, they're going to go to social media, just like gaming doesn't get a lot of coverage on the mainstream networks. You don't really see a lot about gaming in the Times or on CNN, so, of course, gaming is going to be popular content on social media. And so they see talk about content moderation as just, like, you know, this attempt to crack down on the content that's started to thrive outside of the mainstream media.
Yeah, I mean, I think that fundamentally the premise there isn't right. You know, the studies that have looked at Facebook show that the things that are regularly among the most viewed and most shared on the platform do come from people that are self-proclaimed conservatives. And so that's out there. There are a number of experts who have spoken out about why they think that's happening. And then, in terms of not being featured in the pages, like, I don't know what they mean by that. Do they mean not being featured on the op-ed pages of the New York Times or, you know, other leading publications? Because ultimately, what we cover as news reporters isn't conservative or liberal. And I think that's, like, fundamentally a problem here. You know, I wrote an article earlier this week about the biggest spreader of anti-vaccine misinformation, who has been named by the White House and by a number of academics and researchers as the most influential spreader of anti-vaccine misinformation. And I was sent a lot of hate mail from people who said, why are you trying to ban conservatives? Why are you trying to, you know, go after conservative content? And my answer to that was that this has nothing to do with conservative or liberal. Like, in fact, quite a few of his followers are progressive, left-wing liberals who believe in natural health and yoga. It just has to do with the fact that he spreads vaccine misinformation. And that should not be liberal or conservative. That should just be something that we all agree is bad for public health.
Yeah. You had an interesting story in the book about Nancy Pelosi. There was that slowed-down video of her that made her look like she was drunk while she was making a speech. She had been an ally of Facebook in Congress, and then she got angry about what happened and sort of said, don't take any of their calls. What would you have done with that doctored video? Taken it down?
I mean, I can't answer that. I'm not an executive of Facebook, and as a journalist, I don't think it's my role to answer that. What I would say is that they should have had a policy in place beforehand. You know, they knew these kinds of videos were going to happen. I'll remind you that Christchurch, that shooting, had happened not long before, where they had already seen how quickly something could go viral on Facebook groups and pages. And it was, for us, I think, a really stark example of how Facebook waits for a crisis to hit before they try to come up with a policy.
You know, what would Facebook say to that? I'm kind of curious. Like, there are so many different types of media. Like, they had a deepfake policy in place, a policy for synthetic videos, but this was one where an authentic video was slowed down to manipulate it. New forms of media are showing up all the time. It's, you know, part of what happens when there's new technology in place. So do you think it's a fair excuse for them to say they can't anticipate some of this stuff? Or do you think they really should have this stuff buttoned up?
You know, I think our book shows that, again and again, they're warned that this kind of stuff is coming. So, yes, Facebook's PR line will often be that they couldn't anticipate it. And one thing we tried to show with our book was that they could, because people within their own company were raising their hands and saying, this is a problem. We foresee this being a problem. Let's figure out something now. And those people are ignored. So yes, I do think they should have seen things coming. And I think that it's important for companies like Facebook to hire people who will be that voice in the room, who raise their hand and say, here's the dark side of this. Here's the bad that can happen. Here's how this product is going to get misused. And I will say, not just Facebook, many Silicon Valley companies don't have those people. They like to hire among these, like, techno-optimists, you know, this techno-utopian crowd who really want to believe that technology will advance society and do miraculous things for all of us. And, you know, I hope in hindsight some of them realize that they need a balance of opinions in the room.
Yeah.
And I think one of the most disturbing parts of the whole book was this guy who had alerted them about what was going to happen, and they were just flat-footed until it was too late.
Right.
And I mean, I'll note, I think this is, God, I hope we didn't cut this from the book. I think it's still there. We mention it a bit, maybe not as a fully fleshed-out scene. But, you know, ahead of Russian election interference, there was also a group of European lawmakers who came to Facebook's offices, I think from Ukraine, Estonia, and Latvia. They come to Facebook's offices and meet with the policy team, this is in 2015, and say, hey, Russia is wreaking havoc in our countries in every election we have. They've got a troll army, which is spreading disinformation about our politicians. We want to flag this for you and let you know. And Facebook ignores them. Like, the security team doesn't even find out about that meeting. And these are, again, people in their offices trying to raise the flag and be like, hey, here's a huge problem we see looming. Please pay attention.
And they're ignored.
Yeah. And one thing that really struck me as I read was how often Facebook obfuscated the truth. I mean, that stuff really made me mad. Just understanding that they knew that there was Russian election interference going on on the platform and just kind of hoped that it would go away. I mean, did they not think that this stuff was going to come out? I kind of wonder what you think about this. I mean, a lot of your book tour has been psychoanalyzing this company. But are they living in such a bubble that they imagine that this stuff would just go away?
It's a great question. I don't know what their motivation was. I mean, surely they must have known that eventually it was going to come out and people were going to be mad. I mean, maybe they thought they could control the messaging a little bit more.
You know, as reporters, that's always a hard thing to find out. I remember reporting on a white paper that Facebook released in March of 2017. And I had heard from a couple of sources that the paper originally had a Russia section, and that it was omitted at the request of executives. And I called Facebook's PR department, and I was told that it was categorically false. I still remember it. I remember the words: that's categorically false. There was never any Russia section. That's never something we included. And because it was such a strong denial, my editors and I decided, okay, well, you know, it's a really strong denial. Let's assume it didn't happen. And then I go and report for this book, and I find that there are multiple versions with very long Russia sections that were in fact vetoed by executives. So yes, I mean, that is a very clear example of them obfuscating the truth and not wanting things to become public at a much earlier date. And I'll remind people listening that it's not until September of 2017 that Facebook actually does go public with what they know regarding Russian election interference.
It's like a child dumping a bucket of paint on their wall and, like, not telling mom and dad because they're so nervous about what the reaction will be. But anyway, they're going to find the paint. And, like, they've pushed their bed up against the wall so you can't actually see it, but there's paint splatter all over the floor. So you're like, no, I know it's there. Just show it to me already. Show me the paint.
And I actually think a lot of the anger at Facebook was because they did wait so long. Like, I really wonder, if they had come public with what they knew sooner, whether they would have avoided a lot of the public anger on this.
Yeah. All right. We have about, let's see, six or seven minutes left. So let's take one more break. And then we'll do kind of a rapid-fire round.
This is a new thing I've been trying on the show. I've been kind of having fun with it. So I'll do rapid fire and go through some things that I found interesting. And then we'll wrap and let you go on with your day, both you, Sheera, and our listeners.
So we'll be back here in just a moment on the Big Technology Podcast.
And we're back here for one final segment of the Big Technology Podcast.
I wrote down hot takes, but maybe we'll just call it rapid fire. I don't know exactly what we need to title this section.
But we'll go into it.
So, Sheera, I wanted you to talk a little bit about Facebook's internal surveillance of its employees. They would track their phone locations to match leaker locations, and all sorts of other, you know, KGB-style tactics. What was going on there? And are they any different from the rest of these companies? It seems like they're on, like, a kind of next level.
You know, it's interesting. It's a good question. I think all of these tech companies do surveil their employees, because they're all
really worried about leaks. And I'll note that, you know, interestingly, a lot of the worst leaks
that Facebook and other companies have had have been employees that are actually selling proprietary
company information to other companies, to competitors.
And so that's a big part of what they're worried about. But, you know, they're also worried about employees talking to reporters. And so, yeah, I mean, I've been aware of it for a long time, because Facebook sources were nervous about meeting me. They didn't want me to bring my cell phone when I met with them, and they weren't bringing their cell phones. So I was aware that whatever technology Facebook was using was enough to at least know real-time location. But I think I was surprised at how aggressive they were in trying to find leakers. And, not to get too meta, but definitely in reaction to this book, I know that they've tried very, very hard to find our sources. They have not found any yet, but that is a focus of the company.
Wow. I do, like, remember these anecdotes of Zuckerberg announcing to the company that they had fired a leaker and everybody clapping. Now, I understand, you know, when you leak, I can understand how it can feel like a betrayal to the company, especially if you go out and just, like, share the roadmap. I mean, of course, as journalists, we want that. But then, honestly, if your company continues to fail and people are out there speaking because they want to make it better, you know, I think oftentimes the right reaction is to ask why people are leaking, and not, how do we catch them and fire them. Okay. Anyway, that little ramble is done. Let's move on to the next question I had for you. I think you even used the term surveillance capitalism in the first segment, or something close to it. Sheera, are these ads really that bad? I wonder who gets hurt, and whether this term surveillance capitalism is actually kind of a little dramatic, given that we're looking at, like, ads that aren't even that well targeted, for boots and stuff like that.
You know, I think the term surveillance capitalism came from Shoshana Zuboff, who we quote in the book, and who wrote an excellent book about this. You know, we don't use it ourselves, I don't think. I think that, you know, yes, the advertisements can be very useful. Sometimes it's nice that the advertisements are targeted to your interests.
I think what's important for people to understand about that term, and in general what we hope people take out of our book, is what makes Facebook work as a business, right? Like, people should understand that Facebook is in the business of collecting your data. They need you on the site as long as possible, for as many hours of the day, in order to be effective as a company and continue to show the kind of growth that their investors have gotten used to seeing.
And so whatever they do is incentivized by that.
You know, the one comparison I've made in some of the interviews we've done regarding the book is that I think of it a little bit like sugar. Everyone knows a lot of sugar is not great for you, but we accept a little bit of sugar in our lives. I think people should start thinking of social media like that. Sorry if this is overly simplistic. I know your audience is much more tech savvy than that.
Well, look, I feel like either in the coming weeks, or we've already run it, we have a conversation with Nir Eyal where we talk about the same stuff. So you're right on target.
Yeah. Think of it like sugar. Like, a little bit of sugar in your life is inevitable. It tastes good, and it's nice to connect with friends. And Lord knows, I have cute kids, I like sharing photos of them on the internet. But a lot of it I know isn't great for me. I can spend hours on Facebook scrolling through posts that get me angry. I know that that's not great for me or my mental health, and so I don't do it.
Yeah. You even have a point in the book where someone, one of the Facebook engineers, points out the type of content that it takes to get people that engaged, and asks: is this really what we want to show? And I found that so interesting, because, you know, people talk about censorship and all this stuff, and actually, it's all editorial decisions. Like, they can decide what they want to show up there, whether they want highbrow or lowbrow, or left or right, or whatever. But sorry.
Right. And we have this point at the very end of our book, in which we talk about how, around the November 2020 elections, Facebook made the decision to more prominently feature news sites that had a high level of accountability and of fact-checking. So they have this scale on which they rank all the different news outlets. And the ones that were ranked lower, as in they were repeatedly flagged by fact-checkers for sharing things that were false, were downranked by Facebook's algorithms. And the ones that were higher up on that scale were upranked. So basically Facebook made this decision that, we're in a period of elections, and it's really important for people to receive truthful information, right? Like, let's make sure that the top of their news feeds has information from news sources that are vetted. And so they kind of make this calculation because they're like, oh, we're in this pivotal moment of democracy. But I guess I'd ask, like, wait, what about the last four years? What about everything before? And, by the way, I will say that that only lasted for about a month before they reversed course on that decision. And look what happens just a few months later, with the Capitol Hill riots in January.
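As a minimal sketch of the kind of re-ranking Frenkel describes, assuming a per-source reliability score: the source names, scores, and weighting formula below are invented for illustration, not Facebook's actual system or its real news-quality scale.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    engagement: float  # baseline ranking signal, e.g. predicted interactions

# Hypothetical reliability scores on a 0-1 scale; in the episode's example,
# the real signal came from repeated fact-checker flags.
RELIABILITY = {
    "vetted-news.example": 0.9,
    "junk-news.example": 0.2,
}

def rerank(posts: list[Post], weight: float = 1.0) -> list[Post]:
    """Uprank posts from reliable sources, downrank unreliable ones."""
    def score(post: Post) -> float:
        reliability = RELIABILITY.get(post.source, 0.5)  # unknown sources stay neutral
        # Scale engagement up for sources above 0.5, down for those below.
        return post.engagement * (1.0 + weight * (reliability - 0.5))
    return sorted(posts, key=score, reverse=True)

feed = [Post("junk-news.example", 10.0), Post("vetted-news.example", 8.0)]
print([p.source for p in rerank(feed)])
# ['vetted-news.example', 'junk-news.example']: the vetted source rises
# despite lower raw engagement.
```

The point the sketch makes concrete is that this is an editorial dial: raising the weight surfaces vetted sources, and setting it to zero restores pure engagement ranking, which is roughly the reversal described above.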
Yeah.
So I don't know. I'm curious if Facebook executives don't look at all that and say, why don't we, you know, figure out a version of that that's more permanent.
Yeah. This is why I think they'll eventually lose their Section 230 privileges. Because, at the end of the day, it's hard for me to look at this as a platform as much as I look at it as a media company now. Less social, more media. Do I have time to ask you one last question before you go?
Yeah, we have time for one more.
So there's not going to be a book about Facebook in the 2020 election, at least I don't think so. So what does that say?
Well, I'll say that the last chapter of our book does cover Facebook in the 2020 elections and everything that was important about it, which was that they ultimately solved the problems of 2016 in terms of foreign election interference, but didn't solve the problem of peer-to-peer misinformation, Americans spreading misinformation to other Americans. Everyone was telling them back in 2018 and 2019 that this was the coming problem, that the problem was this gray zone of, you know, how do you stop Americans from using their First Amendment rights to say things to other Americans that aren't true? And so, yes, you know, can you? Do you want to? Is it your responsibility, right? Like, those are all excellent questions. And that's really what we tried to cover in the last chapter of our book: how, again, this is a problem of Facebook's own creation, with algorithms and how the algorithms rank things and all that emotive content going to the top of the newsfeed. But what is Facebook's responsibility there?
Terrific. Well, let's wrap there.
Sheera, thank you so much for coming on. Always great to talk to you about these things. And I did enjoy the book. It was a really good read. And I appreciate you talking through some of the subject matter here, and also the motivation stuff. I feel like the more that we talk about the motivations, the less resonance some of the bad-faith attacks will have. And, you know, sometimes it's good for us to be introspective as well.
So, that will do it for us here. Thank you, Sheera, for joining. Thank you to Nate Gwattany for doing the editing, and to Red Circle for hosting and selling the ads.
And once again, to you, the listeners, appreciate you coming back every week.
It's great to have you here on the Big Technology Podcast.
We'll see you next Wednesday with another interview with either a tech insider or an outside agitator or a journalist.
Stay tuned for which one that will be.
And we'll see you then.
Thank you.