Big Technology Podcast - Carole Cadwalladr and Yael Eisenstat on The Criticism of Tech Criticism
Episode Date: March 31, 2021. Carole Cadwalladr and Yael Eisenstat are two of the most prominent Facebook critics worldwide. Cadwalladr is the journalist who broke open the Cambridge Analytica story for the Guardian and The Observer. Eisenstat, a former CIA officer, worked on election integrity inside Facebook for six months before quitting and speaking out against the company. The two Facebook critics join Big Technology Podcast to discuss some people’s disenchantment with tech criticism, the role of Facebook’s Oversight Board, and how the company might fix its product.
Transcript
Discussion (0)
Hey, Carole.
Hi, Alex.
Thanks for joining.
Hi, Yael.
Hey, how are you?
Good.
Excited to do this one.
All right, let me read the ad, and then we can just get right into it.
MediaOcean is for people who like to be efficient, like listening to this podcast at 2X speed.
So I'm talking super slow to make sure this ad is super clear.
If you work in advertising, go to Mediaocean.com/bigtech
to watch highlights from The Omnichannel Imperative.
That's Mediaocean.com/bigtech.
Hello and welcome to the Big Technology podcast, a show for cool-headed, nuanced conversation
of the tech world and beyond. Joining us today are two special guests.
Carole Cadwalladr is a Pulitzer Prize finalist, best known as the journalist who broke
open the Cambridge Analytica story for The Guardian and the Observer. If you don't remember
what happened or haven't heard of it, essentially it involved a political data company, Cambridge
Analytica, illicitly obtaining Facebook data on up to 87 million people, using it to build
psychological profiles of them, and then using that data to target ads for the Trump campaign
in 2016. It's Facebook's biggest data scandal ever,
maybe its biggest scandal ever, and it still influences much of the conversation about the
company today. Yael Eisenstat was Facebook's global head of elections integrity operations for
political ads. Before that, she spent 18 years moving about the globe as a CIA officer. Now she is
one of Facebook's most prominent critics. Both join the show in their capacity on the Real
Facebook Oversight Board, which is not actually the Facebook Oversight Board, which some people
also know as Facebook's Supreme Court, but rather an advocacy group co-opting the name. We'll get to
that and more shortly. Carole and Yael, welcome to the show. Hey, Alex, thanks so much for having us.
Yeah, this is going to be super fun. I've been following both of your work for a while, and I'm pretty
thrilled to get a chance to speak with you.
can we do some brief introductions first just so we can introduce you both to the audience
Okay, you're into it. All right, Carole, let's start with you. So, look, as you tell it,
you're writing features and then decided to become an investigative reporter, sinking your teeth
into the Cambridge Analytica story. So how does that happen? Well, I kept on waiting for the big
boys to turn up. I was sort of, I stumbled across this incredible story
and I didn't feel that I had the sort of skills or experience really to disentangle it myself.
And so I kept on thinking the crack team of like hard-knuck investigative journalists are going to show up
and they're going to rip it from my cold dead hands, but they're going to do a really good job of it.
And that'll be quite a relief and they never did.
So I just in my very, you know, I just sort of carried on essentially scratching away,
refusing to give it up for, I think it took 18 months from the very first story I did
to being able to publish the account of Christopher Wiley,
who was the sort of whistleblower who blew it all open.
So, yeah, I had to sort of, I mean, I slightly do myself down.
I had done investigative pieces before, but not in this way,
not in the kind of following a story for 18 months.
That was not something I'd ever, you know, envisaged
before. So in that way, I really did have to sort of step up to try and sort of
wrestle this story down, essentially. Yeah, and this is the biggest data scandal in
Facebook history, something like 80 million records of people were illicitly obtained and
then passed along for data targeting purposes. How do you come across that story?
Well, I mean, it's kind of interesting. I mean, and this is where I sort of think the insider
and outsider thing is interesting, because I was out, you know, I wasn't in Silicon Valley.
I wasn't a tech reporter. I was coming from a very different background. And throughout that
18 months, you know, I had any number of people, technologists, tech reporters, telling me that
it wasn't the story, that it was overblown, that it wasn't real, that everybody did it, that there
was nothing to see here, that it was a conspiracy theory, that it was snake oil, that
it was, you know, this idea of sort of brainwashing was ridiculous and it was based in a sort of ideological view.
And they just, I mean, apart from it, the biggest thing really was that I was just sitting in a different place.
I was in London and where actually we do have data protection laws.
And there was every evidence to suggest that what they'd done was illegal.
So, just, like, you know, screw the rest of it, or how much it worked and didn't.
There were just these very basic questions about the legality of what had gone on.
And then the second thing was that, again, not being in Silicon Valley, not in Washington,
where Cambridge Analytica was just being seen as another political consultancy,
which had oversold its services.
I was in London where the company had been based, and it was a military contractor.
And I just couldn't get my head around the fact that a military contractor, which was normally working in Iraq and Afghanistan, had turned its attention to elections and was using the same kind of methodology that had been employed by the British and American state and NATO and all the rest.
So it was really this just thing about having a different angle and approach, I think, on the story, which just did enable me to see it differently.
And for that reason, I just, you know, refused to be deflected, actually, by any number of people
telling me that I was misguided and wrong.
And that still goes on, actually.
I'm still misguided and wrong.
I mean, that's the kind of funny thing about it, but anyway.
Well, look, I mean, you know, I was here as a reporter in Silicon Valley, was reporting on
advertising in New York before.
And, I mean, I'll admit, like, my initial reaction was like, okay, so some more data leaked.
We'll get to that in a bit.
How interesting.
I'd love to hear about that, Alex.
I'm going to be honest.
I'll be honest about it.
I'm not going to even try to hide it.
We eventually realized it was bigger than that.
So good on you for pursuing it.
Yael, okay.
So your background is fascinating.
You worked as a CIA officer and then end up at Facebook taking on election integrity.
You were at the CIA for something like 18 years or something like that,
and your Facebook tenure was 1/36th of that.
You left after six months,
and you're now one of its boldest critics.
So what happened there?
So I'll try not to give the whole long history of it all.
But, yeah, so I'd spent most of my career in government, both CIA, then I was a diplomat,
and then actually at the White House.
And it's interesting, I was always a global thinker.
I was always working on issues abroad, on global security and policy issues.
And it was in 2015 that my entire lens shifted because I started
really thinking, and I know this sounds sort of cliche now, but really thinking that this
breakdown in discourse and this growing sort of, I'm not saying the hatred wasn't always there,
but this very public-facing hatred in the U.S. was becoming a bigger threat to our democracy
than any of these sort of overseas extremism issues I'd been working on.
So I started writing about it and speaking about it and digging in more and more, and realizing
more and more that social media was playing a pretty big role in it, and didn't really know yet exactly
how or exactly what that meant. Of course, Carole's reporting was one of the big things that
I was also reading. And eventually, basically, Facebook called. I'll be frank: to this day, I will never
understand why they actually hired me. I don't think it was a PR stunt, because they didn't make a big
PR splash about me. But they did hire me, and what I was supposed to do was head this new
team. I was supposed to get the authority to build the team, hire for it, and really figure out
how we, on the political advertising side, would, I mean, basically fix what had happened.
They hired me, they made me the offer one minute after that famous Senate hearing with Zuckerberg
ended. And, I mean, I was very clear.
I said, don't hire me if you don't mean it.
Like, I'm going to dig in.
I'm going to see how we got here and then figure out how we can make sure that Facebook doesn't continue to be manipulated to influence elections around the world.
And to wrap up the story, and then you can ask more questions later.
I mean, I was disempowered, if that's a word, from day two.
That's a word.
Yeah.
So you had a good orientation and then.
Yeah, orientation was, you know, the sort of, like, Disneyland version
of Facebook. And then on day two my manager told me she was changing my title. The offer letter
says head of global elections integrity operations; the new title is manager. And she said she was
rethinking my entire role. On day two. On day two. So I hadn't made any mistakes yet. I wasn't,
I don't think I made any mistakes during orientation. This happened on day two, and that's a fundamental
point to my story, because you know the Facebook PR machine. Like, people at Facebook love to say,
but that wasn't a role. She wasn't in charge of this or she didn't do that. Yeah, you're right.
I didn't. But that's what I was hired to do. And then with, I would say, a pretty long track record
of fighting to protect democracy my entire life. And I mean, I still spent time working on the
issue, digging in, learning, proposing ideas, but I never was given a seat at the table.
Yeah. I'll say this. I think there's definitely philosophically an interest inside the top of
Facebook to bring people who are not techno-optimists into the company and to inject their
viewpoints into the bloodstream. But I think it's pretty clear from what you ran into that
the rank and file are incentivized to optimize for engagement.
and someone in your role doesn't exactly jibe with that incentive.
And so we'll talk about that in the second half.
I have a lot to ask you about that.
And, you know, in terms of like the actual product changes that Facebook could make
to help ameliorate some of its problems and why there's no will to do that.
But let's leave that for now.
I actually want to start with the criticism of tech criticism because I think it's important
to address.
There's a post going around Medium by someone
named Lee Vinsel. I hadn't heard of him before, but I thought it was an interesting post,
even though I don't agree with everything. But I thought his argument was interesting. He argues
that critics of tech are actually giving these tech companies too much credit. He says,
essentially, that they rewrite their press releases claiming these bold capabilities and then just
make them dystopian-sounding without spending enough time questioning those claims themselves.
And Carole, I think kind of what struck me in your introductory comments, you know,
where, like, I think you mentioned, like, well, it doesn't matter if this stuff works, it was illegal.
And so I actually would like you to comment on, like, whether, you know... There's been a ton of criticism,
especially the Cambridge Analytica stuff, talking about how these companies are, like, manipulation
machines. And, yeah, to just ask you outright: do you think that the tech criticism of companies like
Facebook gives them too much credit, for the value of their data, for their capabilities?
And should we have a broader discussion about whether this stuff is actually capable of manipulating us the way that the narrative says it is?
Well, I think, I mean, it's kind of, it's really interesting because that goes to the heart of so many questions, I think, doesn't it?
Because part of the problem is that because they don't let anybody in, so because it is a black box, because this is proprietary data and methodologies and all the rest of it, we're not able to send in independent researchers to, like, make those assessments.
And that's one of the sort of huge problems at play.
But I think, you know, I think it's a stretch to suggest that Facebook's targeting technology doesn't work,
considering the, you know, that advertising is whatever it is, a trillion dollar industry across the world.
And, you know, I personally find it a stretch to believe that advertising doesn't work,
given the huge amounts of money that go into it.
And I find it a stretch to think that the more information you have about individuals,
and the more precisely you are able to target them,
that that doesn't have an impact.
And I suppose, you know, one of the things is that we're all people,
we're all individuals also in this landscape.
And a lot of the targeting is quite depressing.
I saw one yesterday, which is, I live in the London borough of Islington, and I had one down the side of my page, which was targeting,
it was sort of, like, coffins for the over-50s living in Islington.
And I thought, that's a bit depressing, but they have got some bits there, demographically and geographically, they've got me pegged.
So, I mean, I don't know.
I don't feel I'm the greatest expert on this,
but I, you know, I can't see how we can't take them seriously.
There were some folks who basically took the Cambridge Analytica story
and spun it up into this.
And actually, I'm curious if you think there's legitimacy to this,
into this idea that, you know, Facebook let go of these 80 million records
and they basically contained the key to manipulating people
and to voting for Trump, and that's why Trump won.
And we've talked a little... I mean, it was old data, right?
It was a couple of years old, and we talked a little bit about, like,
the half-life of data, and how it becomes less effective the longer it's there.
So what do you think of that narrative?
I mean, I've been skeptical of it, but I'm willing to hear the counter argument if it exists.
Well, it's really interesting, because, you know,
I think we were just talking about Facebook
there generally. But in terms of Cambridge Analytica, I mean, as I say, I actually, for the first
18 months, more or less of my reporting, I never used the word psychographics, for example,
I really steered clear of the kind of like larger claims about whether or not that worked in the
political sense and what impact it had in terms of electing Trump. Because again, you don't know.
And it's, you know, there is no control experiment.
So, so and that was where, and that I think particularly because that was where the skepticism lay,
that I very much chose not to ground my reporting in that.
And it was much more about the responsibility of the company,
the fact that the data was open to be abused,
and the fact that this company, I mean, I think that for me was probably the most troubling
thing was that, you know, the company, that the data had come from Facebook, and then
Facebook was also then being used by the same company to target people. And that was this
sort of, it was this two-fold process that Facebook was directly implicated in. It hadn't
kept people's data secure, and then it allowed itself to be used for targeting purposes.
And so again, it's like for me it was never about, you know, there was no evidence around the
efficacy. I mean, there was a very interesting story, which I don't know if you saw, which was
by the British Channel, Channel 4 News, ahead of the 2020 US election, which is where they'd
got the entire database from the RNC and went round looking at the targets of voter suppression
and how they had been specifically targeted by Cambridge Analytica using Facebook's
own tools to suppress the black vote. And there was some very compelling evidence around
that. So part of my suspicion always around Cambridge Analytica was that it didn't have to be
that sophisticated to be effective. So things like voter suppression,
for example, doesn't take a highly sophisticated approach. It's just about scaring people,
deterring them. And that's an easier thing, I think, than the idea of sort of, like, you're
tweaking this and that lever in their brains. These are some sorts of blunt tools which, when you have
this amount of data, you're able to deploy. Right. Yeah. What do you think about this?
I mean, we've spoken a little bit offline about it, but I'm just kind of curious about
there's a growing line of criticism of tech criticism and what do you make of it?
Yeah.
I actually think it's an important topic.
I'm not going to talk about any individual's work or any of like the actual details of the piece.
I will say on a higher level, what concerns me about the piece is it lumps every single
person into one general bucket of tech critic.
And then, yes, he goes into more detail in the piece.
But each so-called tech critic comes with their own set of experiences and background.
And I think each one should be evaluated on their own merit.
And the reason I say that is, it serves a certain talking point to push this narrative that tech critics are just, whether it's, you know... Actually, let me go to when we first launched the Real Facebook Oversight
Board. Somebody who's pretty prominent in this world made some comment about how we were all just
this or we were all just that or we were all self-promoting or selling books. And I think they said
that before they saw who the members of the board were going to be. And then you look at the
members of the board. And it's like some very serious academics, very serious journalists. Everybody
has a different perspective. And lumping everybody into this category really serves a
higher purpose of the tech industry being able to say, none of these people really understand
it. None of these people really get it. It's too hard, or this, that, or the other. I recognize,
really, there's some people who, yes, give more nuance than others. And there's some people who
dig deeper than others. I'd just be wary of even the term "techlash." It's an intentional
term to make it sound like those of us who criticize some of the actions that are happening
in the tech industry are doing it just out of a desire to criticize.
Like from my perspective, I am always careful to say, I'm not a data analyst.
I am not a deep academic.
I am speaking to the experiences I had there, what I saw.
And to me, I look at the very obvious political decisions and business decisions that are
made at a company like Facebook that are antithetical to actual open democratic norms and
values. And so I just, I would caution, when people too easily lump all tech critics into some
large bucket: what's behind saying something like that? Is it because you'd rather silence
some of these so-called tech critics who are making very well-reasoned, well-researched points
from their own experience?
Yeah, I think it's hard to separate some of the overreaches of the criticism from the actual
substance of it, but it's super important to do it, because it's very easy to be like,
okay, well, you know, there's something that I don't agree with.
This whole line of thinking might be wrong.
And I do think there's obviously some real validity to the tech criticism that we're seeing today.
And so I think, yeah, part of like our discussions like this is to try to parse that out.
So, you know, last question on this topic, how much is Facebook responsible for the rise of authoritarianism across the globe?
Because there does seem to be a meme.
We've talked about it on this show before, but it does seem to be a meme that just blames Facebook for everything that's wrong in the world.
And of course, like authoritarian leaders have used Facebook to rally support for themselves and they've been very effective at communicating to their supporters through the platform.
But there's also other factors, changing economies, the rise of globalization.
So both of you have looked at this very closely.
Where's the link to Facebook there and how important of a role does it play in this whole movement that we've seen across the globe?
Carole, I'll let you go first.
I mean, it's interesting, isn't it?
Because as you say, it's like correlation is not causation.
These things are, you know, the troublesome nature and effects of social media that we're witnessing are existing at the same time as these, you know, populist authoritarians have emerged. And we do see an alignment. So in the countries, the leaders who've been most adept at using social media, and particularly at using it to sort of fan, particularly, fear. I mean, that's one of the things, I think, which is that, you know, this sort of fear-based messaging.
Oh, it's very good for that.
Sort of populist tropes.
Sorry?
It's very effective at that.
Yeah.
It's very effective.
It's very effective, exactly.
And that's what I was sort of saying about the blunt tool.
It doesn't have to be a hugely sophisticated way.
And I think, you know, one of the things, so I think there's a lot of questions.
There's a lot of unknowns.
These are sort of historical forces.
But I think one of the things we have seen, which has been very troubling, is a sort of alignment
between these leaders and between the Silicon Valley companies and particularly Facebook.
And we saw that very clearly in the relationship between Trump and Zuckerberg
and the inability and unwillingness and reluctance of Facebook to upset its
critics in the ruling party.
And we see that very directly, for example, in India,
where the relationship between Facebook and the ruling
party has been sort of unpicked, particularly by the Wall Street Journal. And, you know, you see
it's an alignment of interests and an alignment in power. And that, I think, is authoritarian and is
deeply troubling. And, you know, it's receded slightly. We don't know how long in the United
States, but it hasn't in other parts of the world. And, you know, we do really have to...
I really, you know, very, very strongly feel that the United States has a real duty to other countries which are still facing this and have absolutely no traction or access or means of leverage on these American-based companies.
Yeah, I do think about how difficult it's been for American critics to break through with the folks inside Facebook and then you just expand it to the rest of the globe.
I mean, there's been this back and forth with your parliament and Facebook about trying to get Zuckerberg
to testify, and it just, it does not seem to be in the cards at the moment. No, I mean, exactly, and I
think that's, I think it's really funny, because, you know, in that way,
even though we, you know, we share much of the same culture and the same language
and much of the same media, even just being in Britain puts me in a position of powerlessness
with regards to Facebook. So, for example, Facebook never, ever talks to the Guardian
and the Observer. Never ever. You know, did Mark Zuckerberg ever do his apology to us?
Totally ignored, because it's sort of, like, it's just not interested. So, you know, it goes,
it sucks up to power. And, you know, one of the things I very strongly think about the sort of
the story that I did about Cambridge Analytica is that that broke through, because we
partnered with the New York Times and it was on the front page of the New York Times and that made
it unignorable in a way which, if it had just been in the
foreign press, it could have ignored. And so I'm always sort of, you know, very grateful in that
way. But it's all, but it's, it's really, as I sort of say, where I sort of feel there is this
sort of huge duty and responsibility of the American media to step up and help the rest of the
world, really.
Should I take a crack at the answer as well? Yeah, go for it, go for it.
So I'm going to start with actually the idea of: is social media the
one that is responsible?
And I might broaden it from authoritarianism to extremism and radicalization.
Yeah, and destabilization of society.
So here's what's interesting.
I know that we're in a world now, created by social media, where everything has to be a soundbite,
everything has to be a quick take and a quick reaction.
And it's interesting. I have never, and I'm pretty sure most people haven't, and I won't speak for
everyone, I have never said that every problem that we have in society is 100% the fault of
Facebook. And I bring this up for a reason. It is one of the quickest things people love to do
to discredit you is say, oh, but what about this? Oh, but what about cable news? Or, oh,
but what about this? And it's all part of a broader picture. You know, when I put out, I hate to,
like, talk about my own work, but when I put out my TED Talk, which is very much... Yeah, this is what
we're here for. Yeah, go ahead. ...about how Facebook is contributing to radicalizing people,
based on having worked on counter extremism issues my entire career. The first reaction from so
many people were, oh, well, what about Fox News? Or, well, what about this, that, and the other?
So let's be clear. I am stating emphatically, no, I do not think Facebook is solely responsible
for all of our woes in the world. That said, we have an industry which, due to the
permissive environment in the United States of "let's let the internet flourish," was given free
rein to scale as recklessly and quickly as it wanted to, to dominate the entire global
landscape of what one might call discourse or the public square or whatever terminology you want
to use, without any checks and balances. And as a result, it is not in any way
liable in the way that traditional media is for certain content or in any way responsible for
the actual consequences, again, not of the speech that is posted on their platform, but of their
tools and what they decide to do with that speech. And so without, I mean, without going into
our long explanation of this, you have platforms that are still to this day as much as Facebook
likes to pretend it's not true, predicated on keeping people engaged, growing their daily
monthly active users, and making sure that people continue to remain on the platform so they can
continue to hoover up our data, so they continue to refine their targeting practices, so they can
sell this to advertisers. Whether or not their targeting tools are perfect, and that's a whole
other conversation, but it's still what they're selling to advertisers. They're selling this
ability. And so there's so many practices between how their recommendation engines are
steering people, and they are steering people towards more and more extreme content about how
you might get a pop-up recommending that you join this group and the next thing you know in that
group, you're meeting other people who share white supremacist ideas, and then you're hatching a plan
and going off to Oakland to murder an officer, which happened.
Those two men met on Facebook.
And so there's so many things happening within these platforms that we are not allowed to touch
because it's being mischaracterized as all being about free speech.
And so I'm not saying that Facebook, listen, anger, hatred, polarization, divisive content,
political rhetoric, all of that has always existed.
But now we have an entirely different environment where there are
cheap tools to engage in information warfare and propaganda, where there is zero
accountability for how your business decisions, tools, and platform are being used to help
spread disinformation, distrust, to actually connect... I mean, even if you connect a predator to a young
girl, that company will claim that you can't take that to court because of Section 230. We're not
liable, even if it's their own tools that recommended that that person connect with that
young girl.
Like, these are all the things that get lost when we start using excuses like, oh, but it's always
been bad.
Or, oh, but technologies always come and disrupt what happened before.
And so, sorry, it was a little bit high level, but it's very frustrating to me when people
say, oh, but Facebook isn't the only reason people are acting the way they are.
Right.
It's a contributing factor, but not the whole deal.
I mean, I think right now it's the biggest contributing factor.
But yes, it's a contributing factor.
So I'd like to, there are many discussions that just harp on the problems.
I'd like to talk a little bit about the solutions.
Let's do that when we come back right after the break.
Hey, everyone.
Let me tell you about the Hustle Daily Show, a podcast filled with business, tech news,
and original stories to keep you in the loop on what's trending.
More than 2 million professionals read The Hustle's daily email for its original,
irreverent and informative takes on business and tech news.
Now, they have a daily podcast called The Hustle Daily Show,
where their team of writers break down the biggest business headlines
in 15 minutes or less and explain why you should care about them.
So, search for The Hustle Daily Show and your favorite podcast app,
like the one you're using right now.
And we're back here on the second half of the Big Technology Podcast
with Carole Cadwalladr and Yael Eisenstat.
We were talking in the beginning.
about some of the problems with social media. Let's talk about some of the solutions. So
you're both here in your capacity on the Real Facebook Oversight Board. But there's also this
thing called the Facebook Oversight Board, which is essentially a board that Facebook has created
to review some of the content moderation decisions that they make. You know, when this first came
out, I thought, okay, maybe this is good. Refer to, you know, refer some of these tricky content
moderation decisions over to third parties. Now it's not all the power concentrated in one
platform, but rather dispersed, you know, to the quote-unquote public, or as good of a proxy
as you can make, you know, for the public. Is there an issue with that? What's the, there's obviously
some criticism of it. You've created an organization that's meant to counter that, or at least,
you know, uses the name as a criticism of Facebook. So, so what's going on there? And yeah,
Let's just get into it.
Like, what's the issue with this oversight board?
Do you want to kick off, Yael?
Okay.
Do you want me to take the first swing?
I'll take the first swing.
And then, uh, so listen, as a high concept, the idea of an oversight board, which is an external
group of experts who have the ability to actually look under the hood and really think
through speech issues, in theory, it's a really good idea.
However, to in any way claim that pulling together a group of experts that whether we like it or not were handpicked by Facebook and are essentially paid by Facebook.
And then to give them this very limited remit of you can only really overrule us on content that we took down.
Right.
Not the stuff that we left up.
I would argue the more dangerous stuff is the stuff that is still up.
And that the idea that this group of people who are not accountable to any of us, they weren't chosen by the public, they don't, you know, it's... I really wanted to believe in the idea in theory, but it really is passing the buck on accountability and responsibility to this external board, so that therefore Mark Zuckerberg can say, it wasn't me, they made this decision. And, A, it's passing responsibility. B, it doesn't
address any of the issues that I really care about, which are the systemic underlying issues
of how the platform is monetized, what that is doing to our cognitive capabilities, what
all of the tools, the targeting tools, the algorithms, the recommendation engines, none of that
is in the oversight board's purview. So, bottom line: it's very interesting, but do not for a second
confuse it with some sort of true governing body that is really tackling the issues for which
Facebook has for so long evaded any responsibility — the real decisions they're making.
That's my quick take, but I'm sure Carole has more.
Well, I want to hear from Carole on this, but just to sort of center the
question to you, Carole — and Yael, feel free to weigh in on this.
But like, who's going to make decisions about content on Facebook?
So you have, I guess you have three entities, right?
One is you can have Facebook make those, you know, decisions itself, you know, take, quote, unquote, the responsibility.
You know, two, you can have the government on the total other end.
I don't know if we want the government coming in and making speech decisions on a platform.
So it seems to me that, you know, assembling some group of people from the public to make these decisions is probably, you know, the best way to do it, even though it's limited.
And I mean, I guess you could say maybe we should let them rule on stuff that Facebook leaves
up — like if Facebook leaves up some posts and people think they should be taken down,
refer it to the board. Apparently that's coming.
At least that's what they've said.
So I guess, yeah, I'm still trying to dig into it.
And obviously, like, you know, if it's limited in scope, then Facebook can say, well,
we solve this problem by handing stuff to this accountability board.
Don't bother us about content moderation decisions anymore.
So is that the main criticism?
Because I think directionally, it seems to make sense.
I mean, it's just that, you know, Facebook is desperate to figure out how to, you know, do this sort of self-regulation.
And I really do think that what we see here is this sort of accountability theatre, in that it sets up this sort of fake Supreme Court, which is using kind of fake law, essentially, to use the sort of tropes and mechanisms
of a nation state, which it's applying to a private company.
And, you know, at the same time, we've got no idea about how Facebook actually
does content moderation, because it doesn't tell you; it won't explain its metrics.
It won't explain why it took down Trump when it did, but why it didn't before.
It's very unclear.
And so instead of just cleaning
up its own house and doing more, like Twitter has, of, you know, showing its workings —
at least having some level of transparency about this — it's just outsourced it to a politically
convenient body that is going to take the flak one way or the other. You know,
there's a highly politically contentious decision to be made about Trump. And Facebook has
incredibly conveniently just got that off its plate; it's not going to be facing the music —
the oversight board is. And what is deeply troubling to me is the way that the
oversight board is being taken seriously. It is being considered as a sort of legitimate
Supreme Court. There's legal scholars who are endlessly writing about it. And actually,
Alex, I didn't realize that you were the host today, but
I saw your post yesterday about how the big tech companies essentially capture the
think tanks, and the incredible soft power that they exert, and the impact it has.
And, you know, I think that is, we see that on so many levels and you look at the amount of money
that Facebook has to spend on lobbying and PR and you see the way the machine works and you see
the way that it plays favourites with reporters and you see the way that it gives access to data
to academic institutions, and it's incredibly hard to counter that. There is no
well-funded body that is able to be the Facebook counter-disinformation unit, essentially, which is
sort of what we need — there's a sort of democratic need for that. And the press, of course, is
able to do some of that. But, you know, there's so much of it. We've all seen
this amazing reporting, particularly in the last 18
months — incredible tech reporting coming out of the States. But that's still just the tip
of the iceberg. We know that there are so many harms. There are so
many problems. And it makes some difference, but we sort of see the
ways that it's actually very hard to land a punch on these tech companies.
So in that context, and particularly given how hard journalists find it getting
answers out of Facebook, the idea that this, as I sort of say, faux judicial
body is being taken as a legitimate authority is incredibly troubling.
And I really wish, you know, there's some amazing individuals on the oversight board with brilliant reputations, you know, absolutely experts in their fields.
I cannot understand why they would allow their reputations to be captured and harnessed by Facebook for its own political purposes.
And I know that they've gone into it with the best of intentions, but I really do worry about it, I have to say.
Can I add one, actually two points to that since you did want to talk about solutions?
Sure.
Alex, you're right.
You're right that it sounds, in a way, better than letting Facebook itself make all these decisions.
But let's also be frank.
A lot of this is because the U.S. government has not caught up to figuring out what the guardrails should be in place around companies like Facebook.
And so we've continued to allow Facebook to self-regulate.
And so that's part of the problem, too, to just be really clear.
There are three major buckets.
Obviously, I'm going to be very clear:
there's no one magical fix that's going to suddenly make a healthy information ecosystem
where trust is restored and truth wins out.
Well, are you going to get into the product stuff because I do want to cover that in a bit?
Well, I was just going to say, though — between
data privacy regulation, what to do about antitrust, and how to actually decide what these platforms
are and are not accountable for — until the U.S. government figures out some of that,
Facebook gets to continue to claim that they have this Supreme Court that's making these grand
decisions. And you're right, I don't want Mark Zuckerberg to be the one to make a decision
about what is truth and what is fiction. But we've also given him free unchecked power to do so.
So when he claims this is a Supreme Court, you see all these brilliant academics and journalists like scrambling to plead their case in front of this oversight board about how they should handle their decision about Trump.
And it's just fascinating to me that now all these people are giving all this legitimacy to this board, which will make one of the most consequential decisions for the future of our democracy.
And they're not accountable to me.
They are making one of the most consequential decisions.
And so that's just very concerning to me.
Yeah.
Well, I'm trying to balance the two things I'm hearing.
One is that this board is a red herring, you know, that Facebook's trying to get us to look at so we don't pay attention to the broader systemic problems in the platform.
Okay, that one I get.
And then the other one is, you know, that the decision this board makes on Trump is going to be really important.
So how do we balance that?
Because, you know, is there a way, I mean, who do we want to make these decisions in the end?
Like, is there a way where a board like this exists and we're happy that the government and Facebook isn't making these decisions?
I'll say just really quickly about this particular decision because you're right.
That is where we are now.
And that's super troubling.
For context, the board is now deliberating on whether Trump should be permanently banned or indefinitely banned.
Right.
But they're deliberating based on two posts that Facebook
decided to flag, which weren't even the most troubling posts. And by the way, all of Trump's
older posts still remain up on the Facebook platform, and you can still engage with them.
That in and of itself is troubling. I'm going to just reference someone else.
I think one of the best letters and best cases about what should be done with the oversight
board that I saw was the one from Jameel Jaffer. He wrote a brilliant letter to the
oversight board about why they shouldn't actually rule on this at all, and his recommendations
for how they should go back to Facebook
and say: before we rule on this,
here are the things we actually want to see you do.
And so I recommend people look that one up.
I thought it was brilliant.
But Carol, I'll pass this over to you.
No, I mean, I totally agree.
And then the thing which is sort of even more troubling —
and that's why the legitimacy of this board
is so important — is that the people I've spoken to
who understand the makeup of the board probably the best
think that they are going to overturn the decision.
So the sense that I'm getting is that they are going to rubber-stamp Trump.
You know, they are very free speech protective.
Meaning, yeah, they're going to let Trump back on.
They're going to let Trump back on.
Yeah.
And, you know, that in many ways solves Facebook's problems, doesn't it?
It wasn't their decision — it was their independent Supreme Court who made that decision. We banned him,
you know. Nick Clegg, the ex-British deputy prime minister who's head of comms and policy
at Facebook — I just found it so bizarre. He was all over Twitter sort of saying, yes,
we've banned him, hurrah, and then we've referred him to the oversight board. So it's kind of like —
It's just, I think, you know, in a sense, Facebook have got a win either way.
But, you know, the prospects of them having a win and Trump being back on Facebook is really deeply problematic.
And that's why I find the coverage to date of the board, you know, is essentially paving the way for that.
Yeah.
And, yeah, and that's where I think.
Yeah, this is why we're having this conversation.
That's why I was excited to have you on the show.
Of course.
What do you think?
What's your view?
I think I personally was thinking before Facebook announced this oversight board
that it would be a good idea to have third parties make these decisions.
But I do find the criticism that it's a red herring and sort of gets Facebook out of discussing some of the harder problems.
I find that very legitimate.
And yeah, I'm concerned about that.
But I think the idea of the board in principle is good.
It's just a matter of finding a way to do it.
Because I don't want Facebook making these choices.
I don't want the government making these choices.
So the public to me seems to be the best way to do it.
They actually once turned it over to all the users to make decisions on content.
And then something like less than half a percent of people actually voted on this stuff.
So I don't know.
We'll see.
And, you know, I don't have this,
like, firm stance on it one way or another. That's why I think having conversations like this, and
thinking about the legitimacy of the board and what Facebook's
actually trying to do with it, are important. So, um, yeah. And it's just that, you know,
the whole thing with Trump is that they got so bold and brave, didn't they,
when he was sort of kicked out of office? Well, they did it the day that Biden's election was
certified. So that was easy. But yet they haven't done this with Modi, have they? They haven't
done it with Bolsonaro, yet we know they are also posting, you know, similarly troubling
content. So, I mean — sorry,
that's not coming out really. It's that, you know,
they're bullies in that way: they just sort of suck up to power, essentially,
and suddenly become brave when the person is out of power. And, you know,
going back to this question about the relationship between authoritarianism and these social
media platforms, I do think it's, you know, it's there. It's this like consolidation of power
amongst this sort of tight-knit group, essentially. And, you know, it feels like it's only,
you know, the United States has caught a break. But, you know, we know that nonetheless the platform
is still being used in these very troubling ways to spread conspiracy theories and so much more.
So I do feel that there is a sort of finite amount of time, isn't there, to try and take action to fix these problems now?
Yeah, I suppose so.
I want to talk a little bit about your organization.
So I have two questions about it.
First of all, it's called the Real Oversight Board.
I kind of find the name a little confusing.
If it's not the real oversight board, shouldn't you call it, like, the Rogue Oversight Board, or
something more true to what it actually does? And then, you know, obviously you're not going to
be ruling on content. So what is the idea with the organization? Yeah. Well, I mean,
the whole thing about it was that it was launched in a huge hurry. So I think from sort of conception
to launch was just a few weeks. And it was really just an emergency response
to increasing alarm and horror at the fact that Facebook was
refusing to take action over Trump's, you know, blatant use of Facebook to subvert the election.
And, you know, what we could see was that there were all, you know,
these brilliant academics and civil rights leaders making this case separately.
But it was this idea of trying to bring them together to bring sort of coherence and weight
and authority, really, to that — to put pressure
on Facebook. And, you know, to some degree,
the board agreed upon three demands, and Facebook conceded to two of them. So I do think —
yeah, pretty effective. Well, I mean, you don't know. You can never say exactly why,
but at some level, you know, it's, as I sort of said, the
civil rights leaders with this sort of moral authority, and then these brilliant
scholars in the field, people like Yael, who's got this amazing, you know, actual hands-on
experience and bringing them all together to raise the profile of the, you know, the problems
and the harms. And then going forward, I mean, it's very much
up in the air, but I think there's this idea of trying to do a sort of shadow-government
structure. So where the oversight board is providing oversight in this very, very narrow way, around
very, very narrow topics, and won't consider a whole other spectrum of Facebook harms,
that's where we see that the Real Facebook Oversight Board can potentially play a role
in modelling what independent oversight could look like. And I think the other thing about it,
which was very much front and centre at launch,
which you pick up upon with the sort of slightly silly name,
which was just that it was just like being a pain,
being a thorn in Facebook's side,
was very much part of the thinking.
And considering, like, all the stuff that they tried to do
in trying to shut us down,
I think in that respect it was actually quite effective.
And the confusing name — you know,
it's not accidental at
all. I mean, the point was that Facebook hadn't launched the oversight board.
They'd announced it, and announced it, and announced it, and then said it wasn't even
going to launch before the election. And so we thought, you know, sod it, we'll appropriate it,
we'll subvert it. And in that, I think, you know, it might be slightly silly, but it sort of
works, I think. Yeah, fascinating. So we've talked a little bit about how the board could have been
this red herring, getting people to, you know,
avert their attention from Facebook's biggest problems, and this quote-unquote real oversight board can
focus people's attention back on some of those broader structural issues. So let's spend our last
10 minutes together talking about some of the broader structural issues. And I want to toss it back
to Yael, because, you know, you were on the inside. You know, they brought you in, like we spoke
about at the top to look at election integrity. And then I think what you ran into was the sort of
buzzsaw of Facebook engagement incentives where people are essentially compensated and
evaluated based off of how much they grow the product and grow engagement.
And Facebook has said that they've tried to change this internally, but I personally haven't seen
it happen. So just take us on the inside.
You know, let us come along with you on that journey of what it was like to enter the halls
of Facebook Inc.
and what you saw when you were trying to solve this problem.
Sure, and I'll give you two concrete examples, but let me start by saying they love to counteract this talking point about how they're all about engagement.
And I would just keep asking, then how come on your quarterly shareholder reports you continue to use daily, monthly active users as your metric of success?
As long as that's your metric that you're reporting to your shareholders, then I'm going to continue to believe that's your priority.
And inside, yeah, product managers are judged by how they can grow.
And if they can't grow the product, they get bad reviews or they're put on another product.
But yeah, let's hear what you saw.
So, I mean, clearly I was part of what you would consider more the risk side of things, right?
So that already, in and of itself — we're not revenue generators.
We're considered the cost centers, essentially: the ones who are trying to possibly make you slow down, who are highlighting things that are going to lead to problems in the future.
That is usually never valued as much inside a company that obviously has to keep growing and
keep monetizing, especially the way they do.
I'll give you two quick examples.
I've written about it quite a bit as well.
The very first thing — one of the first questions I asked was in one of the internal "tribes,"
which is what they call them at Facebook. I just asked it.
Doesn't sound like a cult at all.
No.
I just asked.
So I don't remember exactly how I worded it,
but — this was in 2018; I was preparing for the U.S. midterm election —
I just sort of asked: why are we not using any sort of fact-checking program that we're using
in organic content for political ads? Clearly, we have the technical capability to have some
sort of program to do this. And we're actually taking money for ads. So I would assume that
the bar is even higher, because this is actually where we're profiting. And we're adding
these disclaimers — the "Paid for by" labels — which make it appear like
we've already validated these ads. So it might give them even more credibility. And your
average Facebook user doesn't necessarily know the difference between content and ads. They just
see it in their feed. So I was asking all these questions. Why aren't we at least making sure that if we're
taking money for a political ad, that that ad is not engaging in blatant disinformation? And it was
really interesting because a bunch of the PMs and the different people on our teams all started
chiming in on this, whether in the tribe itself or in conversation with me. And they got excited.
They knew I was hired to head this team. And they knew I was asking the question. And it was
interesting. It was like, yes, let's do this. And they started putting together plans on what
they could do. And then suddenly the conversation went silent. I never heard anything about it again.
And, you know, come to learn later that, of course, Mark Zuckerberg had already decided he would
never fact-check Trump, so he'd already decided that they would never do this. But that's exactly
what I was hired to do. And an even better example: when the civil rights audit was going on at
Facebook — so, we knew it was going on — my team had coordinated with a bunch of other
teams to put together a plan that was technically actually somewhat easy, just to ensure that no
ads that made it through our system engaged in blatant
voter suppression — meaning an ad couldn't lie about
voting procedures, couldn't lie about where to vote, the
date, all these different categories — which we were
doing for organic content, apparently. So we worked with the teams who had
built the ML systems to start screening content for voter
suppression and just decided that we would run the ads through
the same systems. That was rejected too. And it was rejected
on all sorts of weird narratives about, well, that won't scale globally. And I was like, of course it
won't scale globally. Every single election has its own norms, values, political realities, laws in those
countries. Like, I get that Facebook wants everything to scale globally, but do not say you are trying
to protect actual elections if you're not. And I said one ad, one ad that gets through that
engages in voter suppression is more dangerous than all of these other things we're looking at right now.
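[An aside for readers: the check Yael describes — routing paid ads through the same screening model already used for organic content, with an arguably stricter bar because the platform profits from ads — can be sketched roughly like this. Everything here is a hypothetical illustration: the function names, phrase list, and threshold are stand-ins, not Facebook's actual systems.]

```python
# Hypothetical sketch: reuse an organic-content classifier for paid ads.
# The "model" is a stub; in Yael's account, real ML systems for flagging
# voter-suppression content already existed for organic posts.

def voter_suppression_score(text: str) -> float:
    """Stand-in for the existing organic-content model (hypothetical)."""
    red_flags = ["vote by text", "election moved", "polls closed early"]
    hits = sum(1 for phrase in red_flags if phrase in text.lower())
    return min(1.0, hits / len(red_flags))

def review_ad(ad_text: str, threshold: float = 0.3) -> str:
    """Route a paid ad through the same screen used for organic posts."""
    score = voter_suppression_score(ad_text)
    # Paid content arguably merits a stricter bar than organic posts,
    # since the platform profits from it directly.
    return "reject" if score >= threshold else "approve"

print(review_ad("Breaking: all polls closed early today"))    # reject
print(review_ad("Meet the candidate this Saturday at noon"))  # approve
```

The design point is the one Yael makes: the classifier already existed for organic content, so screening ads was mostly a routing decision, not a new technical capability.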
And it was just completely rejected, and that was the end of it.
And I would say once again, it's because they knew that that would mean that ads from
certain politicians wouldn't be approved.
And that would be very politically difficult for them.
So there are things that could have really been done quite easily that they refuse to.
And then the other thing is anything you do that's going to build friction into the system
is going to be looked at.
Right?
Because today we want everything to be first, fast, and free.
And I would argue, if you are trying to figure out how to protect democratic debate, and try to get to a point where blatant disinformation is not at the top of everybody's feed and is not beating out actual, more wonky, fact-based content, you might have to build even the tiniest bit of friction into your system. And so it's a question of what you value more. And right now, Facebook continues to value growth over protecting democratic
discourse and protecting trust in information. There are things that could happen, and they're
refusing to do it. Yeah. So I watched your TED Talk, Yael. And let's just end with
this question. You said lies are more engaging online than truth; salaciousness beats out
reasoning in a world optimized for virality. And this will just bring us full circle.
We have all these conversations about content moderation. But it does seem that as long as the
platforms are optimized for virality and
engagement, we're just going to see more of this — more of the bad stuff, more anger,
more outrage, more division, more hate. And I'm curious if you think we need to
focus more on stuff like the share button, right? You just click share, and
someone else's stuff is immediately in front of everybody that follows you. And it's not a question
of should you take it down or not. It's a question of, you know, should you slow it down?
And just, yeah, now that I'm saying it — the oversight board focused entirely on content moderation.
And you're right, it's a red herring; we ignore this stuff.
So bring us home here.
Right.
So content moderation is hard.
There's no question.
And there's no super easy, clear cut answers on content moderation.
But the bigger issue to me is, again, the tools and it's frictionless virality, not just virality.
I would be really interested in seeing some of the outcomes of some of the data behind when Twitter tried to introduce some friction into how you could engage with certain tweets.
I think there's an interesting experiment there.
I'd love to see.
I have that data.
Yeah.
Oh, great.
So I'll just share it quickly.
I know we're running out of time.
But they put essentially a speed bump in front of the retweet where instead of being able to natively retweet something, they took you to a different screen first where you had to add your own thoughts.
Then you decided: add your own thoughts, or, you know, just retweet as you would.
That caused retweets to drop somewhere around 20%.
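[An aside for readers: the "speed bump" Alex describes — interrupting a one-tap reshare with a prompt screen before the post goes out — amounts to a tiny bit of state in the share flow. This is a hypothetical sketch of the idea, not Twitter's actual implementation; all names here are illustrative.]

```python
# Hypothetical sketch of a reshare "speed bump": before a native retweet
# goes through, the user is routed to a prompt screen and asked to add
# their own thoughts (a quote post) or confirm the plain reshare.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ShareRequest:
    post_id: str
    user_comment: Optional[str] = None  # filled in on the prompt screen

def handle_share(req: ShareRequest, confirmed: bool = False) -> str:
    """One share attempt passing through the speed bump."""
    if not confirmed and req.user_comment is None:
        # Friction step: interrupt the one-tap share with a prompt screen.
        return "show_prompt"  # UI asks: add your thoughts, or retweet as-is?
    if req.user_comment:
        return f"quote_post:{req.post_id}"
    return f"repost:{req.post_id}"  # user explicitly confirmed a plain reshare

# A one-tap share hits the speed bump first:
print(handle_share(ShareRequest("123")))                  # show_prompt
# After the prompt, either path goes through:
print(handle_share(ShareRequest("123", "my take")))       # quote_post:123
print(handle_share(ShareRequest("123"), confirmed=True))  # repost:123
```

The point of the design is exactly the trade-off discussed here: nothing is blocked, but the extra step measurably reduces reflexive resharing.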
But Twitter used that as a reason to say, okay, well, people don't like it.
We're going to put it back.
And they were talking about it.
And it's like, wait.
So that counters the whole point.
You said you're going to build your platform completely on healthy conversations.
You don't mention anything about healthy conversations in the release of the data.
And you reverted it because retweets were down and you didn't get the context you wanted when people
were quote tweeting.
See, they're going for an emotional reaction. And things that make you react emotionally are what you're going to engage with most quickly. I know we're running out of time, but I just want to say quickly: part of this — again, I want to be very clear, because people like to use the free speech argument, which I also think is a bit of a red herring, to say you can't do anything. We should do another show on that one, yeah.
And I really think it's about the tools. And our U.S. government has no right in determining what speech should stay up or come down, except for illegal speech, of
course. But what they can do is, I think, dig in much further into the tools —
let's even just look at January 6th. I would like to know some of the people who have been
charged right now in the insurrection. Did they go looking for Stop the Steal content?
Did they go looking for QAnon? Did they go searching for all of this themselves, as in the
whole mirror-to-society excuse Facebook likes to give? Or did the recommendation engines
prey on what they already saw were some of their vulnerabilities and steer them towards this
content, recommend them to certain QAnon groups, connect them?
Those are the things that you will never hear Facebook talk about.
They will always flip it to be about speech, because they don't want you to look under
the rug at the actual tools that they are using.
Yeah.
And, Carole, I want to toss it to you for final thoughts.
I did have a question I wanted to ask, you know, maybe you can wrap it all up together.
In terms of funding for the Real Facebook Oversight Board — are, like, Facebook's enemies funding
this, or is it all publicly funded?
And yeah, feel free to just wrap it in terms of...
Hilariously, Alex, we were very fortunate in that Luminate,
which is Pierre Omidyar's foundation, gave us some money at the beginning.
And then what happened is that they found themselves under a barrage of calls from Facebook,
not from the oversight board, you'll notice,
basically heavying them about why they were funding us,
and suggesting that they shouldn't be.
And, I mean, it really is quite extraordinary.
I mean, I think that there are increasing numbers of journalists in particular,
and some academics, who find themselves in this sort of adversarial relationship with Facebook,
where it doesn't really act like a normal company.
So, for example, I think you're both probably familiar with Andy Stone, who's that Facebook spokes dude,
as I call him, who's out on Twitter,
not acting like a normal spokesperson,
using very Trumpian sorts of
tactics and going in for this
sort of hand-to-hand combat with journalists.
And I do find it quite
peculiar and disorientating,
as I do the idea that this,
you know, multi-billion-pound company
is going out and sort of trying to heavy
its critics
out of,
you know, their very, very
modest amounts of funding.
So, yeah, I mean, as I say, on the plus side, it does make you think, well, I must be doing something right.
can I make one quick point about that though because I think it's a really important distinction
Funding for the Real Facebook Oversight Board does not mean it's funding the people who are members of it,
and I just want to point that out, because it's funding the mechanisms behind it, what they're doing.
But one of the things I'm very clear about is I'm not making money
off of speaking about these things.
I'm speaking about these things
because I've spent my entire life defending our democracy
and I'm not ready to give up that fight yet.
Yeah, and it's kind of like, you know,
this is the point about the oversight board.
They were hand-picked by Facebook.
They're getting over six figures for doing it.
I mean, it's a very, very different ballgame
than the people who are on the real Facebook oversight board
doing this — as you say, Yael, not paid in any capacity at all.
And as I say, these very modest amounts of money go to, sort of, like, help with the production and press around it.
So, as I sort of say, you look at the absolute millions that Facebook has at its disposal, you know, the legions of press officers and lobbyists and all the soft-power techniques.
And, you know, you really are taking a knife to a gunfight.
But still, I do feel we've nonetheless got to get
out there with the knives and do what we can.
Yeah. And, I'll just close — one of the stats I like to cite on the show is that
the Federal Trade Commission, which is tasked with
regulating the big tech companies, has an annual budget of, like, $330 million. And Facebook makes
that in about a day, a day and a half, pretty much. And so Amy Klobuchar, who was just
on the show, has a bill that wants to give $300 million a year more to the FTC, which I think
wouldn't even the playing field, but it might at least, you know, make it a fight.
So, all right, let's wrap with that.
Carol and Yael, thank you so much for joining.
It's been great having you here on the show.
Thanks so much, Alex.
And I feel really bad, can I just say as well,
but when I sort of was saying at the beginning
about the tech reporters
who sort of didn't believe the story initially,
it wasn't directed at you, Alex.
I take no offense.
This is a safe space for introspection.
The first — yeah, when we launched the show,
we launched it asking: is the tech
press bad? And I understand, sometimes it is. No one's perfect, so, you know, I'll admit the flaws
when they happen. But I appreciate it. Yeah. Yeah, amazing work. No, thanks, Alex, for having a more
nuanced conversation. I appreciate it. For sure, for sure. People who want to learn about your work,
do you want to, like, throw out your Twitter handles, and then the, um, the oversight board website — or,
sorry, the Real Oversight Board website? I'll just start with — uh, so, I'm starting a
new role on April 1st. So if you want to know what that is, my Twitter handle is just
my full name, yeah — Yael Eisenstat. And the Real Facebook Oversight Board is
at @FBoversight. And you don't want to follow me on Twitter, because even I find it's funny.
Okay. Well, thank you. Yeah. No, I encourage everybody to. I refuse to. I'm going to keep
following. All right, everybody. Thank you so much for listening. What a great show.
Thank you to Red Circle for hosting and selling ads. Thank you to Nate Goatney for
editing. As always, great job, Nate. Appreciate it. And to all of you for listening: we will be back
next Wednesday here on the Big Technology Podcast with another conversation with some tech
insider or outside agitator. Stay tuned for that. Thank you all, and have a great week.