Offline with Jon Favreau - Alex Stamos on Leaving Facebook and Zuckerberg's Reign
Episode Date: January 9, 2022
This week on Offline, Jon is joined by Alex Stamos, Facebook's former Chief Security Officer. As Jon's first guest who has worked at a social media company, Alex gives us a first-hand look at Facebook's internal politics, delivering insight on Russian hackers and the Haugen papers. He also makes the case that it's time for Mark Zuckerberg to step down. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
Was there a moment where you said to yourself, like, I think I have to leave?
This is, this is, I've butted heads enough here and I got to go.
Yeah.
I mean, in 2018, that's what happened.
Facebook's this place that, like, if you go and you're like a lower-down employee, you're a new employee, it's like Care Bear land.
Right.
Like everything's wonderful and the food is free.
And like the posters say things about people supporting people and stuff.
Right.
So it's like the Care Bears for 80, 90% of the company.
And if you're in the top 10% of the company,
it's Game of Thrones.
I'm Jon Favreau.
Welcome to Offline.
Hey, everyone.
Happy New Year and welcome back to Offline.
So my guest this week is Alex Stamos,
who's currently director
of the Stanford Internet Observatory,
where they do all kinds of research
on the negative impact of technology and disinformation. But it's Alex's last job that we spend most of this episode
talking about. From 2015 to 2018, he was the chief information security officer at Facebook.
We've heard a lot about product manager turned whistleblower Frances Haugen in the last few
months, but Alex was actually the first high-ranking employee to leave the company
over a dispute with other senior executives about how Facebook handles the spread of
disinformation on its platform. Basically, he wanted the company to be more transparent about
the issue. They disagreed. Stamos has plenty of criticism for his former employer, but
not too surprisingly, he also takes issue with some of the criticism directed towards Facebook
from media outlets and liberals like me.
We get into all that.
And we also talk about how he and his team first noticed that the Russian government was using Facebook to influence the 2016 election.
Why he decided to quit the company.
What the internal culture and decision-making process is like at Facebook.
Why he thinks it's time for Mark Zuckerberg to step down. And what changes he
thinks would make the platform better. It's the first episode we've done with someone who's
actually worked at a social media company, and I think it's a valuable perspective to hear.
As always, if you have questions, comments, or complaints about the show,
feel free to email us at offline@crooked.com. Here's Alex Stamos.
Alex Stamos, thanks for joining Offline.
Hey, thanks for having me, Jon.
So I've been doing this series on all the ways that the internet is breaking our brains.
And you're the first guest we've had who's actually worked at a social media company.
So you're here to answer for the sins of all.
Great. I appreciate that. Thank you.
So you were Facebook's chief security officer from 2015 to 2018. Is that right?
What did that job entail?
So the majority of that job is kind of the traditional information security job, right? I led a team whose job it was to keep people from breaking into Facebook, the company: stealing data, uh, stealing money, um, doing, you know, kind of your
standard hacker things. Um, and then there was a component that was about preventing people from
doing harmful things on the platform. Uh, so I had, for example, a child safety investigations
team that would do, uh, the investigations, kind of the worst of the worst of the child safety issues. While I was there, we started a counterterrorism investigation team
due to mostly ISIS, but other terrorist groups who were doing, you know, kind of advertisement
on the platform. We had an anti-fraud team, and then we had a big team that worked on state
sponsored hacking, whose job it was to kind of track what Russia and China and Iran and North Korea and other countries like that were doing to cause harm, as well as to try to attack the company to get information out of it.
I was going to say, things got pretty interesting for you and your team in 2016.
Yeah. Can you talk about how and when you guys discovered that the Russian government was using Facebook to influence the election in the U.S.?
So there are really two totally different kinds of online Russian campaigns by two totally different Russian groups, for which the connections between them are actually pretty loose.
There's not a lot of good evidence, at least in the unclassified side that we have,
that those groups are working together. So the part that a lot of people talk about from a
disinformation perspective is the troll factory, right? So the Russian Internet Research Agency is a private company that belongs to one of Putin's oligarchs and pushes propaganda in support of Putin. Most of
their propaganda is actually in Russian, but they do have divisions that focus on the United States,
Western Europe, Eastern Europe, and such. And then there's a totally different section,
which is the hack-and-leak campaign by the GRU. The GRU is actually part of the government; it's the Russian military.
And they have both hackers and disinformation actors. But they have the actual hands-on hacking capability, which is something that the Internet Research Agency and the other troll farms lack. And so that had a very different texture, which is, those were the attacks against John Podesta, against the DNC, the DCCC, and such, as well as then the leak campaign of using WikiLeaks and then a variety of fake outlets that they created to push the propaganda that they wanted.
So we actually saw the second one first.
Right. So often, if you're a professional hacker working for a government, the first step in what we call the kill chain, right, which are the steps a hacker has to take to do what they want to do, is reconnaissance. And so reconnaissance will often
be done on social media to try to find out, you know, everything about a target and figure out
the ways that you can possibly attack them, right? Can I text them here? Can I send them a message
here? Stuff like that. And we saw reconnaissance activity in the spring of 2016 by accounts that we were able to tie to GRU.
We had a dedicated guy who all he did was track this group and had been doing so during actions in Ukraine, during attacks against the World Anti-Doping Agency, stuff like that.
And so then when did you start sort of disagreeing with other Facebook executives about either the transparency of like how to release this information, how to talk about
it, the spread of disinformation on the platform itself? Can you talk about that?
So a lot of the problems that Facebook is still facing, I think, were set up in that kind of
August, September of 2017, where there have always been a couple of different, you know,
people talk about, Facebook wants this, but the truth is, you know, a big company with 50,000 people will have all these different
groups that want different things. And the big fight has been since that time,
should we be open and transparent and honest about these failings, one, to help other people figure it out, two, to coordinate with others, and three, to kind of demonstrate forward movement? Or is it better to just be kind of, to say the minimum amount possible and to keep as much as possible close to the vest?
One of the things that complicated the whole thing
was we had privately given the data
to the Mueller team that summer, right?
And Mueller, as an actual prosecutor,
was able to go to a judge and get a search warrant.
And then he took all this stuff
and then he gave us a gag order
saying that we weren't allowed to talk about the fact that we had given it to
the Mueller team. And then, you know, Facebook comes out and says, hey, this bad thing happened,
but we're not going to share it with anybody, but also can't say at the time, hey, it's because we're working with the Mueller team. And it becomes this big back-and-forth where the company takes
like the most aggressive interpretation of the privacy laws around this to not share it with
Congress.
And this becomes, like, this, you know, big brouhaha, and eventually the data gets shared with Congress
and then Congress can share it with other folks. And that's how we're able to now see all of that.
But that fight at Facebook has happened continuously since then, right? And, you know, I've always thought that being open about this stuff is the only way that we're going to fix it, because this is a society-wide problem and, you know, a company by itself cannot go do this, right? This requires not just the private sector to work together, but the private and public sector to work together. And when you try to control everything and make it about the PR outcome, it makes it very hard to fix the problem. From my perspective, like, if Facebook wants to fix their PR problems, the best thing to do is to fix the underlying issues themselves. That's actually the shortest path.
Like you can't really spin your way out of these things.
But that's what started there.
Now we've had many, many examples since then of that kind of fight happening over and over again. Specifically, the product, comms, legal side of Facebook, right, is always erring on the side of less transparency and not more. So there's a couple of kind of fundamental organizational flaws at Facebook that I think are real problems. One is the Mark-Sheryl split, right? So everybody thinks Sheryl Sandberg is the second most powerful person at Facebook.
She's like the 10th or something, right?
The truth is, there's a bunch of people
who run product teams
and don't even run product teams who are lower down,
but who do important things like metrics and such,
who actually have more power over what the real impact is
because Facebook's real impact
is what is built into the products, right?
Sheryl has the legal department, the comms department, the policy team. They're kind of the cleanup-on-aisle-five folks. At the time, my team reported to Sheryl, which is actually kind of another fundamental problem: you
know, we were part of the team of kind of cleaning up the product mess. And so that kind of division,
I think is really harmful in that kind of the idea is like, oh, you can throw this stuff out
in the world and break a bunch of things, and then these people will clean it up, and that's not our responsibility, is kind of the feeling from the product side. The other kind of fundamental policy
problem that I think goes directly to this issue is that the team, the comms and policy team is
huge, right? So there's this humongous team that now reports to Nick Clegg, the, you know, former deputy prime minister of the UK, but it was all one big team at the time. And that included the people
who were trying to decide kind of substantively what the rules should be at Facebook. And I think
there's actually a fundamental problem here is that, you know, their goal is to keep governments
happy with the company. So the company can operate everywhere and make money everywhere. And I think
probably the biggest kind of driver, you know, there's a natural kind of comms reaction of, you know, this phrase, like, let's not break into jail, of, if we don't release things, then people won't talk about it. That has failed over and over again for Facebook, right? So, like, I hope people don't say that inside. Never a good political comms strategy. Doesn't seem like a good corporate comms strategy either, right? Yeah, it doesn't. Um, and I think, like, we probably talk about this,
like, I think there's probably a lot of parallels between Facebook and government,
both from a, like all the different actors who are fighting, but also of getting pulled into
these policy discussions for which there's no real right answer. And so no matter what,
you're going to be in trouble with somebody. So you, you can't like make everybody happy.
And specifically at the time, you've got this new Republican administration, you've got, you know, Republicans in Congress. And so you have a policy team whose job it is to push, uh, to make the government happy, who's then very close to
the comms team. And so you end up with this general strategy of, Hey, let's not get involved
in this because at the time it was like super controversial to talk about Trump and Russia,
right. Um, that was seen as like a really political thing. And so their impulse was, well, instead of us coming out and
talking about this publicly and honestly, it's better to just avoid it and stay out of it because
we don't want, you know, we don't want to take sides. And that's like a constant problem at
Facebook is people not wanting to take sides when one side is right and one side's wrong and
neutrality is actually the problem. Yeah, no, I've noticed that from the outside. I mean,
when you say that their job is to keep governments happy, it sounds like what you're saying is their job is to make
sure that a government, whether in the US or elsewhere in the world, doesn't get so pissed
off that they regulate Facebook in ways that Facebook doesn't want. Is that right? Yeah. I
mean, yeah. I mean, keeping a government happy is regulatory, but then also just kind of the administrative
state, right? Like even if you don't actually pass laws, there's all kinds of powers governments have
to punish companies and make things hard for them.
And so, you know, just like any other government affairs shop, they're trying to keep governments happy. And basically, governments, that's the ruling party, which is like the really scary thing, is, you know, governments don't necessarily reflect all or even, you know, a majority of the people in the country. And so if you, like, are just keeping the ruling party happy, you end up twisting
yourself in all these decisions. And so I think a lot of it came out of like, let's not piss off
Trump. Let's not piss off the Republicans. Let's not, like, take a side, because if we're being honest
about this stuff, then that's kind of taking a side, which I think is ridiculous and has way
hurt the company. But that's kind of this
impulse you see over and over again. Was there a moment where you said to yourself, like, I think I have to leave? This is, this is, I've butted heads enough here and I got to go. Yeah. I mean, in 2018,
that's what happened. Effectively, you know, I was trying to fight to kind of make my team
much more effective. I was, you know, trying to get out from under Sheryl, be closer to the product teams, have the ability to actually affect things,
massively grow the number of people who are investigating and working on this stuff.
And some of that stuff ended up happening, but effectively there were a handful of executives.
You know, Facebook's this place that, like, if you go and you're like a lower-down employee, you're a new employee, it's, it's like Care Bear land, right? Like everything's wonderful and the food is free. And like the posters say things about people supporting people and stuff, right? So it's like the Care Bears for 80, 90% of the company. And if you're in the top 10% of the company, it's Game of Thrones. Right. Um, and, uh, I was the victim of a very smart political
machination by somebody who basically said, Oh, well, I can take care of the problems Alex is working on, but I can do so without pissing you off, a.k.a. telling the truth.
Right. And so there were basically a bunch of decisions that were made such that I would not be able to make things any better. I talked to Charlie Warzel about Facebook for this series.
Yeah, yeah, I saw that. I thought that was a good interview.
Oh, thank you.
Well, he said in that interview that he thinks there's very little cynicism in the upper ranks of Facebook management because most people are true believers in what they're doing.
They basically think they're doing the right thing.
Was that your experience?
Yeah, I think that is definitely.
Lack of cynicism is not the term I would use.
Right.
I would say one, you know, Mark Zuckerberg really does truly believe that letting everybody in the world talk to each other is a good thing.
Right now, I'm not the right person to ask about this because I have spent my entire career working on the downside of letting people
use computers, right? Like I'm, I'm a total Luddite. Like all I've done is security and safety.
And so once you've worked on the sexual abuse of children online, once you've worked on
counterterrorism issues, it is very difficult to feel good about the internet. And so I'm way
farther on the other side. And I think there's a truth somewhere in the middle, right? But Mark really believes that. And one of the
basic problems at Facebook is that it is quite a small-c conservative company, in that the people who run it have barely changed in like 12, 14 years, right? Um, one of Zuckerberg's smart moves in the beginning when he started Facebook was he realized that he was just a kid
and there's a bunch of things he didn't know. So he surrounded himself with all these adults
from Silicon Valley who knew how do you sell ads? How do you build a business? How do you hire
people? How do you build data centers, right? How do you run multimillion computer infrastructures?
He had no idea how to do that being like a Harvard dropout. He had a product idea. And so he
surrounded himself with those people. And that group has barely changed since the IPO, right? Since the 2010, 2011. And so I think that's like one of the fundamental problems is that one, those people believe everything. But the other problem is that there's all these decisions that kind of made sense when Facebook was a scrappy upstart, right? When you're like, oh, no, Google Plus is going to defeat us. Or when people are saying, hey, Instagram acquisition is crazy or mobile is going to
crush Facebook.
Facebook will never be profitable.
The IPO is a disaster.
And so they make all these decisions when they're a scrappy upstart.
And then once you're like the dominant platform and you are intermediating most of the human
interactions on the planet, those decisions don't make sense anymore.
Right.
But the people who made them are exactly the same.
And it's really hard for them to go back and to be like, oh, maybe I need to change my mind now.
And I think like, again,
like it's this small C conservatism of,
there's all this talk about the good old days
were the good old days, right?
And like, oh, things were much better
when we were smaller and stuff.
And it's kind of like,
the company's incredibly successful.
You can celebrate that,
but then you have to recognize like,
man, we have a huge amount of responsibility now
that didn't exist back in those good old days. And so things do have to be different. Like you
do have to have more bureaucracy in the things that product people complain about, because
when you have more responsibility, you've got to spend a bunch of time thinking about
what bad things are going to happen. Um, and I think that's like a fundamental problem that
has not been solved is that the turnover at the top ranks is incredibly slow. And those people create kind of this bubble
where Zuckerberg gets to be detached.
This is the other issue.
You know, Mark's never had another job, right?
Like this is all the guy's done since Harvard.
And so he never had like a real kind of young adulthood
where he went into a company
and he saw how a company was run and stuff.
And I've seen a bunch of companies
and like really well-run companies.
The CEOs know they're in a bubble and they pop the bubble.
And Mark does not pop the bubble. He's okay being in this bubble of people who are telling him, you know, not necessarily what he wants to hear, but they are
formatting things in the way he wants them to be structured, right? And there's a fundamental problem in that your business plan is always going to be like,
here's a metric that we can make better. Yeah, it does seem, and from my conversations with people higher up in Facebook, too, that I've had, that he believes in his mission.
He's a little clueless about everything else that's going on.
When he gets public criticism, he's pretty stubborn about it.
And he thinks that because what he's doing is right and good, and he's getting unfair criticism, and the people close to him are telling him that it's unfair criticism. And he thinks to himself, well, I'm just a computer guy, I'm just a product guy, I didn't think I was going to be the CEO, and that's fine, that's all he has to do.
Yeah, it's a real problem. And I've said this publicly for years now. Like I said to Kara Swisher on stage
a couple of years ago in Canada,
he just shouldn't be the CEO.
And he shouldn't be the CEO for a couple of reasons.
One, he has made all these decisions
that have to be revisited.
And so it's time for somebody to come in.
They need a major culture change.
And it's very hard to change the culture of a company
while it's the same people, right?
There's a bunch of tech companies
that have had kind of really great second acts.
And the best example is Microsoft.
They were kind of stuck in their Windows monopoly world of the 1990s because it was Bill Gates and then Steve Ballmer.
Ballmer was part of that inner circle.
And it was not until those guys were gone and Satya Nadella came in, it was like, we're going to redo everything.
And they're kicking butt in a lot of ways.
And they fixed a lot of the problems that they created in society. And Facebook needs kind of the equivalent
cultural change. I have never seen a company do that while the same people are in charge.
And so I think like no matter what, he he probably should not be CEO, partially because he also he's
just really bad at representing the company. Right. It's like a fundamental job of a public
company CEO is you have to be able to go to the Senate and take a bunch of crap from people who don't really know what they're talking about that well.
And who perhaps, you know, some of their criticisms are correct.
Some of them are totally off base and crazy.
Read: Ted Cruz.
And you have to kind of sit there and take it and then be able to represent the company well.
And he can't do that.
So it's like if he didn't own a voting majority of the shares, I think like a very reasonable board of directors would say you're failing at the fundamental job.
And I actually think they kind of missed an opportunity here because they became meta.
And when Google did this, again, it was that kind of shift where Larry Page went up into
the stratosphere of I'm going to be the CEO of this holding company and I'm going to be
able to work on rockets and all kinds of fun stuff.
And then they hired a manager to run Google, the big part.
And so there was an opportunity, I think, when Meta happened that he could go play with
his virtual reality and go do the fun stuff that he obviously wants to do and then let
somebody else take over as CEO of Facebook, preferably from the outside, and kind of revamp
it culturally and fix a bunch of these problems.
And they didn't do that.
He's both the CEO of Meta, but he's still running Facebook day to day in a way that I think is really unhealthy.
I mean, do you see a scenario where he ever steps down? Like, is it possible that there's
internal pressure from within, especially after, you know, the Facebook papers and everything over
the last several months? It doesn't seem like anything has changed or there's a desire to
change there. From my
perspective, it seems like they were just trying to get through that last controversy and hope
that it quiets down and then continue on their merry way. Yeah, no, I mean, I think the strategy
is working out like the company's never been richer and never been making more money. It's
never had more users. They're not facing really true regulatory pressure, at least in the United
States, because, I mean, the fundamental thing is that Republicans and Democrats agree that they hate Facebook, but it's for completely opposite reasons, right? And they finally figured that out. There's this weird period during which, um, Ted Cruz retweeted, I remember this very distinctly, it's like, Elizabeth Warren tweeted this big criticism of Facebook and then Ted Cruz retweeted hers. And I was like, that can't hold, right? Because, yes, they're both angry, but they want completely different futures. And so, you know, they are facing, I think, some reasonable antitrust pressure. And so, you know, if there's an antitrust move and you end up breaking off the cool new sexy stuff, then I think there is a world where Mark goes with it, right? Like if there was an Oculus-Metaverse spinoff. Because in the end, he's like a nerd who likes building stuff.
He has a product vision and he's quite good
at kind of predicting where things are going to go
from a product direction.
That's what he wants to do,
not like deal with this huge social network
with a gazillion languages
and all these people causing all these problems. So in that case, he might go with the smaller company.
But that's the only situation I could think of.
I mean, the guy's going to live for a long time.
He's very young still.
So you don't really have this Bill Gates situation
where Gates finally like gave it up
and decided he was going to give all his money away.
Mark's already giving his money away.
His wife's doing all the good work, right?
And so I think part of his long-term plan
is like Priscilla is going to save my reputation
by, you know, curing cancer or whatever.
So yeah, I'm not sure.
And there's no good way to create that pressure from the outside because the board of directors is captured by his votes.
Well, so there's the obvious leadership issues, but I'm interested in your take on Frances Haugen and the documents she leaked. The most damning and compelling takeaway is that Facebook is not just some neutral public square where
people are acting like people, that the company is making intentional decisions about the information
that we see and that we engage with. And the effect of those decisions, whether intentional or not,
is often to amplify the worst parts of human nature. Would you agree with that?
Well, that last one. So this is the problem: we don't know what the Haugen documents say.
So there's like 1,500 documents, something like 30,000 pages of documents, and only a couple dozen, the couple dozen that were like the steamiest and, you know, that were the most interesting to a handful of news outlets, got published, right?
Right.
I've seen a couple of the documents that have been published.
A majority of those documents
are people actually doing a good job,
which is bad things happen online.
We should study how it happens.
We should come up with mitigations.
We should test those mitigations.
And then we should see what the outcome is.
And that's what you want, right?
That's actually what we want.
And so my overall feeling on the Haugen documents is, one, Facebook could have solved this problem for themselves
by being much more public about this, right?
Like most of these documents, if they were published on a blog post
or in a journal or something, would not have been a scandal. And when you say they would reveal that
people were basically doing a good job, are you saying that they would reveal that people were
taking steps to reduce the potential harm that the platform caused or that people on the platform
were causing? Right. So these documents exist because Facebook has created this org called integrity, right? Which is a bad word, but is effectively trust and safety. Exactly: integrity is the Facebook term for trust and safety. They should just call themselves trust and safety instead; Facebook always has to come up with their own branding for everything. But anyway, it is because Facebook has like by far the largest trust and safety team
in all of social media. Right. And so there, there are more quantitative social science PhDs
doing research on Facebook's negative effects than in probably the rest of the industry combined.
And so now are those people always effective? And I think one of the outcomes from the documents
that I have seen is that there continues to be this problem that understanding what is going on
does not help you if you run into either a policy team who wants you to be neutral, or if you run into a growth team that doesn't want to do anything that hurts their metrics, right? And so I think that is like a continuous problem of, you could say, we've noticed this uptick in some kind of bad behavior.
We have some ideas to fix it. And then you have that getting vetoed by a growth
team. That's like, well, your idea to fix it hurts our metrics by one and a half percent.
So we're not even going to consider it, right? We're not even going to test it. And, um, and so
I think that that continues to be a fundamental problem. Um, but you know, again, this is kind
of like the direction we want to go. And the Haugen documents are not really well-timed here for the industry, because it is good that Facebook has started on this path. They have not gotten there yet. And now we're at this junction where everybody else is
watching and is trying to decide, like, is it worth it for us to know what the bad things are?
If knowing is what creates the possibility that all of a sudden we're going to get criticized for
it, which is like, again, this constant battle inside of Facebook of, publicly identify and fix our problems, or just kind of cover it all up and hope that it passes us by. It's, like, which one of those teams is winning? It looks like the pass-it-by team, from reading the tea leaves on the outside,
from the people who have quit and the people who have been fired. There's a bunch of people who
work on transparency and integrity stuff that are now gone. And so overall, like, I think it's a much more mixed bag, because we want them to do this work, right?
Like we want them to do this work.
And most of the companies in the world, media and social media, don't do it, right?
That's the other kind of great irony here is that we're reading about all this in the Wall Street Journal, which belongs to Rupert Murdoch.
The idea of Rupert Murdoch having a civic integrity team, like measuring what is the outcome of the Wall Street Journal
and Fox News on American democracy is just laughable, right? Yeah, they're never going to
do that. Right. And so we kind of want this work to happen. We should be critical when then the
work does not make it into actually fixing problems. We should not be critical when it's
just, okay, well, you identified a problem, right? Because here's also the truth here is every single platform has these, right?
Facebook is not the largest advertiser in the world.
They're not the biggest website in the world anymore, right?
TikTok is actually the biggest in the United States now.
YouTube has more time spent.
Facebook, I think, has the most unique users
when you add up all the products, right?
So it is the biggest by one measurement,
but not by all of them.
All of them have this, but most companies don't look.
And so I think we need to create an incentive structure for there to be transparency and inner kind of
self-reflection on these problems and not punish that self-reflection by just saying these documents
say that they're bad. That I completely agree with. Like stipulated, it's good that Facebook
is introspective like this and tries to find and identify problems on the platform.
Every social media platform should do the same thing.
The ones that aren't should be criticized for that.
I think, at least in the documents that were leaked in the stories, the problem seems to be when researchers, employees within the company can say, hey, you know what?
This is a problem with the product. This is an algorithm that, if we tweaked it this way, would cause less harm. And then running up against a brick wall of senior leadership saying,
no, we're not going to make that change
because we want to prioritize growth over safety.
Right.
And so just to be more specific,
which revelations from those stories do you think Facebook deserves criticism for?
And what do you think that Haugen and maybe the media got wrong about Facebook, and what criticisms do you think are unfair? Maybe just break it into two categories.
So again, of the documents that have been publicly discussed, I really want to get access to these and I want them to be public so we can have a public discussion.
Because also there's a huge amount that both academia and other companies can learn from Facebook's mistakes, right?
Look, I will just say that the press strategy to deliver documents to a media consortium, and not to just make them public so that everyone could see them, has always bothered me. And I don't really understand it.
I get why she didn't just make them public without doing any kind of censoring, because you have all these names of people. And the truth is, I was doxxed by ISIS on a Telegram channel, right? Like, I had people who got death threats from ISIS. So you really don't want, like, here is one of our low-level employees who writes a report about ISIS, and all of a sudden they're now getting death threats at home. And so, as you can imagine, I support that, because there's probably, like, things in there
about the Ministry of State Security in China or the GRU that have my name on it, where I'm saying not-so-great things about people who could, you know, make my life... Yeah. Yeah. You don't want
to piss those people off. Yeah. No, I get that. You've been there. I've been there. Yeah. Yeah.
Right. Um, and so, like, I worked for Mike McFaul, right? Like, I've seen what that's like, to have the GRU on your ass.
Um, oh yeah.
And so, anyway, but yes, I think, like, there was an alternate universe in which you go through, like, a reasonable step of getting rid of people's names
and stuff like that.
And then they are released on a rolling basis publicly, or at least to like a group of academic
groups that can then release them publicly later.
But that, that pissed me off anyway.
So of the things that are the worst, I mean, the worst one for me is the one that was
always a problem, and I think is a continuous problem, with Facebook: there's an article in there about the underinvestment in kind of basic trust and safety outside of English, in the developing world, right? Um, which is, the amount of investment, you know, the sad part is
like the Facebook that we get as Americans is the best one that exists on the planet. Maybe the German one's a little better depending on a couple of different
metrics,
but like English and German are the ones for which a lot of work happens, French a little bit.
And then outside of those languages,
it drops off precipitously.
And so that specific one was about human trafficking,
where there are a bunch of different components that are happening on
Facebook,
but the most egregious was that the people were advertising the jobs to pull people into this trafficking.
They're advertising jobs on Facebook.
Now, doing a job advertisement is obviously not illegal, but this guy was able to pull together that, like, the people doing this job advertisement are then, on the back end, basically selling these people as slaves. Right. And so this is the kind of thing that, if there was proper investment, you could take care of. Um, and
this is one of my kind of issues overall with kind of the media discussion of Facebook, is that 99% of it is about disinformation. Like, among kind of the elite corporate, mostly liberal media in the United States, 99% of it is about political disinformation in the U.S., right? Which is bad, but disinformation is the hardest of the problems here, because there are legitimate free expression concerns and legitimate concerns around kind of the political power you want the platforms to have. There's a bunch of bad things that happen for which there is no counterargument, right? Like, human trafficking should not happen. The abuse of children should not happen. Um, the organizing of terrorist attacks should not happen. And so I think that is, like, of all the articles, the one that kind of hit home for me, with my personal experience, is that outside of a handful of countries that have really high media criticism and really high kind of potential regulatory problems, they're just way underinvested.
And that causes a lot of unnecessary suffering, especially in the developing world. And you could fix that problem just with more investment, more staff, more resources in these other countries. This is not like one of those really tricky problems to solve, right? You don't have to decide, like, what is vaccine disinformation, which is an angels-on-the-head-of-a-pin kind of question that 50 lawyers at Facebook will argue over for a month. Deciding that
human trafficking should not happen on the advertising platform is
pretty simple. All the rules are there. You just need the investment in the people and the languages
and the skill sets to pull it through at the end. Part of the problem there, too, is that a lot of
the investigatory work at Facebook is in a loop with the public sector, right? So like when we
found a bad guy hurting children in America, we were able to get law enforcement on like that,
right? If we found a potential terrorist attack in Western Europe, that guy was arrested like that.
When you're dealing with some of these problems in the developing world, it is hard to get law
enforcement to get involved. And so what happens is, instead of continuing with, well, we're going to do the best we can on our own... I think there's a little bit of, that loop has to exist where law enforcement comes back to you and says, this is a problem and I want to work on it. So I think Facebook needs to kind of double down: in the places where maybe we can't get this guy in jail, we are going to double down on mitigating the harm, even if this person is going to continue to exist and we know they're going to come back over and over again.
What about the story regarding the meaningful social interaction sort of algorithm? That was one that really got me, because I feel like it gets to the core of at least my problem with Facebook, which is not so much about, like, let's moderate content everywhere, but that they are specifically prioritizing, in news feeds and people's feeds, right, content that gets people angry, enraged, et cetera.
Well, that's not exactly what that story said. And so this is where I get a little, like, both the Wall Street Journal and the Washington Post stories here were not really matching up with what the documents said. And so this is where I think it gets pretty complicated. Part of it is, whenever you make these changes, the company doesn't make a change in a vacuum.
And so I think with the meaningful social interaction change,
when you make a change like that,
even if it's well-meaning, there will be an outcome.
And so, yes, I think the bad part there
is that clearly there was the growth team
is putting their finger on the scale
in a bunch of different situations
where their metrics, they believe,
have to monotonically go up, right?
Like, you can never go backwards.
It is not okay to make a trade off.
And that is again, a fundamental kind of Zuck problem is that he allows the growth team to have
that power without holding them accountable for what the outcomes are.
But I want to caution against the idea.
There's kind of this, not to, you know, you're using me as a stand-in for all of Silicon Valley, so I'll use the stand-in for all of liberal media,
right?
And so there's this first kind of liberal media bubble of, everything's Russian propaganda. 99% of the bad stuff that happens is Americans doing it to ourselves, right? So that bubble's popped. You're past that, right? Yeah. Right, like, everybody says it wasn't them, but you go roll the tape back to 2017 and a lot of people... I know a lot of people go... We, everyone gets swept up in it. I get it, everyone gets swept up. Right. The second bubble was kind of like, okay, well, and this bubble
continues to exist, of, you can content moderate your way out of this, right? That, like, as long as you enforce your rules, then the bad stuff I don't like will go away. Now, that should be true for a bunch of things which are not kind of politically difficult, but for the things that a lot of, you know, people are talking about these days, like vaccine disinformation, political disinformation, it's not that easy to come up with those rules, right? And, um, that's where, like, the adversarial stuff gets really tough. And now there's kind of this new one of, like, it's just the algorithm. If you just
tweak the algorithm, all this bad activity will go away. And I'm just gonna tell you, that's not
true. One, this happens on every single platform, right? So, like, everybody has all these different algorithms. And one of the Haugen documents that was leaked
is actually an experiment that Facebook did
where a certain percentage of people saw reverse chrono,
right, which is just all the stuff that people have shared
who are friends of yours on Facebook
in reverse chronological order, no ranking at all.
And that was worse by almost every metric of badness
than the ranking, right?
Which makes sense, because people do things like, if you're a spammer, you do a lot of spam; if you're a clickbaiter, you do a lot of clickbait. And so, you know, it is quite complicated, this idea of, if I tweak this algorithm, you can make it better. And I think that's, like, this bubble that continues to inflate, of people just kind of imagining, like, all these bad problems will go away if you're able to figure out the algorithm and tweak it. And I just don't think that's true. I don't think that's empirical; there's not empirical evidence for that either in the Haugen documents or kind of in the public academic literature. I don't necessarily think that if you just tweak the algorithms, you're going to fix these problems.
But I think that what has been revealed is that what you're seeing on Facebook is not just some... it's not just a public square where everyone just gathers and shares news. Like, the team does have its hand on the lever. Like, it can deprioritize this or prioritize this. So, like, there's a lot of power here. And I think the fact that if you make one change it causes one problem, and then if you change it back it causes another problem, just speaks to the fact that something has been created that is so unwieldy that even the people in charge of it can't fully control the harms that it causes, which is probably a bad thing for society. Well, the harms that it causes. So Facebook causes, you're saying?
Or it's people? No, no, what I'm saying is we have to separate out the things that Facebook makes worse from the things that are a reflection of society. And that is now the kind of overall media liberal bubble that is inflated: the idea that, like, you want the social media companies to make people better. And I actually think that is, like, the scariest direction here. Yeah, I don't think their job is to make us better. I think... Facebook's policy in other countries is, like, we follow the local law.
What happens when the local law is supporting an autocratic regime? And does Facebook really want to say that they're neutral in the global struggle between autocracy and democracy right now? Well, by their own mission statement, you
know, bringing the world closer together and building community, it doesn't sound like that's
neutrality. It sounds like that's on the side of democratic values, but it doesn't seem like all of
their internal policy decisions either here or around the world necessarily match that.
Right. And that goes back to, way too many of the policy decisions are being driven by the government affairs responsibility, which outside the United States is way worse, because at least in the US, when Trump was president, his power to get Facebook to take down his opponents and such was pretty limited. Right. Yeah. What you see right now in India especially... so India is by far the most important country, you know. I've said this for years, that it is the most important country from a regulatory perspective and just predicting the future of the internet. And right now you've got
Modi, who is trying to recreate kind of the power that the People's Republic
of China has within a democracy, using the fact that he is democratically elected.
He legitimately has the support of hundreds of millions of people, and he has very few legal and constitutional restrictions on the kinds of powers he can have. And you are seeing a future in which the American
companies, Facebook and Twitter in particular, and now YouTube are all of them saying, well,
we follow local law. Uh, India is a democracy. And so we're going to follow their law. And the
outcome is support of autocracy. Right. And so, yes, yes, I mean, I totally agree. And I think that is
the neutrality argument here is both punting responsibility and also happens to be in the
economic benefit of the company, because you can't make money unless you have ad salespeople and all
the folks that you can have in a country. And so the future in which Facebook pulls out of India
is just completely, you know, impossible to imagine from a financial and a growth perspective.
But like not imagining that then puts them in this trap where they're like, well, then the only other option is we follow the local laws, and the outcome of that is going to be really, really negative. If you were Mark Zuckerberg, what internal policy changes do you think could and should be made to make the platform a better place?
Right.
I mean, so first I would quit and spend my money, you know, live on Kauai and let somebody else take care of and fix these problems, right?
Finish buying an entire island in Hawaii and just hang out.
Yeah, okay.
So I think the policy
team has to be split, right? Like you have to have the government affairs people not be in charge of
platform policy. I think you need to specifically, uh, you know, have somebody very high up whose
job it is to represent the human rights responsibility here. Um, and so get that out
of the general comms and policy argument and have the platform policy people,
the people who decide what is allowed and what is not allowed, you know, have a specific set of
human rights goals that you want for them that effectively, like you said, are not being neutral
between democracy and autocracy, right? That you're going to say, we're going to be on the
side of pluralism and democracy, even in situations that means going up against a democratic elected
government. That's the step there that's complicated, right? Of like, well, the guy's elected, right? Like Duterte was elected
or, you know, Bolsonaro was elected, so we should do what the guy says... um, that's not actually in support of democratic values, right? Um, and so I think you have to split that team. I think
more importantly, you have to make the integrity team that creates these metrics of, these are the bad things that are happening. Not only do you have to invest there,
but you have to give them power over the growth and other product teams,
where they have the ability to basically say that our metrics are going
to win this one. Right? So exactly how you do that gets a little bit complicated,
but a lot of it has to do with kind of, what kind of metrics you're tying to executive compensation.
So, right.
So like for those executives who are running product teams, you need to change around their
compensation plan so that like, yes, your growth goals are good, but if you slip on
those growth goals and you've improved on these other ones, on these goals around safety and security, then you will get your bonus anyway.
I highly doubt there's anybody at the VP level at Facebook
that gets a big pat on the back if growth goes down,
but integrity goes up,
including the people who are running the integrity team.
And I think that's like a fundamental problem here
is that like you have to create that tension
inside the company
and then you have to let this new upstart team
whose job it is to measure downside
and to fix downside win.
And then I think publicly,
they need to decide that they're going to be much more... like, they missed an opportunity here to release the Haugen documents themselves, right? They still have them; like, they're not public.
They could go do the redactions themselves and, you know, get people's names out and such,
but pretty much nothing else. And then go release them and say, like, this is the reality of what's
going on in the world. And we're going to be more public about this. And I think that would be a huge step forward of kind of turning over a new leaf on this and
then continuing that process of we're going to work with outside researchers, we're not going
to get rid of CrowdTangle. You know, there's all this stuff that looks like CrowdTangle is going to go away. CrowdTangle is the tool that our team uses to study disinformation, that everybody else uses to study disinformation. It is a huge pain in the ass for Facebook, right? Because there's no YouTube CrowdTangle. There's no TikTok CrowdTangle.
Everybody else gets away with murder
because they don't provide the ability
to study their platform.
And so they're going to less transparency.
I'd like them to go to more
and then to try to make that a competitive advantage
in the public market of,
we are going to be more transparent and open
about the kinds of things that are happening.
A big problem there is Facebook groups.
That's actually a very hard issue to study, what's going on in groups.
And we don't really have time for that because it's a really complicated kind of privacy tradeoff of, like,
at what point is a group large enough that you want to have transparency in what's going on
or at what point do you want to, you know, protect people's privacy?
If we had a functioning Congress, what laws do you think would help or what regulations?
Right.
So now we're really like in a fantasy land, right?
Yeah.
Okay.
So, like, my colleague Nate Persily has put out a law, a draft law around transparency
that has two components.
One, it's a hold harmless for researchers to do this kind of work.
So, you know, there's a group at NYU that has been trying to study,
Facebook has an ad archive, unlike a bunch of other platforms, but it's not that great.
And so they get around the fact that the ad archive is not that great by scraping a bunch of data in a way that Facebook prohibits. This gets actually, again, pretty
complicated because, like, a lot of this goes back to kind of the overreaction around Cambridge Analytica,
where everybody freaked out about Cambridge Analytica being the worst thing
ever.
And so as a result,
like academic study of the platform has been massively clamped down on.
And so the law,
this law basically says,
well,
you know,
if researchers operate within these parameters,
they are allowed to do this work.
And then the company is not responsible for it.
Right.
So,
you know,
if,
if NYU does something bad,
NYU is responsible for it.
But as long as you're doing something good,
they're, you know, they're protected.
And so I think that's a complicated balance to make.
But Nate has created this balance.
And then the second is required transparency. You know, if a platform hits a certain size, they are required to have transparency, such that at least the content that is public has to be available via APIs. And certain engagement stats have to be available programmatically to
certain researchers. Right. And again, there's some privacy issues there, but I think they're
all fixable because it gets, you know, right now, Facebook and Twitter are the ones who are out
there. As a result, 99% of the stuff of, I saw something bad on the internet is written about
Facebook and Twitter. And almost nothing is written about the platforms that are actually
in some measures larger. And so we don't want Twitter and Facebook feeling this pressure to backslide.
We want to take the bar of where they are right now and set the bar on transparency even a little bit higher
and then try to get them to meet that and then try to get these other platforms to get there, too.
Your current job is director of the Stanford Internet Observatory.
You're also part of the Aspen Institute's Commission on Information Disorder.
You guys issued a final report. It starts by saying we're in a crisis of trust and truth
that exacerbates all other crises. What have you learned about the root causes of that crisis and
how we even begin to get a handle on it as a society? Yeah. I mean, when we talk about kind
of the disinformation crisis in the United States, it's complicated by the fact that things have changed a lot between 2016 and 2020, right? Kind of the texture of
disinformation has changed. Like, you know, either from Russians or from, you know, just troll farms and propaganda farms, you'd have all of this crap that was being spread by a large number of people. Whereas the vast majority of disinformation now is spread by a relatively small number of people. And we know
exactly who they are. Right. They're all friends of yours on Twitter, I'm sure. Yeah. Right. People
who love you. Right. And so we have like this huge report that people can get from EIPartnership.net.
But in it, if you look for like the top 25 spreaders of disinformation on Twitter,
you recognize all the names. Right. It's Gateway Pundit. It's Donald Trump Jr. It's a bunch of kind of blue-checkmark
accounts that are specifically spreading disinformation. Cause that's what they do all
day. Um, and those actors are multimedia. And that's one of the other challenges that we've
got right now is that, you know, a Dan Bongino has an AM radio show. He has spots on Fox News.
He goes on Newsmax and then he also has a big presence on
Facebook. Um, and so you end up with, like, all of those different outlets kind of reinforcing each other, um, in a way that is then very difficult to act on. And in only one of those cases do you really... I mean, I guess, like, in the Newsmax or Fox News case you've got a central actor, but really only in the Facebook case do you have a central actor who has ever taken steps against disinformation. Um, and so you can push on it on one side, and then it just kind of squeezes out to the other, right? Um, and, uh, now that doesn't mean the company shouldn't
do anything, but I do think it, it goes to the complexity of the problem because we're now at
the point at which it is a really profitable business to be somebody who lies to people all
day. They
know they're being lied to, but they want it. Right. Um, and you're going to provide that to
them and you have some different outlets to do that profitably. Uh, and if some of them cut you
off, you've built the base elsewhere. Um, and there's not like a really good, easy
solution for that. So, I mean, that's something we dealt with a lot in the Aspen commission,
um, which was effectively, like, the Fox News problem, right? Um, you know, we can come up with all these recommendations, because we know that there's a possibility of some kind of regulation that affects social media platforms, and some kind of interest in the social media platforms to fix something, even if you don't think it's enough. Um, whereas then there's these outlets like the Newsmaxes and such, whose entire reason of existence is that it is very, very profitable for them to spread disinformation.
Yeah.
I mean, the attention economy incentivizes shameless assholes.
That's what's profitable.
So that's a much larger problem than just any,
any social media platform.
Right.
And,
but I mean,
in some ways,
like we have this much more,
you know,
you've started a media company that couldn't have existed 20 years ago.
Right.
Like you've got podcasts because you can spread stuff on the internet.
You know, you guys, you're huge on Twitter.
You've got, you know, Facebook pages and stuff.
And so there are positives,
but like, is there a way we can get those positives
without having the negatives?
I'm not sure.
Especially when you have people
who are just straight up motivated to,
it's a supply and demand issue.
And I think that's the other issue
I'm always dealing with, you know, personally,
is, like, you know, it's easy to be kind of a supply-sider, to believe that if you cut off the supply of disinformation, then this problem's gone. Um, and almost nothing is ever done around the fact that there's a huge kind of demand, uh, to be lied to. Um, yeah. And at the same time, we have kind of a continuous problem of, um, you know, kind of people not believing in institutions and centralized institutions anymore, which those institutions make worse by the way they act.
Right. Like if you look at all the vaccine disinformation stuff, you can look at misstep after misstep by CDC and WHO and Dr. Fauci and others about how the emerging consensus on these issues is handled, which
creates kind of the opening for the hucksters. Like when the institutions say we're not really
sure what's going on, the hucksters can come in and say, I am absolutely positive, and I'm going to sell you on my vision as being totally true. And there's a huge demand for people just to be
told that this is the absolute truth and that they don't have to have any ambiguity.
Well, these things fuel each other. The distrust in institutions fuels disinformation.
The disinformation fuels more distrust in institutions.
And so it becomes a vicious cycle.
Last quick question that I'm asking all the guests.
What's your favorite way to unplug
and how often do you do it?
Oh, God.
Yeah, I like to go sailing with my kids, right?
So I haven't done it in a while. The weather hasn't been great and we've been, you know, all stuck inside, unfortunately. But, uh, yeah, I like to get outside. I mean, I've got three kids, and so, uh, basically anytime I'm not working, I'm with them. Excellent. Uh, Alex Stamos, thank you so much for joining Offline. Appreciate it. Thanks, Jon.
Offline is a Crooked Media production.
It's written and hosted by me, Jon Favreau.
It's produced by Andy Gardner Bernstein and Austin Fisher.
Andrew Chadwick is our audio editor.
Kyle Seglin and Charlotte Landis sound engineered the show.
Jordan Katz and Kenny
Siegel take care of our music. Thanks to Tanya Somanader, Michael Martinez, Ari Schwartz, Madison
Hallman, and Sandy Gerard for production support. And to our digital team, Elijah Cohn, Nar Melkonian,
and Amelia Montooth, who film and share our episodes as videos every week.