The Rich Roll Podcast - Facebook Whistleblower Frances Haugen On The ‘Chaos Era’ Of Social Media & The Future of Public Discourse
Episode Date: September 7, 2023

Determined to bring transparency and accountability to Big Tech, in 2021 Frances Haugen risked everything to blow the whistle on Facebook. She copied tens of thousands of pages of documents that revealed that the social media giant had accidentally changed its algorithm to reward extremism. Even worse, Facebook knew its customers were using the platform to foment violence and spread falsehoods, and refused to fix it. Frances testified to Congress and spoke to the media. She was hailed at President Biden's first State of the Union Address. She made sure everyone understood exactly what the documents showed. And she set an example for standing in truth and doing what is right for the greater good. Today we dive into the nuanced impact of social media on society. We talk about why algorithms prioritize extreme content and Facebook's own complicity in radicalization and political violence around the world. We explore the tools available to combat these issues, including what Big Tech can do to prioritize user consent and reduce misinformation and hate speech. Note: If this exchange leaves you wanting more, Frances has written a compelling and comprehensive book about her experience entitled The Power of One. Ultimately Frances left me with a surprising sentiment: the belief that we can have social media that brings out the best in humanity. This is a fascinating and important conversation. I hope you learn as much as I did.

Show notes + MORE
Watch on Youtube
Newsletter Sign-Up

Today's Sponsors:
ROKA: roka.com/richroll
On: on.com/richroll
AG1: drinkag1.com/richroll
Plant Power Meal Planner: https://meals.richroll.com

Peace + Plants,
Rich
Transcript
The Rich Roll Podcast.
...how Facebook is functioning. You know, if you start a fight in a post, you're going to get much, much more distribution
than someone who does a calm response.
If your kid uses social media
for more than three hours a day,
they're at double or triple the risk
of getting depression or anxiety.
The average kid in the United States
uses these products for three and a half hours a day.
Zuckerberg himself refused
to address these major problems.
Mark has had to be the CEO since he was 19.
He's the chairman of the board,
which means that he is also his own boss.
He lives in an echo chamber.
We're not citizens on our social platforms.
We are subjects of a king.
Hey everybody, welcome to the podcast.
Today, I'm joined by Frances Haugen,
a data engineer, a scientist, and former lead product
manager on the Facebook civic misinformation team. But she is most widely known as the whistleblower
behind the 2021 Facebook Files. In an effort to stop Facebook's complicity in the radicalization
and instigation of political violence around the world,
Frances copied tens of thousands of pages of internal Facebook documents, which revealed that the social media giant knew it had accidentally changed its algorithm to reward
extremism, all while refusing to fix it, and knew its customers were using the platform to
foment violence, spread falsehoods,
and more. Frances testified before Congress about this. She spoke extensively to the media.
She was hailed at President Biden's first State of the Union address. She made sure everyone
understood exactly what the documents showed, and she set an example for standing in truth and doing
what is right for the greater
good. She has since written a compelling and comprehensive book about that experience,
entitled The Power of One, which obviously provides the subtext and the basis for today's
exchange. A few more things I want to say about Frances and the conversation to come, but first.
We're brought to you today by recovery.com. I've been in recovery for a long time. It's not hyperbolic to say that I owe everything good in my life to sobriety. And it all began with
treatment, an experience that quite literally saved my life. And in the
many years since, I've in turn helped many suffering addicts and their loved ones find
treatment. And with that, I know all too well just how confusing and how overwhelming and how
challenging it can be to find the right place and the right level of care, especially because
unfortunately, not all treatment resources adhere to ethical practices.
It's a real problem.
A problem I'm now happy and proud to share has been solved by the people at Recovery.com,
who created an online support portal designed to guide, to support, and empower you to find the ideal level of care tailored to your personal needs. They've partnered with the best global
behavioral health providers to cover the full spectrum of behavioral health disorders,
including substance use disorders, depression, anxiety, eating disorders, gambling addictions,
and more. Navigating their site is simple. Search by insurance coverage, location, treatment type,
you name it. Plus, you can read reviews from former patients to help you decide. I feel you.
I empathize with you.
I really do.
And they have treatment options for you.
Life in recovery is wonderful.
And recovery.com is your partner in starting that
journey. When you or a loved one need help, go to recovery.com and take the first step
towards recovery. To find the best treatment option for you or a loved one, again, go to recovery.com.
Okay, Frances.
In this conversation, Frances shares the nuanced impact of social media platforms on society.
We talk about why algorithms prioritize extreme content, and what Big Tech can do to prioritize user consent and reduce misinformation and hate speech online.
Ultimately, Frances left me with a surprising sentiment, a belief that we can have social media that brings out the best in humanity.
Now, that is a hopeful sentiment, and I think you will in turn also find this conversation
surprisingly hopeful.
It's fascinating.
I loved having it.
I really enjoyed Frances.
So without further ado, here we go.
This is me and Frances Haugen.
I appreciate you coming here to share the book.
The Power of One is out everywhere.
Easy to find.
We'll link it up in the show notes and all of that.
Well, let's walk through the history of this a little bit
for people that aren't super familiar with your arc
and begin with why you even joined Facebook to begin with.
Like what was attractive about going to work at Facebook?
Because at the time, it wasn't that long ago, it was 2019, right? There was already a lot of stuff,
alarming things about Facebook, that for the sort of person uneducated about your life
would make one think like, well, why did she even go there?
So back in my late 20s, early 30s: I went into
the hospital in 2014. I was 29 years old and my health had been declining for a couple of years.
So I went from being able to ride 125 miles on a bike in a day to being in a wheelchair in two and
a half years. So I go in the hospital and I find out that I have a foot long clot in my leg. That's like the girth of like a cigar that's been growing there for like
two years. I find out that I have not actually been careful enough about my diet.
I have something called celiac disease. So I'm allergic to gluten and I was so malnourished,
my protein synthesis had stopped. So I was starving to death, even though I'd gained a hundred pounds,
which sounds so counterintuitive.
And I ended up,
it took me 15 months to relearn to walk.
And even when I went back to my job,
I was still only just functional enough
that like I could work all day,
come home and then sit on the couch.
And so I hired an assistant
who was a friend of my younger brothers
and he became like essential to my recovery.
So he would take me like on long walks.
He like liked to lift weights.
So he like coached me about protein
and macros and stuff like that.
And in 2016, Bernie Sanders lost the primary.
And my friend, he kind of fell into the dark corners of the internet. He wanted to commiserate.
He really felt that Bernie had been wronged. And he began to have a lot of conspiratorial thinking.
And I remember how painful it was.
Like when the Facebook recruiter reached out to me
in like late 2018, early 2019,
and was like, hey, do you wanna work at Facebook?
The thing I said to her,
it's like, I really didn't care
if I got the job or not, right?
Like you said, Facebook had a bad reputation
by the time I joined it.
I said, the only thing I would work on is misinformation
because like watching him
divorce himself from our shared reality was just excruciating for me.
How deep did it go for him?
And he ended up moving to Indiana with some people who I think are questionable. And it's
interesting, like the thing that saved him was he met a nice girl and he like started going to church again.
And like, he's fine today, but like, you know,
he was definitely running with some people
who have some pretty extreme political beliefs.
And so I always try to slightly guard his privacy
because it's not my story.
Right, sure, of course.
So that experience with that close friend
was a motivating factor
in you being interested in the possibility of participating in the solution. Like Facebook's
not going away. If you're inside that machine, perhaps you can pull some levers and right the
ship. It's also one of these things where like I talk about in the book, the idea, and I'm not
trying to brag here, like the idea that we have hundreds,
hundreds of people in the world who really understand how these algorithmic systems work and how the pieces intersect. When I make this product choice, you know, when I decide
that a post should go to the top of your feed with every new comment, like how do those two things
interact with each other? You know, when the person said like, hey, we do have a job for you, you could do civic
misinformation. It put me in like a really hard place because on one side, like I had felt like
this really intense loss and it scared me, right? Like he was a smart, college educated, funny,
insightful, empathetic person. And he still like spiraled off enough that like we would have fights over like,
does George Soros run the world economy?
Right, like stuff that any reasonable person would be like,
no, he does not.
There's no space lasers,
there's no lizard people.
I'm envisioning the comments on YouTube below this video.
There's gonna be a bunch of people
who might disagree with you. Yeah, yeah. Good. We should have a conversation. We should have software that
facilitates conversations, not screaming at each other. But so I felt that on one side,
you know, if you know that you're in a small group of people who can do something and that
thing could actually impact a lot of people's lives. And you don't wanna do it.
How do you navigate like feeling like you have an obligation
to at least do a tour of duty?
And so like, I never thought that like,
I was the only one that could save the day,
but it was like, you know,
there's a relatively small number of people.
I'm not gonna go work in Washington
or like, I'm not gonna move there,
but I'm willing to do, you know, two years, three years
and work on this problem.
And then I can say I at least did my part. And I ended up there, and it was just
so much worse than I had any idea it could be. Because, like I said before,
I had never really thought about the international implications of Facebook.
I didn't even think I was going to work on the international stuff. I thought I was going to
work on the United States. And so it spiraled very quickly.
So walk me through the
progressive disillusionment and, you know, in specific terms, like what did you encounter?
So I showed up and, you know, it was like a couple of different levels of where I was like,
something is profoundly wrong. So like on the organizational level,
like I said, like you mentioned,
I have an MBA from Harvard.
I show up-
With a focus on organizational structures,
like that's your jam.
Organizational health, yeah.
Have you ever seen the TV show, Silicon Valley?
Yeah.
So I failed to watch all of season one
because it's not a comedy, it's a documentary.
Right.
Right?
Like, there's a lot of things they play for laughs where I'm like, oh, I watched that happen.
That is not nearly as funny as you think it is.
But, yeah, so I really care about how organizations function.
And I show up and they tell me the very first week.
So, they put me in this boot camp for product managers. For people who are not familiar with technology or how software is written:
You have software engineers and you have people who work with the software engineers whose job is to kind of articulate what is the problem we're trying to solve and kind of segment that problem and the solution to the problem into like the actual discrete set of steps where like we're going to solve this, then we're going to solve this, then we're going to solve this.
And together we're going to ladder up to the thing we're trying to accomplish.
And so I show up and they say, hey, you know, you think you know how to do your job, but Facebook is enough different than how other tech companies work that we have learned if we don't put you on this two-week bootcamp,
the chance that you're just gonna like burn out and fail in the next six months is too high.
So for the next two weeks,
we're gonna teach you how to Facebook.
And I get like maybe four days into this bootcamp
and my manager's like,
hey, we don't have time for you to go to bootcamp.
You need to come and give us the plan
for what we're gonna do with civic misinformation
for the next six months.
So I've just come to the company. That's red flag number one.
Red flag number one, right? I've just come to the company. I know nothing about Facebook's problems.
My teammates have never worked in this space, right? So my only three engineers are front-end
engineers. So that's like people who figure out how to lay out the buttons and plumb it all
together so the information goes to the right places. My data scientist is new to the company.
My engineering manager is new to the company. We have no idea what's going on, but we are expected
to come up with a plan in the next like five days that we're going to present to like all the
leadership of the safety team. So that's the first red flag. Second red flag is I hadn't really
thought, so one, I had not thought about the idea of Facebook being the internet.
But I hadn't really thought about the implications of, you know, when the internet comes to a new place, how does that change the place?
Right?
Like when we got the, like in the United States, when we got the internet, it started in the 80s, right?
And, you know, it started at a few Ivy League schools, a few departments.
You know, it progressively flowed down a gradient of privilege,
you know, went out to more and more universities. It went out beyond the universities into the
public, you know, on and on and on. But it took, you know, a solid 20 years. And what's happening
in a lot of other countries is like people are becoming literate in order to use Facebook, right?
One of the first experiences I had
was we were supposed to sit down
and come up with how we were going to measure success.
And we had a hard problem
because we were specifically not working
on third-party fact-checking.
So it's like people who do,
like say, this is not true.
Like we were going to work anywhere in the world
where there were not fact-checkers,
which is actually most of the world.
Facebook mostly goes in and does that like after the fact cleanup in like
the United States and Europe. And so someone suggested, well, what if we looked at expressions
of doubt? So like if you see something that looks off and you make a comment saying, I don't think
that's true, or I think that's misinformation. You know, we could count up those. And if the
number went down, we were doing a good job. And one of our researchers said, that sounds really obvious.
That sounds like it would be a thing that would work. Except for when we went in and did interviews
in India, people are coming online so fast that when we talk to people with master's degrees,
these are smart, educated people, they say things like,
why would someone put something fake on the internet?
That sounds like a lot of work.
And so just imagine I'm in this context.
So smart people who are getting onboarded
onto the internet through the lens of Facebook
and basically Facebook only
without adequate discernment tools
to understand, to be internet literate.
The way I like to put it is like,
we don't realize that shit posters
are actually like a cultural treasure, right?
Like our like 20 years of people
saying random shit on the internet
makes us have a certain amount of doubt of being like,
oh, you can't just forward things.
People will laugh at you if you just forward things.
In a lot of these places,
people believe that the information on Facebook is more trustworthy because it came from a friend.
Right, like if your friend sent it,
then you should trust it because you have that connection.
And so that's just like-
And there's an understanding that we've gone through
that maturation process of kind of understanding
shit posting versus reality on some base level,
even if not great, at least some, right?
And when you think about the developing world
in that very different picture,
in the context of Facebook proudly announcing
how much money it's spending on third-party
fact-checking, et cetera,
but not understanding or appreciating
that the bulk of those resources
are being poured into the developed world, right?
And for various reasons are not doing much of anything
in the developing world,
language barriers, resources, et cetera.
I think another thing that is important
for context for people is like,
I hadn't really thought about linguistic diversity.
So like if you look at a country like Ethiopia, you know, in the abstract, it's like, oh, a country far away.
They have 100 million people, 120 million people.
They have 95 dialects, six main language families.
You know, you have a problem there where when we focus on content moderation, instead of trying to make the systems a little safer by default, so that's things like, should you have to click on a link before you reshare it?
Just that knocks down misinformation by 10 or 15%.
Or saying, we don't have infinite reshare chains.
We say, it gets a few hops away from you, and now you have to copy and paste.
You have to intentionally share it further.
That's like 30% less misinformation. Those kinds of things are systematic. Everywhere in the world gets the benefit. When we instead focus on censorship, content moderation,
we have to rewrite those systems language by language by language. And there's ways that we
could do that hypothetically in a scalable way, like we can decentralize how we generate the labels. But at least the way it's done today,
places like Ethiopia just get left behind.
I can't imagine though, even in the best
circumstances that deploying human resources to handle this job is ever gonna be successful. It's an impossible task to do it that way.
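For readers who want the mechanics, here is a minimal sketch of the reshare "hop limit" Frances describes, where a few hops of frictionless resharing are allowed and anything further requires copy and paste. The threshold, field names, and functions are illustrative assumptions, not Facebook's actual implementation:

```python
# Minimal sketch of a reshare "hop limit" (hypothetical names/threshold).
# Within the limit, one-click resharing works; past it, the user must
# copy and paste, the friction Frances says cuts misinformation ~30%.

MAX_ONE_CLICK_HOPS = 2  # assumed cutoff: "a few hops away from you"

def can_one_click_reshare(hop_count: int) -> bool:
    """Allow the frictionless reshare button only within the hop limit."""
    return hop_count < MAX_ONE_CLICK_HOPS

def reshare(post: dict, viewer: str) -> dict:
    """Reshare a post, tracking how many hops it has traveled from its author."""
    hops = post.get("hop_count", 0)
    if not can_one_click_reshare(hops):
        # Past the limit: no automatic reshare. The viewer has to
        # intentionally copy and paste to spread it further.
        raise PermissionError("Copy and paste to share this further.")
    return {"author": viewer, "content": post["content"], "hop_count": hops + 1}
```

The design point is that nothing is censored: the same friction applies in every language, which is why it scales to places like Ethiopia in a way that content moderation does not.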
And I think the problem that comes up
is the context in which this debate is framed,
which is one of free speech versus safety.
And a big part of what you talk about
is the falseness of that binary.
So it's really interesting. Like that, I think,
is the thing that keeps us from moving forward in the United States is that, you know, enough
Facebook executives have gone in front of Congress and said, oh, everything you're talking about is
so sad. But, you know, it's this fundamental divide between free speech and safety. Or like
Mark Zuckerberg has gone on podcasts like this and said, you know, I've really grown in the last few years
because I've realized when you stand up
for what you believe in, you know, it can be really hard.
And I'm a defender of free speech.
You know, the thing that's frustrating for me is,
you know, this question of, you nailed it earlier.
When we prioritize content from our family and friends,
you spend a little bit less time on Facebook. Your family and friends are not quite as exciting
as whatever that magical post is the algorithm found for you, right? Or really tantalizing
content keeps you on there a little bit longer, but also has more violence or misinformation.
Having systems where we say,
hey, you have to actually just be public about everything, like about what's happening on your
systems, then you can begin putting pressure on things, having protests, having boycotts,
having divestment campaigns. And that provides a counterweighting balance beyond just profit and
loss. Because right now, you know, like we talked about
cutting the reshare chains and saying, hey, you know, if it gets more than a couple hops away,
you have to start over, copy and paste it, say whatever you want, but you have to do it,
not just on instinct. You know, right now, that's not a huge hit to profit. It's maybe 0.1, 0.2%. But right now there's no number value
placed on reducing misinformation by 30%.
We don't have to quantify that as a societal cost.
Right.
And it is amazing how much energy there is
behind this free speech argument
given that these are private entities.
And there's the town square argument,
but that feels very porous to me.
That it just feels like-
We're trying to document it all right now,
but it's looking like they censored my book on Facebook.
Oh, really?
Yeah.
Like they, my publisher usually can like mark up,
like put the buy button, all the other books they sell.
And mine violated the commerce policy.
How so?
Cause it's from me.
Yeah, cause as a data scientist,
you're gonna wanna know like where, who's buying the book,
where are they buying it and what platforms are they using
to purchase it from?
And if you're not seeing any traffic coming from Facebook,
that's a pretty clear indication
that they're throttling you.
And so this question, I have a lot of empathy
for the idea of the town square, right?
They are the primary place where people communicate.
And in the United States,
like I actually really dislike it
when like more societal elites say like, does Facebook matter?
Because like there is a huge fraction of the American populace that still uses Facebook as their primary way of communicating with each other.
And so I have empathy on the question of like, what should you be allowed to take down or not?
But for context today, we don't get to see what they take down.
There's no transparency on how any of these censorship systems work.
We don't get to see what posts are being left up
or taken down.
That's so interesting.
So on the one hand, we have evidence or a sense
that you're being throttled on Facebook
in terms of the word getting out about the book.
At least the buy button, we don't know beyond that.
Okay, yeah.
No, you can't get that buy button up there.
And then on the other hand,
and I have to believe this is a direct response
to your work and the fact that the book has just come out.
I think it was, was it yesterday or maybe even today,
Facebook announced all these parental controls.
Oh, really?
Did you see this?
No, I haven't seen it.
Meta just announced they're adding new safeguards
and monitoring tools for teens across its social platforms.
Parental controls on Messenger,
suggestions for teens to step away from Facebook
after 20 minutes and nudges urging young night owl
Instagrammers to stop scrolling.
So this is good.
This is positive change.
The timing is interesting.
These ideas are not new. These are ideas that
you've spoken about and suggested. It's just interesting that they're happening at this moment.
The thing that interested me so much was after I came out, they doubled the amount they spent on
safety. Like they came out and they said, we've heard the criticism. We're going to spend twice
as much on safety. And then in the last six months, you know, after Elon Musk showed that you could lose 75% of your employees and there were no consequences, you know, Mark himself came out and said, you know, Elon really set an example.
Like, we can have a year of efficiency and fire 20,000 people.
And I find it really funny that the thing that is making them start thinking,
or at least publicly making gestures
towards doing more safety is I started being noisier again.
Yeah, and Elon's cuts involved the safety teams, right?
The safety teams got cut.
They did the same thing at Facebook.
They did.
Some of my favorite researchers
are no longer at the company,
and it wasn't their choice.
So when they talk about these large investments
that they're making in safety,
then how does that match up against the layoffs
or what does that actually mean?
Well, if you'll notice,
like a lot of those changes are relatively superficial.
There's no way that took more than 10
or 20 engineers to build. Right. And the second thing
is we don't actually know the efficacy of any of those interventions. Like imagine a world where
Facebook said, hey, we know that sleep deprivation is the single largest danger from our products
for kids. Right. So sleep deprivation, for those who have not dug into the literature:
it puts kids at big risk of academic deficits, which stay with them for the rest of their life.
It puts them at higher risk for mental health issues, not just depression and anxiety, but also things like bipolar or schizophrenia.
It increases rates of substance use: uppers because they're tired, downers because they're depressed.
And rates of death from accidents.
So not just automotive accidents, but like all-cause accident rates.
You know, Facebook would come out and say,
we acknowledge what a large role we play in kids' lives.
We're gonna start publishing
a very simple set of data each week.
It's gonna take us like 15 minutes to set up, once.
And after that, it'll just update over time.
And we're gonna say how many kids are online
at 10, 11, midnight, 1, 2, 3 a.m.? And we're gonna do it every week.
If they were doing that,
if they were really serious about these problems,
you know, it doesn't violate anyone's privacy,
but we'd be able to see,
did those numbers move when they launched those things?
And as long as they refuse to release even basic data,
like we should consider all of these moves marketing.
Otherwise, we're just taking them at their word.
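To show how lightweight the weekly report she's proposing would be, here is a sketch that counts distinct minor accounts active in each late-night hour. The event format and function are hypothetical stand-ins for whatever activity logs a platform actually keeps:

```python
from collections import defaultdict

# Sketch of the weekly transparency stat Frances proposes: how many kids
# are online at 10, 11, midnight, 1, 2, and 3 a.m. The (user_id,
# is_minor, hour) event tuples are a hypothetical log format.

LATE_NIGHT_HOURS = [22, 23, 0, 1, 2, 3]

def weekly_late_night_counts(events):
    """Count distinct minor accounts active in each late-night hour."""
    active = defaultdict(set)
    for user_id, is_minor, hour in events:
        if is_minor and hour in LATE_NIGHT_HOURS:
            active[hour].add(user_id)
    return {hour: len(active[hour]) for hour in LATE_NIGHT_HOURS}

# Example week of activity events: no individual is exposed, but
# week-over-week movement in these counts would be visible.
events = [("u1", True, 23), ("u2", True, 23), ("u2", True, 2), ("u3", False, 1)]
print(weekly_late_night_counts(events))  # {22: 0, 23: 2, 0: 0, 1: 0, 2: 1, 3: 0}
```

As she says, aggregates like this violate no one's privacy, but they would show whether a launched safety feature actually moved the numbers.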
Well, it's also things like
Antigone Davis came and talked to the Senate,
I think the week before I came out.
And she bragged about how Facebook took seriously
the issues around body image
and they had a little thing that would pop up
if they thought someone was at risk for an eating disorder.
And like I intentionally, you know,
took pictures of the dashboard
that showed how often that triggered.
And so globally it was triggering
a couple hundred times a day, right?
So it's one of these things
where if you're gonna get up there
and brag about how seriously you take this problem
and then do an intervention that touches
maybe 1% of kids affected,
like we shouldn't give you too much credit.
What do you make of Elon's Twitter?
There are certain things that I'm super excited about.
So things like he published the algorithm, right?
And he didn't just publish the algorithm,
he published the history of the algorithm.
And for anyone who writes code,
your code in the past was always worse
than the code in the present, right?
One of the sad parts though,
is he's done things like cut off data access.
So Twitter used to be significantly more transparent
than Facebook.
He turned off the API.
Or he started charging cost prohibitive amounts for the API.
Sure. And then Reddit followed suit with that. And there is an argument there, like,
hey, we're sitting on all this data. Why should we make it open to all of these AI tools to crawl
it? This is our value as a company. They should at least be charged for it. There is an economic
argument for that. But I think they could be making exceptions for critical
public safety roles. So there are a handful of academics that were actively monitoring for things
like influence operations. So one of the ways in which social media is now used in warfare
is you have networks of accounts that go and spread misinformation.
And I worry a lot more about organized spreading
of misinformation than, say, your uncle believing odd things.
Right?
There've always been people who believed odd things.
They go to dinner parties, they say those odd things,
we talk to them about it.
When you have organized efforts
with thousands of people spreading a lie,
it changes the balance of power
and they have a disproportionate impact
compared to an individual saying something.
It used to be academics could monitor
for those coordinated efforts.
And the Twitter data was so important
that influence operations on Facebook would get detected with the Twitter data, that they would see accounts with the same names operating on Facebook, or Twitter would send the IP addresses over.
And Elon could be coming out and saying, hey, if you are doing one of these vital public safety jobs for free, basically, you can apply and get the same access you got before.
Right.
Right. Right. So as a result, people like Renee DiResta, Tristan Harris,
who have organizations dedicated to kind of, you know,
reviewing all of this stuff and trying to make sense of it.
I don't think Tristan's does.
But Renee does, right?
So now she doesn't have access to the information
that she had prior.
I'm a fellow at McGill this year,
and like they had a huge,
like voter disenfranchisement monitoring effort,
a bunch of other public health related monitoring things.
And they had to stop basically all their research
because it's like $40,000 a month now
to get access to that data.
Okay, so opened the algorithm, closed the API.
What else has he done? And how are you making sense of it?
I think his efforts around subscription models are super interesting. So for context for people,
if Facebook had had a subscription model back in 2010, we probably would have never diverged from
that version of Facebook. You know,
the version of Facebook in 2010 was about our family. It was about our friends. It was much
more human-scaled. But because it's an advertising-supported
business model, they had an economic incentive to get us to view more and more and more content.
And I'm very interested in how his subscription experiment will play out,
because it is having side effects: the level of conversation is definitely not as rich
as it used to be. And the fact that he biases towards subscribers distorts that again.
But the flip side is I like where the incentives are aligned.
But the secondary thing is it also creates a larger barrier to entry for accounts. Like you
now have to pay $8 a month for each account. Right. So on the one hand, political dissidents
in non-democratic regions who need a level of anonymity and access are gonna have a more difficult time
getting the word out in the way that they need to.
On the other hand, I like the subscription model
and it makes one wonder what would have happened
if at the onset or the inception of the internet,
that became the primary model
as opposed to an advertising model,
the internet would be completely different.
And we would have a better incentive structure
that would not have created a lot of the problems
that we're contending with today.
And I don't know how you unravel such a massive knot.
I mean, advertising is how this whole operation functions,
but to the extent that,
and it's hard to reverse engineer it.
Once that's out of the bag, how do you go to subscription?
You capture a small percentage of the audience that's interested in that.
But I guess with a large time horizon here, maybe that can tip in the right direction.
And to give your listeners context, like right now in the United States,
or the United States and Canada, the average user on Facebook generates $55 per quarter for Facebook, so about $220 a year.
So I always ask people like,
would you be willing to spend, you know,
$200 to $250 a year on Facebook?
And it's interesting, most people claim they would not,
but Facebook has run experiments
where they bribe people not to use Facebook.
And people actually value Facebook
a lot more than like $20 a month.
It's fascinating to see that kind of cognitive dissonance.
And when you say Facebook,
are you including Instagram, WhatsApp in that?
But what's interesting is that's $55 per user
in the United States and Canada.
It's only $14.50 in Europe.
It's $4.50 in Asia and it's $2.50 in the rest of the world.
And so it's one of these interesting things around,
you know, I think it would be super interesting to see how things play out as we see
the demographic changes in the United States, because if more and more young people leave
Facebook, you know, as those things change, the place that Facebook makes its money from is the
United States. And so they'll be left with these users
that are locked in
because it is the internet in their countries,
but Facebook doesn't generate
very much revenue from them.
Netflix.
Totally.
Happy to pay for Netflix.
I actually like that the algorithm
is paying attention to what I like.
So it serves me up interesting things
that I might be interested in and it's effective
and it doesn't feel predatory or pernicious
because it's not based on an ad model
that's trying to get me to,
I mean, obviously engagement is important to them.
They want you on their platform as much as possible.
So that's not great, but-
They do, but they don't, right?
So like they-
Once you've subscribed, yeah.
But the more you watch,
then maybe the more you share it with other people
and that allows them to onboard more subscribers.
I don't know.
They want you to watch enough Netflix
that you want to subscribe again next month,
but no more than that.
Because every minute more you watch,
they have to pay for the computers.
They have to pay for the bandwidth.
You know, and, you know, think about it like-
But there was that thing about like, well, how do we get people to not sleep? Like the whole thing
about like, how do we maximize engagement? Yeah. Well, the nighttime is the big problem here. Like,
how do we get into that? You know, the line is something like, the thing that he's
competing against is sleep. Right. That's what I mean. Yeah. More concisely said. Yeah.
So one more thing to just keep in mind
from the business model perspective,
like where do the incentives lie?
There's a big push from investors
away from subscription-based models to advertising models
because the ceiling of value of your company
is much higher with an advertising-based model.
Because over time you can get higher and higher value from
advertisers: you can get better and better at targeting the ads
as you accumulate more and more data.
And that becomes the real value.
It's the future value of your company
because there's a finite amount more,
like Netflix always faces this,
like people don't want to pay for the subscription to go up.
You know, they're like $14 was okay, but $16, you know.
All right, well, let's go back to you as a whistleblower walking towards this, you know, Rubicon moment
where you decide you're gonna share what's happening.
So you're becoming progressively disillusioned.
Was there a certain inflection point where you decided,
I just can't, I have to go public
with what I'm discovering here?
So it's interesting.
You know, like the saying,
bankruptcy happens slowly over time and then very suddenly.
So I lived with my parents during COVID.
And one of the things that I'm super, super grateful about is because I lived with them, as I saw things unfolding over that period of time, you know, I could sit at dinner and be like, I saw this thing today.
It feels really weird.
You know, like I said this and they like, they gaslit me, you know, is it me or is it them?
And part of what I'm really grateful for in retrospect is most whistleblowers don't get that level of support.
They hold secrets alone and it eats away at them. And they go to work each day and if they try to raise their issues, people tell them things that make them question their own reality.
And I was able over that period of time to like become more and more confident that what I was
seeing was real. And that meant that when we reached December of 2020, so this is about a
month after the US 2020 election, when Facebook dissolved the team I was on, a
team called Civic Integrity, it meant that
I had already gone through my period of agonizing. And I saw that Facebook could not
heal itself. To give you a little bit of context: one of the classes I took at Harvard Business
School is called Change Management, which sounds like a cliche business school class. It's like, oh, you want to be a
consultant when you grow up. But change management as a field is a really interesting academic field
because think about how hard it is for an individual to change a behavior, like a habit.
Once you put a group of people together, because they all have a vested
interest in the status quo, it becomes even harder to shift,
say, a company. And there's a playbook of maybe four or five things you have to do that are not
optional if you want to change an organization. And one of them is you have to appoint a vanguard.
You have to say, this group of people is the future. They're going that way.
We have to join, like, you either join them or you get out of their way
because the leadership's going to guard them. And for four years, Facebook did that, right?
So they got in a huge amount of hot water after the 2016 election because there were things where
it's not just Russian misinformation, where they were asleep at the wheel. It's like
there were these misinformation entrepreneurs out of Macedonia that had far
bigger impact than the Russians because they were just kind of running wild, monetizing on Facebook.
You know, Facebook spent four years building out civic integrity to be that vanguard. And
right after the 2020 election, they dissolved the team.
And what is your sense of why they made that choice?
I think part of it was that, you know, they made good investment decisions over a long period of time.
Like it was the only part of Facebook.com that was growing. But as the team grew, I think they just accumulated more and more liabilities
for the company because it went from having
an amorphous problem to having a concrete problem.
Every time a researcher did a project,
now there was a record that Facebook knew
there were problems here.
And they weren't willing, like you have to remember,
Facebook can't change the circumstances
of their external constraints unilaterally, right? Like Facebook can't come out and say, hey, all software,
all social platforms need to be transparent in the following ways so that we can do the right thing.
Or at least they didn't think they could do that. And so as they accumulated more and more of this
documentation, I think it was just creating conflict for the company
because they weren't willing to actually go
and address those problems.
Right, so they're getting a sense
of just how broad and complex the problem is.
Yeah.
And there's a lot of documents piling up,
accounting for that.
And without solutions,
maybe better to just go,
la, la, la, la, la, I don't, I don't,
you know, it's better to not see.
Well, it's not a question of there weren't solutions.
Like over and over again, solutions were being proposed,
but each of those solutions, you know,
would hurt profitability by 0.1%.
You know, if you stack together
the 30 most important things, 50 most important things,
you know, Facebook might be 5% less profitable,
but they also have like 35% profit margins.
So it's this question of where does that trade-off of cost come from? Right. So if you go from 35 to 33,
is that really a problem if we're actually addressing these social issues?
Yeah. If you could save the lives of 100,000 people in some country where Facebook is the
internet, is that worth 0.1%, 0.2%
of profit? And part of what you reveal in disclosing the documentation that you collected
was that Zuckerberg himself refused to address these major problems despite a lot of well-documented
pleas from employees. One of the stories I tell in there
is to kind of benchmark how Mark views himself
within the company,
or at least how he portrays how he views himself,
which is in the spring of 2019,
the president of India began making a lot of comments
about Muslims that are considered red flags
for ethnic violence.
He was comparing them to rodents, other kinds of vermin,
which is like a classic kind of step
on the dehumanization kind of ladder.
And so people started having conversations about:
if he started calling for violence against Muslims,
you know, what should Facebook's policy be around that kind of violence incitement?
And a task force of, you know, 30 people with, you know, deep expertise in these issues from
all over the company, you know, from every single slice of the company that would be affected, you know, from communications to the ads team to the social cohesion team, which is like the genocide team,
you know, all over, got together and worked for months to figure out what should
Facebook's approach be to this? Should it happen? And Mark came in, looked at the work of this group of 30 people
and said, I can write a better policy
and wrote one over the weekend.
And when they announced it on Monday or Tuesday,
the policy was,
we will not touch the speech from politicians.
And the problem was he had never consulted
the advertising team or the misinformation team.
And so he didn't know,
Facebook didn't know who
the politicians were in the world. Right. So it was a wrongheaded decision and ill-informed at
the time. And he didn't consult the teams that actually knew what was going on, which speaks
to a larger issue around control and authority within Facebook. Like Mark sits on what, like 54% of the voting stock
or something like that.
So Facebook buys back a huge amount of shares every quarter.
And it was like 53-ish when I came out.
And at least last time I checked, it was up to 55.
So like every quarter goes up a little bit.
Right, and so it's structured such that
there's all these different classes of stock, right?
But he's always sort of notching up his voting rights shares
so that he can exert basically
authoritarian control, unilateral control over the company,
which is problematic, right?
Yeah, the- I'm the CEO, bitch.
Yeah, that was his business card.
Yeah.
So he's the chairman of the board,
which means that he is also his own boss
because he's the CEO.
And Microsoft, you know,
one of the inflection moments
when Microsoft started being a little bit more
of a pro-social actor
was when Bill
Gates stopped being allowed to be both the chairman of the board and the CEO. And so these
questions around, you know, one of the things I learned in business school was this idea that
we think of corporations as these like behemoths. You know, they're eternal. They've never changed.
They have all this power. But the reality is like even what we conceptualize as a corporation has changed over time.
And there are checks and balances on those corporations, even if they don't necessarily take into consideration enough social costs.
In the case of Facebook, all those checks and balances are gone, right? Because he controls the majority of voting shares,
it doesn't matter that over and over again,
80% of all the non-management votes say things like,
we need to look at the costs of Facebook.
We need to do a security audit.
We should have only one class of votes
because you shouldn't be allowed to demand to be CEO forever. It doesn't matter, because he has that
outsized influence over the company.
And what is the composition of the board?
Does he have objective feedback? Is there anyone who's in his ear who's pushing back on what it
is that he wants to do. Not really.
Like, you really aren't allowed anywhere near Mark if you're critical.
So this is yet another in a string of events and incidents that tiptoe you towards the edge of the cliff here. So when they announced they were going to dissolve civic integrity, that was the moment where I was like, oh, they've given up on the playbook.
Very clearly someone had come in and guided them and said, hey, if you want to change the course
of Facebook, this is how you do it. When they told us, your team is so valuable, we're going to
integrate it into other parts of the company. That was the moment for me where I was like,
Facebook can only save itself if it gets help.
Like it's not gonna be able to do this alone.
And so that was the moment where I knew
that I was gonna have to figure out how to do something.
And the dissolution, I have to believe on some level,
because it came in the aftermath of the election,
it feels like, well, the heat's kind of off
because we're now on the other side of the election.
So the pressure wasn't as intense
as it was leading up to the election.
I think what's so interesting about it is,
so the fear at Facebook was genuinely
that there was gonna be violence in the United States
in the realm to the election.
And I think Facebook kind of felt like,
oh good, we dodged the bullet.
Like that the situation was resolved.
There's no future danger.
But that meant that when the buildup to January 6th happened, there was now no longer like a single person whose
responsibility was to raise their hand and say, hey, you know, maybe we should turn back on the
safety systems that were on for the election. So what's your sense of Facebook's culpability
in the January 6th events?
So when you look at the movement
that culminated in the riots at the Capitol,
it's very interesting how those social movements grew
because it wasn't like a broad organic swell.
It was a very, very small number of people
inviting hundreds or thousands of people to these groups that they didn't know.
There were people who were very actively understanding what the weak points were in Facebook and then going out there and spam friending hundreds and hundreds and hundreds of people or making lots of twin accounts.
They could do it over and over again. And I think there's real questions around that.
I'll give you an example of a safety feature that might have changed how some people made decisions.
When you sit in interviews with the people
who got arrested at the Capitol,
they say things like, I thought it was real, right?
When I looked at my Facebook feed,
all I saw was that there was a coup
and that I,
a patriot, needed to come save the country. Part of why that was happening was if you have a post
and every time there's a comment on it, it goes to the top of the feed, very rapidly,
you can have a feed that gets completely swarmed with this kind of mass panic.
And one of the things that was on before the election in 2020 was if you had a Facebook group that had too many calls for violence, they would do things like say,
hey, you need to appoint your own moderators, which sounds like, well, what's that going to do?
Like, aren't they just going to approve all their own posts? It turns out, if you get people to care about a group enough that they are willing to volunteer to do tedious work like that, you automatically kind of rate-limit that group.
Or they said things like, if there's too many calls for violence, we'll turn off commenting.
Right?
Now that means those posts don't go up to the top of your feed a hundred times.
And so there's things where the heat
could have been turned down on Facebook
for the six weeks, eight weeks before January 6th.
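Two mechanisms are being described here: comment-driven re-bumping of posts, and the safety lever of disabling comments on posts with too many calls for violence. Here is a toy sketch of both; the data structures, threshold, and function names are hypothetical illustrations, not Facebook's code:

```python
# Toy model of "every comment bumps the post back to the top of the feed,"
# plus the safety lever of disabling comments. All names are hypothetical.

def feed_order(posts):
    """Rank posts by most recent activity, so each new comment re-bumps a post."""
    return sorted(posts, key=lambda p: p["last_activity"], reverse=True)

def add_comment(post, now):
    """Record a comment; the activity bump is what can swarm a feed in a panic."""
    if post.get("comments_disabled"):
        return  # lever engaged: no comment, so no re-bump to the top
    post["comment_count"] = post.get("comment_count", 0) + 1
    post["last_activity"] = now

def apply_safety_lever(post, violence_flags, threshold=5):
    """Hypothetical: disable commenting once a post draws too many violence flags."""
    if violence_flags >= threshold:
        post["comments_disabled"] = True
```

Note that engaging the lever removes no speech; it just stops a post from being mechanically re-promoted to the top of the feed a hundred times.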
So you make this decision
that you're gonna start collecting documents
and you kind of tap into this history,
this background you have in debate
and how you formulate arguments
and that informs how you start to collect information.
Right?
And so we all have this vision of, you know,
Edward Snowden and thumb drives
and how do you sneak the stuff out?
But you were basically taking photographs
of documents on a laptop and then organizing them
on an air-gapped laptop.
Is that how it went down?
So, I think one way to think about it is like,
when Snowden did what he did,
because he was an engineer who worked on an archiving system,
he had a lot more awareness
of what the security protocols were.
And in my case, I didn't know how vigilant they were being.
And so like, I had to be careful in terms of saying,
you know, what's the worst case scenario?
Like if they were really surveilling me,
you know, what would look like I was doing nothing?
And taking pictures of your screen,
like one of the things that is interesting about it
from like a threat modeling perspective
is there's no way for a company to know you're
doing that other than maybe like the cadence at which you scroll. And so I was being as cautious
as I could be because I knew that there would only be one chance to do this. That, you know,
whatever I got out would be the thing that was the record for history because they would lock it down afterwards.
But you came out with how many, like 20,000 documents?
22,000 pages.
22,000 pages.
It's like 1,300 documents.
So that's 22,000 camera snaps?
Yes, yeah.
So that must have taken quite a bit of time.
Or you get very efficient.
It's interesting, a huge fraction of everything that was captured
was captured in the last month I was at Facebook.
So it seems like, how is that possible?
But it's one of these things where I did a lot of like
very, very long days for a number of weeks.
You make this decision.
Well, you get in touch with this writer
for the Wall Street Journal and that kind of-
Or he had reached out to me.
He reached out to you and that initiates
kind of the momentum behind this. And ultimately,
you decide to trust him and he becomes your sort of co-collaborator in all of this.
And you make the decision that the government entity with whom you're going to share all of
this was the SEC, which I think is interesting. Like I think most people would think,
why didn't you go to the DOJ? So in the United States, there's only two organizations that have whistleblower protections.
So one is the SEC and one is the FTC. The DOJ doesn't have by default whistleblower protections.
And so usually the way you do is you go to the SEC, you get whistleblower protections,
and then they work with say the FBI or the DOJ. You also have the right to give information to Congress
that you believe is essential for Congress to fulfill
like their constitutional role in oversight.
And so that was the strategy for my lawyers.
And you're somebody who, despite doing all of this,
you weren't looking to be in the spotlight.
Like you're a reluctant whistleblower on some level.
That's part of why there's so many documents.
Yeah, and you really didn't,
you wanted to be anonymous.
Like the whole kind of public facing aspect of this,
I wouldn't say it's an afterthought,
but it wasn't your first choice.
You had to be cajoled into that role.
Yeah, I think the thing that I had not really thought about,
like I was enough in the zone of, you know, I talk
about part of why I did what I did was I felt like Facebook took my future from me. Like once I knew
the magnitude of the problem, you know, I had this fear that I would, you know, 20 years in the future
see a catastrophe in an African country or in Southeast Asia or five years from now and say,
like, I didn't do anything. Like I could have done something and I didn't do something.
And I feared that I would lie in bed and just feel guilty about it. And so I had not
really thought about what was going to happen next after I gave the information to the government,
right? Like, my horizon of crisis,
if you will, was not that far into the future.
And after I left Facebook and like,
we got closer to the actual publication date, you know,
it became this thing of having to confront the idea
that Facebook knew who did it, right?
They could look at the data, they could figure out.
And, you know, I
think anyone, if you go and look at your life, there's a version of you that you could spin
facts in certain ways and make you look very unsympathetic. And my lawyers were just very
clear. They were like, you know, even not coming forward, you can actually endanger people,
right? So like in the case of the Ukrainian whistleblower,
who was also represented by my whistleblower lawyer,
some of the people the media was suggesting it might be
weren't the person.
You're endangering other people as a result of that.
And also it's easier to cast doubt and aspersions on an anonymous face.
It's more difficult if you're out in front of it
and the choice to go on 60 minutes
and reveal your identity, I think was the right choice.
It had to be terrifying.
It was terrifying, yeah.
So in the background, the Wall Street Journal is compiling.
I'll give you context.
Like, total number of times in my life
I had had hairspray
like in my hair prior to going on 60 Minutes.
Maybe two, maybe.
You said you had like Texas hair.
Yeah, and I like, imagine you're sitting there
and you're someone who like,
like my mother's a scientist.
So like I only would get to see her with any makeup on,
like maybe once a year for like a Christmas dinner
kind of thing. And a priest.
And she became a priest in her fifties.
Yeah.
My grandmother became a lawyer in her fifties.
My mother became a priest in her fifties.
And so I've joked about becoming
a psychiatric nurse practitioner in my fifties.
Cause that would be like the arc.
There's a pivot up there at some point for you, I guess.
Help people deal with all the trauma
that we've accumulated from social media.
But like, imagine, you know, like I've never,
I've almost never worn makeup.
Like maybe I've worn makeup 10, 15 times in my life.
I've had hairspray on my hair maybe two times in my life.
And I'm sitting in this makeup chair
and they are like doing stuff to my hair.
And it's just like,
like I talk about this moment in the book
where I was almost grateful
that there were so many people on the set
because it made me feel like I had no option
but to continue forward.
Right, that I was like so overwhelmed in that moment.
I was like, well, I guess I don't wanna send
all these people home if I don't like,
if I chicken out like right now.
In reflecting back upon that moment
and seeing those Wall Street Journal articles
and watching the 60 minutes interview,
a part of me was like, yeah,
it's shocking that an insider came out
and said it, but what you were actually revealing
wasn't all that surprising to me,
but it seemed like it was to a lot of people.
And the point that got the most attention
was the impact of Instagram on teenage girls,
which was only one of kind of many revelations
and perhaps maybe not even in the top three
of the most cataclysmic things that you were sharing.
I think this question of, you know, when I say to you,
an algorithm could incite violence, right?
Or one of the key revelations was Facebook made a change
to the algorithm in 2018, where they went from just maximizing how long you were on the site to trying to maximize how much you would react.
Because your actions incentivize other people to create content.
But within six months of that, you had political parties on the right and the left across Europe saying, you know, we're being forced to run more extreme content now. Like stuff we know our constituents don't like because it's what gets
distributed by the algorithm, right? Like think about how abstract that kind of revelation is.
Or like saying, when you apply a system like that to a country in Africa where they don't have any
safety systems written in their language, what happens? Most people have trouble emotionally
connecting to something that is that foreign. But most people have at least one child in their life
who is either a tween or a teenager, and many of them have seen kids suffer.
And so I think part of why the revelations around teen mental health
were so resonant is people are really seeing the magnitude of the harm today.
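To make that abstract revelation concrete, here is a minimal sketch of the two ranking objectives she contrasts. All the weights and field names are illustrative assumptions, not Facebook's actual values; the point is just that once comments and reshares are weighted heavily, the inflammatory post outranks the calm one.

```python
# Illustrative sketch of engagement-weighted ranking; weights and fields
# are hypothetical, not Facebook's actual values.

posts = [
    {"text": "calm policy explainer", "exp_dwell_sec": 45,
     "p_like": 0.20, "p_comment": 0.02, "p_reshare": 0.01},
    {"text": "inflammatory hot take", "exp_dwell_sec": 20,
     "p_like": 0.10, "p_comment": 0.15, "p_reshare": 0.12},
]

def time_score(p):
    # Pre-2018-style objective: expected time spent on the post.
    return p["exp_dwell_sec"]

def reaction_score(p):
    # Post-2018-style objective: expected reactions, with comments and
    # reshares weighted heavily because they prompt others to create content.
    return 1 * p["p_like"] + 15 * p["p_comment"] + 30 * p["p_reshare"]

for score in (time_score, reaction_score):
    ranked = sorted(posts, key=score, reverse=True)
    print(score.__name__, "->", [p["text"] for p in ranked])
# Under time_score the explainer wins; under reaction_score the hot take does.
```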
And I think, you know, you were saying before, like people had brought this up before.
It was kind of like a tobacco moment. Prior to when the tobacco whistleblower came out, you know, scientists had known for a couple of decades that cigarettes cause cancer.
The thing they didn't know was that the tobacco companies knew that.
Knew that, had documentation to that effect, right?
Yeah.
Yeah, that's the big differentiator there.
What was your expectation going into that whole experience?
And how did that match up to what actually transpired
in the aftermath?
It went way better than I ever could have imagined.
Right, like I came into it and, I think because my lawyers had dealt with, you know, whistleblowers like the Ukrainian whistleblower... that person had death threats.
Even the lawyers involved had death threats.
They had people showing up at their homes
and screaming at them.
We didn't know how the internet, how the public would respond to the information I brought forward. Because there was like a real chance that, you know, the conspiracy theories that spread about me were relatively innocuous.
You know, it was things like, she's a crisis actor.
Like there's no way she could be that good.
Which is a big part of what I would imagine
was a motivation in front-loading your book with like,
actually like, here's all this stuff that I did.
Like, I am good.
Like I am smart.
Or like I had a lot of people help me, right?
Like my debate coach was good enough
that he became the head of the national debate.
You were like a top 25 in the country debater, math wizard.
You were like reading at like a college level
in kindergarten or something like that.
My parents are both professors
and I had a college level vocabulary.
Vocabulary, right, yeah.
So I was a little obnoxious.
Yeah, you were a very precocious, intelligent child.
So all of that kind of gets vetted out in the book,
but yeah, crisis actor, interesting.
But think about it, like that is way less bad
than like she's like part of a network of pedophiles
or something.
Like it could have gone real weird.
Did you get death threats?
I never got death threats.
Like I don't even get harassed.
Like that's part of what I've been totally blindsided by, because like most women who speak publicly
get harassed pretty seriously.
And I have like one or two people
who messaged me occasionally on Instagram
and tell me like that dress was really cute.
But compared to like what happens
to most women on the internet,
like that's like a walk in the park.
What do people not understand or appreciate
about the experience of being a whistleblower?
That is a great question.
So one of the things, you know, I mentioned before my gratitude that I lived with my parents. People often ask me things like, if I could give one piece of advice to
whistleblowers, like what would that be? And the thing I always say is you need to find at least
one person in your life that you really, really trust that you can be fully honest with because carrying a secret alone, it destroys people.
And I realized, you know, I got really lucky
that my whistleblowing experience was much less traumatic
than most people's whistleblowing experience.
And part of that was that I didn't go on the journey alone. Also, I was in Puerto Rico when I came out.
And so it meant, you know, it was very difficult for someone to like get in a van
and drive 12 hours and scream at me.
Right, I'm sure that was helpful.
But you had a lot of help, the Wall Street Journal,
you had this whistleblower law firm, you know,
you had a team of people who are counseling you,
you had people preparing you for 60 Minutes,
you had people preparing you to, you know,
appear before the Senate
and you're a debater, like this is your thing.
So it's interesting.
I'm a product manager.
So like product managers don't code.
Like we are not, like we can code.
Like I can code, like on hobby things, I code.
But when the product moves forward,
it's not because I did it directly.
It's because I helped marshal help in that direction.
Or like I helped other people
unleash their ability to make impact.
And I think part of it was that like
most whistleblowers don't have MBAs.
Like most whistleblowers are not product managers.
And I have never needed to be the star, right?
Like I don't believe that I uniquely
am the one who's gonna save things.
And so it makes it a lot easier to say, I need help.
And it's interesting.
I had this experience, speaking of the debate.
So if you were a professional dancer,
you take like a lot of dance classes
where you like show up and they teach you some moves
and like that's the dance class for the day. And the reason you do that is that when you audition for a job,
they're going to come out and they're going to teach you some moves and they're going to see
how fast you learn them. Cause that's going to be a measure of like, how easy is it going to be to
work with you? When we did the prep for the Senate, we spent two days. So imagine there's
like 15 people in a room or on like Zoom and you spend two days
them throwing questions at you
and you try to answer them
and they say, that's a horrible answer.
Like do it this way because it's cleaner.
Like that's like the content's good,
but like it's not, you know,
you're not communicating cleanly enough.
And I talked to one of the people who was in that room,
I don't know, a year later.
And he's like, you know, Francis,
when we left at the end of the second day,
we were all convinced this was gonna be a disaster.
But you like incorporated the feedback that you got.
And so when you actually showed up at the Senate,
you know, you'd done enough dance classes
through doing debate and having your coaches yell at you
and say like, that's a bad answer,
that you knew how to like take the feedback.
Yeah, but you'd done enough reps growing up
that you had a comfort level with that type of dynamic.
When you were before the Senate
and also sitting with Scott Pelley,
in either of those cases,
was a question thrown at you that caught you off guard
or were you fully prepped and like had an answer for everything that was asked of you?
I'm sure there was a bunch of stuff that I didn't expect.
There were like, I remember there were moments at the Senate.
There was a moment at the Senate where one of the senators
asked me about like quantitative measures on Instagram.
Like, you know, did I believe we needed to take quantitative?
So this is like how many likes a post has. He's like, do you believe we need to take quantitative
measures off of Instagram? And I was like, well, you know, it sounds obvious, but we actually have
documents in the disclosures saying Facebook tried this. As long as you leave comments on there,
kids can still tell who's more popular than whom. Like it doesn't really do anything.
And the Senator clearly had not been given
a large enough bench of questions
cause he just asked me-
He didn't have a follow up to that?
He just asked me a second time.
So I would say that's probably the moment
that I was like most surprised.
It's still a better question than,
how does Facebook make money?
The moments that will live in infamy.
Right.
Where the internet is a series of tubes.
Yeah, right.
That's a good one.
The difference between when Mark Zuckerberg,
you know, appeared before, which committee was it?
The Senate.
The Senate, yeah, yeah.
And then when you appeared,
the sophistication of the questions
and the kind of knowledge base was pretty evident.
Like they're playing catch up obviously
in the regulatory landscape and the legislative landscape
is just miles behind what's actually happening
because the pace at which tech is iterating upon itself,
it's impossible for bureaucratic systems to keep pace
and even understand what's happening.
And I think that makes it all the more important and prescient when someone like yourself comes
along who not only understands the technology deeply, but also has this skill set in debate
and communication, and you're able to translate these things in a meaningful way that is impactful.
Can I challenge you on something you just said though?
You said like it's impossible
for them to keep up at the same pace.
And one of the things that like,
like I'm running a nonprofit right now,
and one of the notions we're actively challenging
is the idea of, you know, if we stopped accepting the idea
that they can't keep up with pace, how would we fix that?
And I think the question is like,
we definitely are seeing an
acceleration in technology. And if we continue trying to execute regulatory systems the same way we did before, it's not going to be fast enough, right? Technology runs away too fast.
But you could say things like, we've unlocked organizational technologies over the last 20 years through things like wikis, other ways of bringing a large number of people together and synthesizing their attention.
What if we could just recruit exponentially more people
to pay attention and synthesize that knowledge
so that we could help the regulators keep pace?
Right, and that's, so your nonprofit, Beyond the Screen,
part of the premise there is onboarding
as many people as possible,
particularly young people with education
and like kind of running, you know,
walled off social media experiments
so that they're fluent in what's happening.
So it's a ground up sort of thing,
a populist movement of sorts.
But how does that translate into regulatory
and legislative change? Because
you are pushing up against these gigantic systems that move glacially, if at all.
Well, we have three main projects right now, and two of them are really aimed at what I would call
the ecosystem of accountability. So it's like regulators, it's litigators, it's investors,
it's concerned citizens.
And the second one is this question of how do you have like millions more people feel fluent in the choices that are being made behind the scenes of these systems.
And that's like for mass democracy.
So we have one that's targeting, one might say, the elites, and one saying, hey, how do we do mass-scale culture change. In the case of the one that's targeting more the expert class,
you know, we're working on building out a wiki right now where we say, hey,
let's actually document the harms, kind of like how Jonathan Haidt has done it for kids' mental health. So, one of the really transformative things that happened in the last, you know,
five years was this psychologist at NYU said, hey, you know, we keep throwing around the same like one-liners about
social media. Like there's this famous line from a paper that said, social media has the same
impact on kids' mental health as eating potatoes. So like every time he would try to bring up like,
you know, hey, something's happening with the kids. People would be like, social media has the
same impact as eating potatoes. And he was like, I have to unpack this paper. And the reality
is it's very poorly designed. The statistical analysis used is ridiculous. But he said, hey,
let's start organizing collectively. Let's come together and do a collaborative literature review
of everything we know about kids' mental health and then say, where are the weak spots in this
literature? So imagine you took up an approach like that that has been so helpful in crystallizing and
catalyzing the conversation on kids' mental health. What if we expanded it across the problem set for
social media? Because now it's not just like a few regulators sitting and trying to understand
things. It's we can bring thousands of people to the fold and say, hey, let's synthesize together.
And then one of the things that we think
can be potentially extremely helpful
is we're trying to make the conversation
on how to move forward a little more productive
by separating the concept of we have levers
that might prevent or mitigate harm.
And then we have many, many, many strategies
for pulling each lever.
And the example I usually use is on kids' issues.
So it doesn't matter what the problems with kids,
a common lever across those problems
is let's keep under 13-year-olds off these products.
Like allegedly that's what it's supposed to be already.
Let's actually do it.
If you talk to the people who understand kids' social issues,
they usually are not technologists.
And so they'll say things like,
we should check IDs.
You know, every social media account
should have a passport
or a driver's license associated with it.
And that might work in China, but it would not work here.
And we could have a conversation on that,
but just trust me.
And there's like all kinds of implications
for things like civil liberties.
But if you had instead said,
hey, I have a lever.
I need to keep kids under 13 off.
And so you said to a technologist,
how could I pull that lever?
They'd be like, here's 10 or 15 different ways
you can find an under 13 year old.
You know, it's everything from kids say,
I'm a fourth grader at Jones Elementary
to kids report each other to punish each other,
like on the playground.
It's like, you piss me off
and I report your Instagram account.
Oh, wow.
I know.
That opens up, that's a sticky wicket.
Isn't that interesting? But the thing is, once you get like a thousand or 10,000 examples, you can find all the other kids. You don't have to check IDs. And so our hope is to change the fact that right now, there is no regulator in the world that can sit and look at a menu of what is possible, and we can't have a community discussion saying, what should the floor be?
Like we're not saying do it all.
Like we're saying there's something down here
where we're saying, this is what's being done today.
This is what should be the floor
and this is what's possible.
Right.
And so we need to think about organizationally,
how do we move faster and faster?
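To picture the "menu" idea concretely, here is a tiny sketch of the lever-versus-strategies separation, using the under-13 lever from this conversation. The structure and strategy names are illustrative assumptions, not a real regulator's catalog.

```python
# Hypothetical sketch of the "levers vs. strategies" menu idea.
# Lever: an outcome we want. Strategies: interchangeable ways to pull it.

levers = {
    "keep under-13s off the platform": [
        "self-reported age at signup",
        "text signals ('I'm a fourth grader at Jones Elementary')",
        "peer reports (kids reporting each other)",
        "propagate from known examples to similar accounts",
        "ID checks (works in some places, heavy civil-liberties cost)",
    ],
}

def print_menu(levers):
    """Render the menu a regulator could use to ask: what's done today,
    what should be the floor, and what's possible?"""
    for lever, strategies in levers.items():
        print(f"Lever: {lever}")
        for s in strategies:
            print(f"  - {s}")

print_menu(levers)
```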
It reminds me of something Senator Cory Booker said
when he was on the show,
which was that change doesn't come from Washington,
it comes to Washington,
which to your point is all about establishing a groundswell
of awareness and education that puts pressure
on legislators and regulators.
I talk in the book about the idea that we like to think laws
are where change comes from.
Like, laws are so clean.
It's like, oh, the law got signed, change happened.
But in reality, like laws don't happen until we come up with norms.
We say, hey, that's actually wrong.
Like we believe that's wrong.
And we come up with standards for what it means to transgress that norm.
And that process is slower, right?
And I think there are ways that we can catalyze
that conversation and it should be a conversation.
You know, it should be a public argument about
what do we want our social media to do?
Or what does social media owe us?
Because so much is possible.
Yeah, well, there's so many threads to pull on that.
I mean, I think that on the one hand,
there's a call to action
or a need that you identify in the book
and have spoken about many times
to interject the humanities and philosophy
into the tech space.
Because when you think of these tech behemoths
and your boots on the ground and have been at,
employed by many of these places,
you're looking at very smart, intelligent math wizards
like yourself and data scientists and coders, et cetera,
who are suddenly making decisions at extraordinary scale
that are impacting society,
but are not fluent in certain kind of ideas
around like ethics, for example, right?
And so young people making big decisions,
doing the best that they can, well-intentioned,
but perhaps, you know, divisions and teams
within these entities,
not staffed with the appropriate people
to make those decisions.
Unquestionably.
Let's say you were a 19-year-old
and you knew you wanted to work in a big tech firm.
And I'd say of people who get CS degrees, a pretty large fraction of them have aspirations of something like that.
You already have a much larger set of requirements usually than most other majors.
So you have less flexibility off the bat to take things that are outside of just your CS education.
But there's also the reality that every class you take that is not a CS class makes you less likely to be able to pass the interviews to get in at Google.
Because someone else will have doubled down.
And so we have this very skewed pipeline. Part of why I wrote the book is I want people to understand, when I talk about the incentives not being aligned with the common good, it's not just at the point of Facebook writing their quarterly earnings statement.
It's like every step along the way
because we're getting more and more godlike powers,
but the people who have the skills to wield them
or choose what problems to solve
have very, very little context
on what the implications of that technology could be.
You said recently, you told the Sunday Times that tens of millions of people will die
if social media isn't overhauled.
So what do you, that's a, I mean,
that's obviously a headline grabbing quote.
What do you mean by that?
So when I joined Facebook, and I wanna fully admit I'm in the same pool as lots and lots of other technologists that went down this road, I had not really thought about the implications of Facebook being the internet for some of the world. Facebook had surveyed the scene in the late 2000s, early teens and said, hey, we came to prominence because we killed MySpace. You know, we came to prominence because we killed Friendster. You know, for those of your listeners that are too young to remember the glory days of the early 2000s, these were the social networks that people were on in 2003.
You know, we came to prominence because we surprised those legacy networks. And we don't know what corner of the world the thing that will surprise us will come from.
So we're gonna go in other countries and say,
if you use Facebook, we'll pay for your data.
If you use anything else on the open web, you're gonna pay for yourself. So de facto, the translation of that is that for gigantic swaths of the developing world, Facebook is the internet.
It is the primary interface, if not the only interface by which people are interacting with
each other online, which creates a tremendous amount of power. And with that power comes
responsibility.
It should.
And a big part of your work has been kind of pulling back the covers on the lack of responsibility, or the gaps in what should have, could have been done.
Unquestionably. And one of the things that I want to be clear is this book isn't actually
just about Facebook. It is about opaque systems that are in similar positions because we're actually seeing now TikTok.
And notice TikTok came from the only corner of the world
where Facebook couldn't play, right?
Like it came from China.
I thought when I came out that we had five years
before there would be violence caused by TikTok
because video takes more bandwidth,
you need a nicer phone.
And within a year after I
came out, there was violence in Kenya around the elections that was fanned by TikTok. And if you
look at the drivers, they're actually related. So I talked to someone at TikTok and they said,
when the violence was taking place, there were no moderators at TikTok that spoke Swahili.
And that is the recurring theme, right? So for Facebook, they went into these vulnerable
places in the world. They prevented an organic internet from forming. So an internet that was
based in companies that were local, that spoke the local language, that understood local culture,
that had a vested interest in the success of those countries. They came in and displaced that,
made sure it didn't get to grow up. But then they had these systems where because the algorithms give more distribution to more extreme content, if you start a fight in a post, you're going to get much, much more distribution than someone who does a calm response.
Facebook has set up a system where unless you are actively kind of turning the knobs, or this is the same for TikTok, TikTok's even worse because they push things towards being hyperviral.
Unless you have people in there going in there and kind of blunting the sharpest edges, you can have systems that fan ethnic violence.
And we've now seen that for Facebook twice, Myanmar and Ethiopia, and smaller incidents in places like Sri Lanka.
When we talk about the tools that are available currently
for purposes of staving off misinformation,
disinformation for combating violence, et cetera,
that is incited online. What are those tools? We have
content moderators and we have the ability to toggle the algorithm. Is that still the case?
Like, what does that landscape look like? I'd say we actually have a third one also,
which is you can make intentional choices about how your products are designed.
So, like, if we were to roll back in time to 2010, right? So in 2010, Facebook had a
feed. It looked quite similar to what it does today, but that feed was not algorithmic. It
was chronological. And because of that, you couldn't have mega groups. Like you couldn't
have a group that had 5 million people in it. Because to have a group with 5 million people,
that group might make 5,000 pieces of content a day.
You have to have an algorithm that will cherry pick out 10 things to show you, right? You know,
we can make choices around, do we want social media that's about our friends and family?
Do we want a social media that prioritizes choices we make? So I'll give you an example.
This doesn't even touch the algorithm, or it does a little bit, but it's more like a product design
choice. You know, Facebook has done experiments where they say, hey, let's divide all the content on Facebook into two pools.
One is content you consented to.
So this is the friends and family you picked, the pages you followed, the groups you actually joined.
And one pool is content you did not consent to.
So this is someone invited you to a group.
Facebook started putting content from that group
into your feed for 30 days.
You wrote, that's misinformation, in a comment on a post.
And they said, ooh, you like it.
And they made you a member of that group.
Or your friend commented on something
and now the post shows up in your feed.
If you just say, hey, we're gonna give priority to content that you selected, that you consented to, over content you didn't consent to, you get less misinformation, less violence, and less hate speech. Like your friends and family aren't the problem.
Right. But you get less engagement and you get lower
revenue because you're not able to serve up as many ads. And so the incentive structure is upside
down. And a big part of your work in the book is talking about the importance of changing those incentives
in partnership or in lockstep
with the urgency around transparency.
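Here is a minimal sketch of the experiment she describes, under assumed field names: split candidate posts into a consented pool and an unconsented pool, then serve the consented pool first. Nothing here is Facebook's actual code; it just shows how small a product-design change this is relative to its effect.

```python
# Sketch of consent-based feed assembly; field names are assumptions.
from datetime import datetime

candidates = [
    {"source": "sister",        "consented": True,  "ts": datetime(2023, 9, 1, 9)},
    {"source": "invited group", "consented": False, "ts": datetime(2023, 9, 1, 12)},
    {"source": "joined group",  "consented": True,  "ts": datetime(2023, 9, 1, 11)},
    {"source": "friend-of-friend comment", "consented": False, "ts": datetime(2023, 9, 1, 13)},
]

def build_feed(candidates):
    # Pool 1: sources the user explicitly chose (friends, followed pages,
    # groups actually joined), newest first.
    consented = sorted((c for c in candidates if c["consented"]),
                       key=lambda c: c["ts"], reverse=True)
    # Pool 2: everything the user never opted into.
    unconsented = sorted((c for c in candidates if not c["consented"]),
                         key=lambda c: c["ts"], reverse=True)
    return consented + unconsented

for item in build_feed(candidates):
    print(item["source"])
```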
So right now we have laws that say,
if you're a private company, you're publicly traded,
you have to report your profit and your loss.
How much money did you spend, your expenses to get that profit? You don't have to report the
consequences of your products, right? One of the major ways in which our economy is changing,
and we have to talk about this because it's only getting more and more dramatic,
is it used to be 50 years ago, 100 years ago, most of our economy was run by systems
where you could buy the product
and you could take it apart.
You know, you could test it
for does it have lead in it or heavy metals.
You could put sensors outside the factory
and see what kind of pollution is being imposed on society.
We are moving now onto opaque systems
where the important choices,
the important consequences happen behind a curtain.
You know, they live on data centers.
Facebook has to report the profit and loss,
but no one actually gets to see how Facebook is functioning.
And I'll give you kind of an example to give a sense of why the world has changed so much.
And I talk about this some in the book.
You know, I know this is going to feel like a reach.
You and I could sit down and spend
three weeks and I could teach you enough coding that you could ask meaningful questions about
Google. You know, like you could ask questions about what does Google distribute? What seems to
be like what their preferences are, that kind of thing. Because we all see basically the same
version of Google. If you want to do that same level of like accountability 101, we would have to recruit
20,000 people and convince them to install software on their computers or their phones
to send us what they saw. You know, it's a much, much harder problem. And so the kinds of laws
that I think really shift the playing field... You know, you were saying change comes not from
Washington, it comes to Washington. For things like cars, and I talk about this in the book,
by the time we met our watershed moment in the mid-60s, so the fatality rates on cars had been
going down for years. In the 60s, they started going up again. You know, there were 100,000 automotive engineers who could
confirm what Unsafe at Any Speed was saying. Or there was an insurance industry that didn't want
to pay out claims. And so they would spend money on research so that they could say, no, it was
actually the automotive company's fault. Or there were lawyers who had been looking for victims and
could give you representative data. You know, if we wanted to enable an ecosystem like that,
that could go and begin challenging these opaque systems,
we would need to pass laws that say,
hey, we need a baseline level of transparency.
Like you can't hide information
on like how many kids are binge watching YouTube at 2 a.m.
You can't hide information
on how many kids
are getting overexposed to self-harm content.
Because with even just a little bit of transparency, and it can be privacy-protected data, aggregate data,
Suddenly moms can organize.
Suddenly parents can make different decisions
about what products they let their kids use.
Advertisers can say,
I'm not willing to invest in your product.
But we can't do any of those things without baseline data.
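As a sketch of what privacy-protected baseline data could look like (a generic aggregation pattern, not the mechanism of any actual law): report only aggregate counts, suppress small cells, and add a little noise so no individual kid is identifiable. The metric and numbers are invented for illustration.

```python
# Sketch of a privacy-protected aggregate disclosure; the metric,
# threshold, and noise are all illustrative assumptions.
import random

# Hypothetical per-user minutes of post-midnight watch time for teen accounts.
late_night_minutes = [0, 5, 190, 240, 12, 310, 0, 45, 205, 260]

def aggregate_report(minutes, threshold=180, min_cell=5, noise_scale=2.0):
    count = sum(1 for m in minutes if m >= threshold)
    if count < min_cell:
        return "suppressed (cell too small to publish safely)"
    # Small random noise as a crude stand-in for a formal privacy mechanism.
    noisy = count + random.uniform(-noise_scale, noise_scale)
    return f"~{round(noisy)} accounts over {threshold} min after midnight"

print(aggregate_report(late_night_minutes))
```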
In addition to being called a crisis actor, the other criticisms that were levied at you were, there's a bunch of them, but one was, you weren't high enough up. You were never in the room. You're not in the C-suite. You don't really know what was happening at the highest level.
And, you know, Mark, right after everything was made public
kind of came out and said,
she's mischaracterizing everything.
She doesn't really understand what's happening.
I take this personally.
It was sort of an emotional appeal.
Yeah.
So part of, I totally anticipated
that that would be said about me.
And so one of the things I did was
I didn't just get the documents,
I got all the comments on all the documents so that you could see Facebook employees
talking about each of these documents in real time.
So you could see that people didn't believe
that these were crazy, that they thought these were true.
And one of the things I really hope is gonna happen
is Harvard is right now releasing the documents.
Oh, wow.
Like right now there's, you know, tens of researchers that have access, and hopefully in the future the public will be able to access most of them.
You know, the thing that is sad for me about how things rolled out, like we had to do it this
way because we needed to protect people's privacy. You know, if you could see the names on the comments or you could see the names on the documents, like who wrote what,
I think the disclosures are 10 to 20% more scary. Because right now when you read through them,
you occasionally see things and you're like, that can't be right. Like there's no way that
could be true. And if you could see the thread of like, this person said this and this and then this, and they've been at Facebook for 10 years, you would be like, oh, this is actually much more scary than I thought it was.
What would be an example of that?
I think just things around the algorithms, like the idea that there are biases in what gets distributed. And that these things don't have to operate this way,
but they are operating this way
to a really extreme extent.
And to have the comments in the documents also.
Or stuff about things like cartels operating
on the platforms in Mexico or the terrorism things,
or how Facebook's like censorship systems
take down counter-terrorism content.
Like there's just a bunch of stuff along those lines.
And the comments validate the legitimacy of the documents. To have those comments on there
makes one realize like, oh, this is a real document and there were multiple people engaged
in discussing the ramifications of it. Or I'll give you another example. There are documents from people criticizing that they had brought forward multiple safety interventions and that, say, the policy team had intervened to protect the speech of certain political actors.
Right?
Like, there were things brought forward that would have made things safer, would not have cherry-picked.
Like, they weren't picking ideas that were good or bad.
It was just changing how the system was tuned.
And that, you know, the policy team would intervene because it would have implications for certain
actors.
What about the argument that you cherry picked documents?
Great question.
One of the things that I did very intentionally as I documented things was I
actually documented the order in which I captured documents.
So you can walk beside me as I wander through the documents. And you can see that it's not like, you know, random things here and there. I was clearly searching and just doing arcs and pulling things as I found them.
And so I just got whatever I found
when I searched for, you know, teen depression
or that kind of thing.
But I do agree that like, I didn't get a complete view. The stuff in the documents is just stuff that I had access to.
And some of that access was privileged
in that I worked on threat intelligence.
So I had access to a broader set of things,
but you know, for a lot of the stuff on,
like say teen mental health,
like that was just, was freely available
to anyone in the company.
One of the things that comes across in the book, and when I listen to you speak, is that you're not attempting to vilify Facebook or Mark Zuckerberg. It's quite the opposite, actually. Like you seem to have a lot of empathy and compassion
for not just that organization, but for the people that work there and the position that they're in.
And I think that's interesting.
I think it would be easy to just point fingers
and create villains.
So what is your perspective on Mark?
Oh, you know, Mark is so interesting.
So think about it like, what were you like when you were 19 years old?
I was a knucklehead.
I was a drunken, out of control disaster.
I was a disaster when I was 19.
A lot of 19 year olds are disasters.
Mark has had to be the CEO since he was 19.
He has been surrounded by people. Remember, he's had control of all the votes since he was 19. So everyone around him,
he has to decide if they're around him because he's in control. And I became who I am today
because I made lots of mistakes and I got lots of feedback and I had to live with the consequences of my actions. And Mark has had very little of that feedback cycle.
And I really worry about, you know,
what life has he ended up in
because I, you know, he's going on podcasts
and saying things like,
when I wake up in the morning and I look at my phone, like I look at my email, it feels like I'm getting punched in the face.
You know, if that's not a cry for help, I'm not really sure what is.
And it's one of these things where, you know, he's the last of the original big tech CEOs that's still in charge of his own company.
And I think part of that is that he lives in an echo chamber.
You know, he lives on an estate in Hawaii half the year.
The other half of the year, he's at Facebook.
Like he can't go out anywhere in the Bay Area
without people grimacing at him.
You know, like it's not like, if I had crossed Twitter,
the Elon Musk fanboys would have come for me.
But like, there are no Mark Zuckerberg fanboys.
That feels like it's changing a little bit
with this cage match thing.
Yeah.
Which makes me just, from my perspective,
this Elon, Mark, like cage match,
it's like, we really are living in an idiocracy.
Like I don't care about this.
Like what is happening?
People wanna see it, it probably,
and if it happens, I'm sure I'll watch it.
But like, it's very strange.
My heart breaks watching this whole thing play out
because Elon is trolling.
Sure.
Elon is being Elon.
And Mark is earnest.
Yes, he's training.
And the part that I like breaks my heart is like he's leaning into this because he's like, oh, I'm going to show I'm serious.
Like, I'm going to show I'm a tough guy.
And it's like, he's in a lose-lose situation.
Like, if he wins, it's going to be like, dweeb. Like, people liked Elon more. You beat up the more popular guy.
And if he loses, it's just going to be like, oh, you're a nerdy, you know, white belt
in jujitsu. You have to remember, if we rolled back in time to 2016, so this is right before the election, right? In April of 2016, Mark was in Africa getting
parades thrown for him. You know, he was visiting Africa and people were saying, you are the savior of Africa.
You brought internet to Africa.
You changed the lives of all these people.
And six months later, people are like,
you're destroying democracy.
Like imagine Facebook is your identity.
Like everything you've done since you were 19 years old
is this company.
It is almost impossible for a rational human being
to be like,
oh, the thing that I've spent my life doing
actually is hurting people.
And so it feels desperate.
The psychic toll of that, yeah.
And being in this gilded cage and not, you know,
having people in your life who can sort of check you.
Yeah, is a really interesting thing.
And the fact that he is the last of these people
where everybody else has sort of realized time
for a new chapter or let's move on.
He's holding onto it.
Do you think that's because he needs to evolve it past
this narrative so that he feels good about it again?
I think like the metaverse stuff is right there.
Then you would think that he would invest
in all of these things that you're talking about
to rectify them.
Well, he has invested more than we spent
to go to the moon on the metaverse.
Right.
Like he clearly is like, oh, the thing,
if I can pull off the metaverse thing,
then that will be the thing that I'm known for.
I won't be known for-
Early indications are that that was unwise,
but on a longer timeline, that bet may pay off,
I don't know, but those headlines got eclipsed
by the advent of AI, which I wanna put a pin in for a minute
because I wanna come back to that.
But yeah, he shifted the narrative away,
but this is his Hail Mary.
It's his Hail Mary.
Yeah.
And part of what, again, my heart goes out to him on is like, you know,
one of my favorite things about talking to journalists is journalists gossip like no one else, right? Like they're kind of professional gossips in some ways.
And back when the metaverse stuff was really blooming,
I had multiple journalists tell me, you know,
Mark is going out there and saying, people are gonna spend all day in the metaverse.
You know, they're going to work in the metaverse.
They're going to hang out in the metaverse.
Like, this is going to be like the next town square.
It was because he spent all day in the metaverse.
And if you think about it, like if you're a person who can't go out in public without people grimacing at you, or eat in a restaurant without being afraid of people coming up and telling you,
they think you're a genocidal,
like someone who's gonna hurt humanity.
Like the metaverse might be really attractive.
Cause like you can go out with whatever avatar you want,
or at a minimum, if people recognize you,
they can't grimace at you,
at least until the new headsets come out.
That's a very compassionate perspective.
Well, hurt people hurt people.
And he's just gonna keep causing damage
until we like help him go on to greatness.
He's gonna go do something else.
And he is so young and he's so smart
and he has so much resources.
You know, I've joked before that like,
if I ever write another book,
which I don't think is in the future,
cause the PTSD of writing one is too close.
You know, I wanna dedicate it to Mark
that like, I am convinced you can accomplish greatness
and I will not give up till you do, you know.
That's sweet.
Yeah.
Has he reached out to you?
He's never reached out to me.
Have you had any communication with him,
even by proxy through somebody else?
Nope, nope.
By anyone at Facebook?
Nope.
They pretend I don't exist.
Uh-huh.
Interesting.
But you've said-
They won't comment on the book.
They won't.
Well, talk about a no-win situation for them.
But you have said that under certain circumstances, you would go back and work at Facebook.
Yeah.
We can't give up on Facebook, right?
Like I think the most important work I could do, if they actually wanted to fix it, is work on how do we make sure it can function safely, you know, in the places where it is the internet.
Because I think there are strategies
where even if you didn't want to necessarily do it the way I would suggest doing it,
which is like, let's go in there and add the friction, like add the intentionality around sharing, that kind of thing.
Even if you did want to keep a content moderation strategy, beginning to work on decentralized content moderation.
Like, there's a scholar named Ethan Zuckerman at UMass Amherst who talks about the idea that right now we are subjects of a king on our social platforms.
You know, we can follow the laws or not follow the laws.
We can go to another place.
But we're not citizens.
You know, in a democracy, you know, it's not just that we have rights.
We have responsibilities.
You know, we do jury duty.
We have other obligations to our communities.
There's a way that we could be scaling up
even the current ways of doing safety into these places,
but we'd have to say, you have to do some of the labor too.
The evolution beyond decentralizing content moderation
would be decentralizing the entire social network.
I mean, this is Jack Dorsey's whole thing, right?
Like in his dreams, Twitter would be a decentralized system.
Is that something that you see emerging in the near future?
I think there's really interesting opportunities
around things like creator owned social networks.
So like, you know, we talked about before the idea that these are advertising-supported platforms. If people don't create, there's no content to run ads against. I think a world where people who participate have the ability to influence how
the system is governed is a thing that I think will very likely happen at some point in the future
because people value these spaces so much.
And if people knew there could be a world
where they would have a say in these spaces,
I think that would be compelling for them.
Well, we're in a bit of a chaos adolescence phase right now
with the emergence of all of these smaller
kind of Twitter analogs
and the emergence of the sub-stackification of journalism.
And I feel like we're sort of trying to find our feet
in all of this, but it's a little strange right now
to not really know, like, do I subscribe to this or this,
or how many subscriptions am I actually gonna be up for?
And how is this all gonna play out?
Is there gonna be a consolidation
or is there gonna be a clear winner
that's gonna emerge from all of these, you know, nascent little networks that are popping
up right now? One of the ones that I'm really excited by is called Post News. And like, if I,
if we have, let's say we pass transparency laws tomorrow, like the one I would want to work on
is Post News. Why is that? So one of the things I worked on at Facebook that I talk about is we did
influence modeling.
You know, we said, hey, like we can actually model human relationships and like how does influence percolate between people.
And that's actually really important for detecting like a fake person versus a real person, right?
Because it's very hard to fake the constellation of relationships that real humans exist within.
Post News is founded by the guy who did Waze.
And part of why Waze was successful in literally having just the crowd map the world
was it thought about the idea of credibility
and the idea that credibility can percolate.
You know, it can propagate out amongst the graph.
And so they're taking into consideration
this question of how do people have influence
and how can we have credibility
that grows and evolves over time?
So you're not having a centralized authority
say you have credibility.
It's saying, hey, can we watch
how this person interacts with the ecosystem
and how other people validate them?
Can we build systems that are truth from the ground up
instead of truth from on high?
And not based on an ad model?
I think they're not based on an ad model.
I think they're subscription based.
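A rough sketch of "credibility that percolates," in the spirit of what she's describing rather than Post News's or Waze's actual system: seed a few vetted accounts with credibility and let it propagate, PageRank-style, along who-validates-whom edges. The graph, seeds, and damping factor are all illustrative assumptions.

```python
# Sketch of credibility propagation through an endorsement graph.
# Seeds, damping, and the graph itself are illustrative assumptions.

endorses = {            # edges: endorser -> people they validate
    "seed_journalist": ["alice", "bob"],
    "alice": ["bob", "carol"],
    "bob": ["carol"],
    "carol": [],
    "lone_bot": [],     # nobody credible validates this account
}

cred = {u: (1.0 if u.startswith("seed") else 0.0) for u in endorses}

for _ in range(20):  # power iteration until scores settle
    nxt = {u: (1.0 if u.startswith("seed") else 0.0) for u in endorses}
    for endorser, targets in endorses.items():
        if not targets:
            continue
        share = 0.85 * cred[endorser] / len(targets)
        for t in targets:
            nxt[t] += share
    cred = nxt

for user, score in sorted(cred.items(), key=lambda kv: -kv[1]):
    print(f"{user:15s} {score:.3f}")
# Accounts validated (directly or transitively) by credible people score
# high; an isolated account like lone_bot stays at zero.
```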
So beyond that, if you were designing
the ultimate social network for the future,
what else would be built into that?
So I think it's always important to remember, people talk about social media as one thing. Like it's really kind of two main worlds.
It's broadcast social media. It's like Twitter, you speak on Twitter, or like TikTok, for example.
Creators create on TikTok,
not because they're trying to reach grandma,
it's because they're trying to reach
as many people as possible.
And so they'll repost their videos onto YouTube
or onto Instagram.
And people watch TikTok not to follow any particular person,
but to see what's popular.
Be entertained. Yeah, be entertained.
But on places like Facebook, it's personal social media.
Like you post your baby photo,
not because you want a random person to see it.
You want grandma to see it.
And so I think we always have to be a little bit careful
when we say like, what would the ideal social media be?
Because it's a question of ideal for what?
And I think in the case of personal social media,
I really like human-scaled personal social media. Like my primary social media right now, I was about to hold up my phone, but my phone is not over here, it's Signal.
It's like a group chat.
You know, I have maybe 50 group chats.
They have different themes.
Like I have one called Doom News
and it's like about war and destruction,
you know, global warming and all
those things. And people spread news on those themes, versus ones that are like your local pub, like one of my friends hosts one for, you know, 40 of us or something. And
this is this question of like, what do we want to get from social media? And we want connection.
We want to have a lens on the world. We wanna stay in touch with people.
And we don't need to have half a million person groups
to do that.
And so I think for me, good social media is at the scale
that the institutions that humans have evolved over time
can function at.
So like we have the concept of a dinner party.
We have the concept of a cocktail party.
We have the concept of a church parish, of a university.
Like these are concepts of communication,
like a convention, right?
We have scale that goes up to maybe 25,000 people.
Beyond that, it gets a lot more complicated.
Like you do have to have these much longer conversations
about governance.
And I think most of the things
that we want to get out of social media
can be accomplished in communities
that have less than 25,000 people.
Yeah, it's interesting though.
When I hear you saying that,
the missing piece for me,
as somebody who creates on the internet
and has built a career premised upon this free distribution model and access to lots of people, and this whole influencer economy that's kind of emerged from it, is the democratization of the
voice, which has its positives and negatives.
But I think you could still host content. So, something like YouTube, you host the content, and then there's this question of like where people discuss the content. You know, do you have half a million person groups, and what kinds of information end up propagating in a world where you have a megaphone that goes to a million people versus a megaphone that goes to 25,000 people?
Right. Depth versus breadth. Because like when you have
a 25,000 person community, like you can have like a discord server and you don't need an algorithm.
Like humans mediate what information you receive.
Once you get above that, you know, when you have a single channel, like a Facebook group that has one stream and half a million people, the only way you can actually navigate that content is with an algorithm.
Yeah.
And now you have to have conversations around what are my rights to resetting that algorithm?
What are my rights to seeing that algorithm?
You know, that kind of thing.
As a data scientist... you hear a lot of people saying, well, we don't even know how these algorithms work.
Yeah.
Is that true?
Oh, yeah.
What parts of it do you understand
and which parts of it elude even someone like yourself?
So it's one of these things where,
this is a very young field.
Like this is one of the things I talk about in the book.
And we can design a system and say,
this is the data we put into it.
Like we can describe that, we can see it.
And when Elon published the algorithm, you could see what provisions it has.
But the problem is that all these things work together
in ways that are unexpected.
Like no one at Facebook, when they made the shift
from saying, can we keep you on the platform
as long as possible to,
can we get you to react as much as possible?
No one said, I think this is gonna lead
to more extreme content.
Like no one set out to do that.
And I want to validate the things that Facebook has said. You know, they say, we don't recognize these allegations. We would never intend to do that.
I don't think they did.
But the problem is that when they learned there were these side effects,
and right now the only people who get to ask questions about could there be side effects are the people inside the companies.
You know, people who have a vested interest in not knowing,
you know, when they learned that there were these consequences
and to be clear, like the shift was big enough
that like the CEO of Buzzfeed wrote to Facebook and said,
I think something's broken.
Jonah Peretti.
I don't remember which, which gender,
it was, would have been in like 2018 or 2019.
So I don't know which era that is.
Yeah, it's wild.
Yeah.
And it's getting wilder, right?
Like when we think about what went down
in terms of election interference in 2016 and in 2020,
it feels, you know, kind of not that big of a deal compared with what we're contending with as we careen towards the 2024 election.
Everything is now suddenly infinitely more complicated
due to the release of these various LLMs,
the rapid growth of generative AI, deep fake technology,
like the tools that are now available
on the disinformation battlefield
are just extraordinarily powerful.
So how do you think about how those tools
are going to be deployed and what we can expect
as we move towards November?
So I think there's two areas that we need to really
pay attention to. So one is when Elon Musk fired a lot of his safety teams,
it really shifted the information environment. So, my husband and I like to watch YouTube together because we're nerdy. And there's a guy named Peter Zeihan who follows a lot of the politics in Russia.
And, you know, he was talking about how it used to be one of his best data sources was Twitter.
And when Elon came in, he ended up blocking a lot of people who were in Russia.
Also, because he fired the teams that were looking for the coordinated behavior, you can really obviously tell the trolls on both sides of the conflict now just outweigh the real voices.
And so one is we are investing less in catching bad actors.
But the second is the way the tools have changed what's even possible with misinformation. So for context for people, one of the ways we used to catch
those bots or like catch the networks of people who were pushing an information operation
was we looked for repetition. So, you know, there's always this interesting trade-off between
how much distribution can you do and how much unique content can you do, right? So if you
really don't want to get caught, you write a different thing every time.
Before the large language models came out,
you were limited.
You know, you weren't gonna reach a ton of people
if every single thing you posted was unique.
So you could start looking and saying,
interesting, you keep sharing the same links,
or this group of people keeps saying
very, very similar things, or maybe even the same thing.
Right, and it becomes clear
that this is a bot farm.
Or like China was particularly egregious at this
because they would do things like,
they would have brigades of tens of thousands of people
who would just like flood the comment threads
on dissidents' posts with like just the Chinese flag.
You know, like you can tell this is all the same thing.
When you start having large language models,
it's now possible to go and generate tens of thousands of unique pieces of content that all
basically say the same thing, but say it in slightly different ways. And that makes it a
lot harder to see that coordination. Or for example, you know, part of what makes misinformation so pernicious is that consensus reality is like a channel. There's
not like a huge continuum of what you can talk about. There's like, we can sit and come to a
view of like reality if we talk long enough. When it comes to misinformation, it can play in the
entire field of ideas and whatever is like the most seductive or like most incendiary on the algorithm,
that's the thing that gets distributed.
In a world with large language models,
you can generate 10,000 different variations
and send them to lots of different nooks
and figure out what meme,
like what idea is most seductive.
And that just supercharges what can be done
with a misinformation campaign.
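A crude sketch of the repetition signal she's describing: cluster posts by word-shingle overlap, so a bot farm reposting near-identical text lights up, while an LLM-paraphrased variant (same idea, different words) mostly doesn't. The posts, threshold, and shingle size are all invented for illustration.

```python
# Sketch of catching coordinated posting via near-duplicate text.
# Thresholds and data are illustrative assumptions.

def shingles(text, n=3):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

posts = [
    "the election was stolen share this before they delete it",
    "the election was stolen share this before they delete it!!",     # bot copy
    "the election was stolen share this before they delete it now",   # bot copy
    "ballots vanished overnight and nobody in the press will say so", # LLM-style paraphrase
    "my grandmother's tomato plants are thriving this year",
]

threshold = 0.5
flagged = set()
for i in range(len(posts)):
    for j in range(i + 1, len(posts)):
        if jaccard(shingles(posts[i]), shingles(posts[j])) > threshold:
            flagged |= {i, j}

print("flagged as coordinated:", sorted(flagged))
# The verbatim copies cluster; the paraphrase slips through, which is
# exactly why LLMs make this detection strategy much weaker.
```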
It's terrifying. It is.
And that doesn't even get into all the deepfake stuff, when you can mimic somebody's voice or
their likeness in a compelling way that's indistinguishable from reality, then our footing
in what is real and what isn't real vanishes. Yeah. And things get really scary very quickly.
So this is, you know, why I talked about the idea of Post News being like a really interesting approach in my mind. Part of why I think
we need to step away from content moderation
as the solution or like censorship as the solution
is it presumes that we can go in there
and establish true and false, right?
A different approach is to come in and say,
hey, you know, information operations promoting lies
is much more dangerous than most things.
It is very hard if you are an information operation operative
to have a robust real set of relationships with people.
Like one of the things I talk about in the book
is this idea of how would you catch a fake person?
Or like, how do you catch someone
who is behaving in an inauthentic way?
And so if you are really acutely modeling and saying,
hey, you have friends, you have a family,
you have people you regularly interact with
who interact with you back.
We can tell you are an organic human.
If you have a fake human, like an LLM, it can try to guess what real people look like, but it's not gonna be perfect.
And if your whole network uses the same assumptions,
now your network will stand out
as being a clump of people
who act in a slightly anomalous way.
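To make the "constellation of relationships" idea concrete, here is a toy sketch of two ego-network features, reciprocity and how much your contacts interact with each other, that organic accounts tend to score high on and mass-produced networks tend to get subtly wrong. The graph and features are invented for illustration, not any platform's real model.

```python
# Toy features over a directed interaction graph; thresholds and data
# are illustrative, not any platform's real model.
from itertools import combinations

def reciprocity(edges, user):
    """Fraction of the user's outgoing ties that are returned."""
    out = {b for a, b in edges if a == user}
    back = {a for a, b in edges if b == user}
    return len(out & back) / len(out) if out else 0.0

def contact_interlinking(edges, user):
    """Fraction of pairs of the user's contacts who also interact."""
    contacts = {b for a, b in edges if a == user}
    pairs = list(combinations(sorted(contacts), 2))
    linked = sum(1 for x, y in pairs if (x, y) in edges or (y, x) in edges)
    return linked / len(pairs) if pairs else 0.0

edges = {
    ("frances", "mom"), ("mom", "frances"),
    ("frances", "coach"), ("coach", "frances"), ("mom", "coach"),
    ("bot_1", "target"), ("bot_2", "target"), ("bot_3", "target"),
}

for user in ("frances", "bot_1"):
    print(user, reciprocity(edges, user), round(contact_interlinking(edges, user), 2))
# An organic account shows mutual ties and interlinked contacts;
# the bot's edges all point one way at strangers.
```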
Yeah, I like that.
I mean, the idea being traditionally,
if you wanted to see whether someone's a real person,
go to their Twitter account.
Oh, they just created the account last week.
They don't follow anybody that I know
or none of their followers or anybody that I know.
This all seems manufactured.
You can kind of get to the bottom of it pretty quickly.
What you're saying is create a matrix of credibility
that establishes this person is real
with a history of deep relationships and interactions
with other credible people that over time
establishes somebody as a worthy distributor of information.
Yeah, this person looks like they're a real person.
They're having real conversations.
And even if I disagree with them,
the idea that they are a real person
means they are less dangerous
than they are part of an operation.
But at the same time, it's like,
we're having a conversation about just trying to figure out
who's a real person or not.
Like it's insane, right?
Yeah. And this is part of why we have to start having more transparency, because we have tools where we can start saying, we gotta move this ball down the field, but there is no economic incentive today for the companies to do those things on their own.
So what are we gonna do, Frances?
Like what's the solution here? How do we, how do we put a better foot forward?
So I think the first step is just... and this is a big part of why I wrote the book. Like, one of the things that I regret the most about the rollout of my book is no conservative media has offered to like talk to me, right? Like I haven't been on Fox News. We've reached out to a number of them.
That's interesting.
I would think they'd be happy to talk to you.
I would think so too.
Because like I've been saying consistently
since the beginning, you know,
content moderation is a flawed strategy,
but Facebook spent a huge sum of money
on a whisper campaign saying, you know,
she's a stalking horse for censorship.
And so I think the place we start from
is an idea that people have the right to see how these systems work, right? Like I should have the
right to know I am not allowed to sell my book on Facebook, right? We, the public should be able to
see what content gets taken down. Right now, the only avenues that we have are you can appeal to the Facebook oversight board, but it disappears
into a black hole, right? Things can really change if we have transparency. And so we need to first
work on how do we make that a bipartisan issue, that we cannot have an information environment
that is run by a private company in the dark and have a democracy.
And the second thing is like,
so how might we actually do that?
So Europe passed a law called the Digital Services Act last year.
I talk about it in the book. Basically, it sounds pretty blah.
It's like, you know,
if you know there's a risk to your platform,
you gotta tell us about it.
Like we know you get to operate behind the curtain,
but we can't see.
We'll never catch up with you
unless you tell us what you already know.
You have to tell us what your plan is
for reducing that risk.
And you need to give us enough data
that we can see that you're making progress.
And if we ask you a question,
you have to give us an answer.
And if you don't comply?
So I think it's like 10% of global revenues.
Oh, wow.
It's the penalty.
Like it's a real one.
Though when you have a 35% profit margin,
it's possible Facebook would come out and say,
we are now 25%.
It's just a business expense.
It's a business expense, yeah.
What is your sense of the viability of something like that
passing in the United States?
You know, it's interesting.
If you'd asked me two months ago,
I would have given you a much more negative answer. But the Surgeon General came out with their advisory,
you know, about a month ago. And just for context for listeners, there have been, like, under 15 Surgeon
General advisories since the '60s. You know, there are things like cigarettes cause cancer,
seatbelts save lives, breastfeeding is good for babies, like stuff that we kind of take as duh statements today.
But there was ambiguity before those advisories came out.
And historically, when an advisory has come out, within two to three years after an advisory, something happens on average.
And so I think there's a good chance that something's going to happen around kids.
And my hope is that we can scaffold towards a broader transparency law from there.
What do you mean something's going to happen around kids? Meaning there will be
like a sort of political will to do something because of our shared concern of how these
things are impacting young people. Impacting the mental health of kids, yeah. For context, like there have been laws
that have been starting to pass.
So like Montana banned TikTok, which I don't support.
Or like Utah came out and passed a fairly extensive law
about parental oversight of these systems.
You know, these are conservative states.
And the way I usually frame this for people is
every societal issue has some finite number of kids
that we're willing to harm.
So in the case of cars,
we put eight-year-olds in car seats.
Like putting eight-year-olds in car seats, instead of stopping at six, saves like tens of kids,
something like 60 kids a year.
Think of how many fights with eight-year-olds about car seats take place every day because we put
eight-year-olds in car seats, or like how much money is spent on eight-year-old car seats. But we as a
society say those 60 lives are worth a huge amount to us. When it comes to things like guns, we're
willing to accept many, many more kids being harmed because of the societal value that we place on guns.
I think we are reaching a tipping point
when it comes to kids,
where if you look at things like the suicide rate,
it's a hockey stick.
And the data is always like five years behind.
So last year, the most recent data I could find
was from 2017.
This year, it's from 2018.
It was hockey-sticking up through there. And I don't see any reason why it wouldn't keep
hockey-sticking. And I think there's just a real thing that it's going to be difficult for
social platforms to say, we deserve to continue to operate the way we have, given the level of harm it causes.
It's an interesting tension between freedom of speech and family values, particularly in
conservative locales when you're dealing with young people and their access to social media
tools that we're discovering are very damaging and how that butts up against that deep entrenched interest in unfettered free speech.
And think about this for a second.
Some of those harms have nothing to do with speech.
So like sleep deprivation, sleep quality
has nothing to do with free speech.
Right, but truncating access to a social media site
can easily be interpreted as, you know, an infringement on
free speech. But imagine instead you came out and said, hey, you have to publish how many kids are
online at 10, 11, midnight, 1, 2, 3 a.m. You know, we're not saying you have to cut kids off at 10.
And then parents have the right to make that decision also. Or like one of the things I often
suggest is, you know, we've known for 20 years that if you make a site slower, people use it less.
So like my husband always likes to joke,
we all have willpower at noon, right?
Like 10 a.m., everyone has willpower.
But like by the time you get to 10 p.m.,
that's when you start doom scrolling.
You start dissociating and self-soothing
with like your phone.
Imagine a world where a kid stays up till 2 a.m.
And the next day when they're kind of hung over on Instagram in math class,
you know, a little thing pops up and says,
when do you want to go to bed tonight?
And the kid says, 11.
My mom wants me to go to bed at 10.
I want to go to bed at 11.
And for two hours or three hours before 11,
the app gets a little bit slower and a little bit slower and a little bit slower.
Like around 11.
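As a toy sketch of that soft landing, assuming the kid has picked a bedtime: requests pick up a small, growing artificial delay in the hours beforehand. The linear ramp and all the numbers here are made up for illustration; the only grounded claim is the one above, that slower sites get used less.

```python
# Toy "soft landing": an artificial delay ramps up as the kid's chosen
# bedtime approaches. The ramp shape and numbers are invented here.
from datetime import datetime, timedelta
import time

def extra_delay_seconds(now: datetime, bedtime: datetime,
                        ramp_hours: float = 3.0,
                        max_delay: float = 5.0) -> float:
    """Zero delay until (bedtime - ramp_hours), then a linear ramp up
    to max_delay seconds per request at bedtime and beyond."""
    ramp_start = bedtime - timedelta(hours=ramp_hours)
    if now <= ramp_start:
        return 0.0
    progress = (now - ramp_start).total_seconds() / (ramp_hours * 3600.0)
    return min(progress, 1.0) * max_delay

def throttled_fetch(fetch, bedtime: datetime):
    """Wrap any request in the bedtime delay before running it."""
    time.sleep(extra_delay_seconds(datetime.now(), bedtime))
    return fetch()
```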
But isn't every kid just gonna say,
I'm gonna go to bed at 2 a.m.?
I don't think so.
Cause like kids feel bad the next day.
Yeah.
And if you could have a soft landing.
But it is so powerful.
But that's because we don't have a soft landing right now.
We pop up a little thing saying,
it's 10 p.m., do you wanna go to bed?
And I don't know about you, I hit dismiss, right?
We don't have soft landings right now.
It's tough.
You're asking kids to bring a knife to a gunfight,
with these incredibly powerful tools
that are pitted against developing minds.
Even the most conscious and well-intentioned among us
are powerless to fight its wiles.
And so the fact that young people are contending with this,
we're running this massive experiment
at the greatest scale imaginable.
And it's very unclear how this is gonna pan out,
but early indications are it ain't fucking good.
It's not good.
So the thing that was terrifying for me
about the Surgeon General Advisory was, they said,
if your kid uses social media
for more than three hours a day,
they're at like double or triple the risk of getting depression or anxiety.
The average kid in the United States uses these products for three and a half hours a day.
Right?
30% of kids say they're on until midnight or later, most weeknights, most school nights.
Like you're totally right.
Like we are. When I went to the State of
the Union, I was a guest of the First Lady at the State of the Union, I don't know, a year ago,
year and a half ago. And the President said, we are running a national experiment on our children. And I had
actually forgotten about that line until like when my book came out. In one of the interviews I did,
they played that soundbite.
And I think if more people thought about it that way and said, you know,
we're running a national experiment on our kids.
Like what say should the public have in that experiment?
I think we'd have very different conversations.
Yeah, we should all have a say.
And that kind of gets to the core theme of the book,
which is agency.
Like the book is called The Power of One.
You might think that that refers to like you being
this all powerful whistleblower,
but it's really a call to action to all of us
to get more engaged with this and to get more involved
and to realize that we all have more agency
than we believe that we have.
So talk a little bit about, you know,
how to get people engaged and active around these issues
and where they can plug that enthusiasm.
You know, I think it's interesting.
Like I had an interview like two weeks ago
with a relatively young woman, she's like in her twenties.
And it kind of felt like we had,
I had stumbled into a therapy session.
Like I spent like a solid 20 minutes of that interview,
just trying to convince her that there's hope, right?
That things can be done.
So she's so down the social media addiction rabbit hole.
She's like, how is there any chance
we're ever gonna stop these giant corporations?
And one of the things I pointed out to her was,
I think the first step is that we have to believe
that change could be possible.
And the thing I told her was like,
one of the first quotes in the book
is from a guy named Vaclav Havel,
who wrote The Power of the Powerless.
So for those of you who are not Cold War studies minors.
I'm old enough to remember.
Yeah. In the Soviet Union, if we'd
gone anywhere in the Soviet Union in 1960 and said, will the Soviet Union fall? No one would have
said that. They'd be like, there's this iron grip. And he said, you know,
you can feel like there's no potential to change because the Soviet Union figured out a way to glue things,
like to make them stuck.
Instead of being a totalitarian state
where they just ruled by force,
he said, they're a post-totalitarian state.
They rule by ideology,
that if you cross the ideology,
the ideology just heals around you. So in a system
where, whether you are powerful or powerless, if you speak out, the system just heals around you
and shunts you to the side, what is left? What is your power? What is your agency?
And he says, you start by valuing yourself and saying, I am a human being and I have dignity.
I think if we started more conversations on social media
just from that simple of a lens,
like saying, right now,
Facebook is not really indicating they value our dignity.
What would social media look like
if it did look like they valued our autonomy
and our choices?
For example, you mentioned Netflix. When I was sick, I watched a lot of really
depressing Netflix. And I think I didn't even realize that I had like gotten into the darker
depths of the catalog. Like the true crime Dahmer stuff?
Oh no, like, I don't know, moody Scandinavian dramas.
You know, like you can be real dour on Netflix.
Scandi crime.
Yeah.
I like it.
But when I started getting better,
you know, I'd open up Netflix and I'd be like,
nothing appeals to me on Netflix.
Like I try to search for comedies.
I try to search for anything. If Netflix only knew that I feel better now.
Yeah, exactly.
You're thinking like, I need to fix this algorithm.
Well, I didn't even think about it
until I moved in with some new people during COVID.
So I moved to Puerto Rico and moved in with some new people.
They had their own Netflix account and it was delightful.
Like there was all this wonderful stuff on Netflix,
but I hadn't even realized that I had gotten
into a corner of the algorithm, right?
Like imagine a world where-
And you of all people.
Yeah, it's interesting.
So imagine we said, hey, step one, I matter as a human. I have dignity. I deserve to be treated
with respect. You can imagine things like, maybe we should have the right to reset algorithms.
You shouldn't have to give up all your watch history and all your like ratings. Like I have a watch list of
stuff I want to watch in the future. Should I have to walk away from my data in order to get
the algorithm to reset? And this becomes like a really big deal when it comes to kids. Because
like when I've talked to therapists about kids with say an eating disorder or who have been
self-harming, you know, kids are forced to choose between their pasts, all their friends,
all these moments of their life they've documented, and an algorithm that threatens their future.
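A minimal sketch of what that "reset the algorithm, keep your data" separation could look like; the structures and names here are invented for illustration, not any platform's actual design.

```python
# Sketch of "reset the algorithm without losing your data": the user's
# own records live apart from the recommender's inferred state, so a
# reset wipes only the latter. Everything here is invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class UserData:
    """Belongs to the user; survives a reset."""
    watch_list: list = field(default_factory=list)
    watch_history: list = field(default_factory=list)
    friends: list = field(default_factory=list)

@dataclass
class RecommenderState:
    """Inferred by the platform from engagement; resettable."""
    interest_weights: dict = field(default_factory=dict)

@dataclass
class Profile:
    data: UserData = field(default_factory=UserData)
    reco: RecommenderState = field(default_factory=RecommenderState)

    def reset_algorithm(self) -> None:
        """The 'fresh start' button: forget inferred interests only."""
        self.reco = RecommenderState()

# Usage: the inferred interests go, the watch list stays.
profile = Profile()
profile.data.watch_list.append("a comedy to watch later")
profile.reco.interest_weights["moody Scandinavian dramas"] = 0.9
profile.reset_algorithm()
assert profile.data.watch_list
assert not profile.reco.interest_weights
```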
That's really interesting. It seems like an easy fix. You would be able to do that
because imagine the young person or anybody who's contending with some kind of mental health issue,
eating disorder or otherwise, they realize they need to get well,
they engage in some kind of recovery process,
they're getting better,
but when they open up their app,
they're getting exposed to the same content
that fed that maladaptive behavior pattern.
That's not great.
Yeah.
And so they're faced with the choice.
It's that whole thing of like, it's on the user, right?
And so what, they have to delete their account
or unfriend everybody in order to reset it
when there could be some healthier tools.
A happy medium.
Yeah.
And I think that's like the recurrent theme
that I want people to walk away from
is that we have been sold by the social media companies
that there are only these very stark absolutes,
like extremes.
And there could be like happy mediums,
you know, things where we say,
you know, let's design to put control back
in the user's hands
or to make patterns that were invisible, visible.
And, you know, that's not enough.
Like we need companies to have to pay
for social costs too,
but they're like really easy first steps.
So the second thing is, you know,
we've started from change is possible.
I matter.
I deserve to be respected to, you know,
taking care of other people, right?
Like coming in and saying, if you have,
and this was part of what Vaclav Havel talked about.
He said, you know, start just by saying,
I refuse to let you humiliate me.
I refuse to let you devalue me.
Second is I'm going to care
for my friends and family, right? Like I'm going to also remind them they matter. You know, I think
until we get social media that is safer, we have to start saying, I'm going to, we need to
redefine some of the rituals of what it means to care for each other. You know, talk to your
parents, talk to your kids, talk to
your friends and family about how they use social media, right? How would you feel if you found out
your friend died because they were getting sucked down a rabbit hole of despair and you didn't know,
right? And I think it's just one of these interesting things where like, I don't know,
maybe I'm like a radical and I'm like, what if we said, hey,
I'll show you my Instagram feed if you show me yours. Right. Like maybe I'm a radical
because I'm like, what if we just tried it? You know, like how would the world
change? Yeah. You know, and we're all operating on some level on this low grade assumption that
our feed looks like our friend's feed. Exactly. But we're all living in our own siloed, unique
like worlds as a result of this.
My husband is much better at Twitter than I am.
Like he much more actively engages in it.
I think he has more time than I do, more patience, whatever.
I like swimming in the ocean more than he does.
Does he still have a bunch of troll accounts?
Oh, he does.
Well, he doesn't have as many.
He actually has a blue check mark now
and he actually uses his own name,
because he can't just troll all the time.
Because when Elon came in,
he lost like 20 or 30 troll accounts.
Interesting.
Because my husband used to have a hobby
of he really likes finding Chinese information operations
and then needling them.
So he likes to make them like-
Interesting.
Like ask them very reasonable questions
and watch whatever hilarious answers they say back.
But he lost like 20 accounts when Elon came through
and did the first wave of housecleaning.
I lost my blue check and I have not subscribed.
And I found my use and engagement on that platform
to have gone from being my favorite place
to engage with people to almost nothing.
Yeah.
I think the enough people left,
there's a very different tone.
Yeah, it feels very different to me.
But what I was gonna say was like,
when I say this idea of like,
what if we looked at each other's feeds?
Like I actually lived that,
like my husband, because he engages with Twitter more, his account is much more
national security news oriented. Like literally, we
lay in bed and when he starts like doom scrolling late at night, like I'll sit and like, you know,
enjoy his feed with him. And occasionally I annoy him because I'm like, I want to read that. So you
have to slow down. But we can do these things. It's not like they have to be alien actions.
Yeah, somebody suggested, I'm not taking credit for this.
And I'm sure you've thought of it.
A great tool to add to Twitter or other platforms would be
when you visit somebody's profile page,
you can hit a button and then you see their feed.
Like so you could live in that person's experience
and see what they're seeing.
I think that would be very cool.
People could opt into it.
Yeah, and then you can go,
oh, this is what this person's reading all the time.
And that's maybe why they're posting in this way.
I understand this person a little bit better
than I did prior.
Yeah, that'd be cool.
So in the interim,
while we're wrestling with all of these problems,
existential and practical, like what are some best practices around how to interface with these
tools? Like we need to understand just how powerful they are, how addictive they are,
the many pitfalls and risks of engaging with them, but they are part of our lives and our identity.
And, you know, nobody's going to become a Luddite or, you know,
delete their accounts, but for the most part, these are things we have to figure out some kind
of healthy symbiosis with. And I want to acknowledge, like, you know, when I talk about this
idea of like, I would work at Facebook because we can't leave people behind, right? You know,
there's people who, for whom Facebook is the internet. The flip side of that is there's lots of people in the United States
who don't have the luxury of getting to do a lot of socializing in person. You know, socializing
in person costs money. You have to get there. Maybe you have to buy food to be in the space.
Maybe there's other things. We keep devaluing, getting rid of the public spaces where we could
interact with each
other. People don't join bowling leagues anymore, garden clubs, that kind of thing.
And so there's a bunch of people whom even in the United States, we need to not leave behind.
But the next thing I would say is, I saw this app recently on my book tour a couple weeks ago that I loved,
which was it just, when you open any social media account,
it tells you how many times you've opened it today.
And a little thing goes across the screen very slowly
that counts down like five, four, three, two, one.
And then asks you, do you really want to open Instagram?
Or do you really want to open YouTube? Do you really want to open YouTube?
And I love that idea of just giving people a little bit of friction, a little friction.
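She doesn't name the app, so here is a bare-bones sketch of the pattern as described: count today's opens, show the number, crawl through the countdown, then ask. The storage format and exact UX below are invented for the sketch.

```python
# Bare-bones sketch of the friction pattern described above. Not any
# particular app; the file format and prompts are made up here.
import json
import time
from datetime import date
from pathlib import Path

COUNTS_FILE = Path("open_counts.json")

def record_open(app: str) -> int:
    """Increment and return today's open count for this app."""
    data = json.loads(COUNTS_FILE.read_text()) if COUNTS_FILE.exists() else {}
    key = f"{date.today()}:{app}"
    data[key] = data.get(key, 0) + 1
    COUNTS_FILE.write_text(json.dumps(data))
    return data[key]

def friction_gate(app: str) -> bool:
    """A moment of intentionality before the app actually opens."""
    opens = record_open(app)
    print(f"You've opened {app} {opens} times today.")
    for n in range(5, 0, -1):       # the slow 5-4-3-2-1 crawl
        print(n)
        time.sleep(1)
    answer = input(f"Do you really want to open {app}? (y/n) ")
    return answer.strip().lower() == "y"
```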
Because one thing that people do is they use these as like digital pacifiers. You know,
you're feeling a little anxious, you're feeling a little uncomfortable instead of sitting and just
enjoying the silence or like watching the scenery go by in
your Uber or whatever. We pull out our phones and we immediately dive in. Any kind of moment
of intentionality gives you a chance to make more of a real choice. What is that app called?
You know, I have my phone with me. We'll figure it out. We'll put it in the show notes. I like
that idea. I haven't heard of that.
I've heard of other sort of apps
that you can put on your phone
that you can set hours where certain apps don't work,
et cetera.
I think it's important to reclaim our boredom.
If we wanna be able to express ourselves creatively
and do our best thinking and be able to respond
to the world around us and be available
for the people we care about,
we have to reprioritize the importance of rumination
and quietude and those moments.
And I'm as guilty as anybody,
like I'll fill every opportunity,
standing in line or what have you to check the phone
and lose myself a little bit.
I'm like, I'm powerless.
It's really hard.
I have a lot of compassion for people that are suffering and feel powerless to make any kind of adaptive behavior
change around these things.
I think there's some other low-hanging fruit,
like charging phones not in the bedroom.
That's a tough one.
Kids should charge their phones in the parents' bedroom.
Adults should.
Once the cat's out of the bag,
as somebody with a bunch of kids,
like it's hard to put the genie back in the bottle
in that one.
Well, that's why the kids' phones can charge in your room,
but like your phone could charge in the kitchen
or something.
You know, you can buy an alarm clock for 10 bucks.
It will wake you up in the morning.
That's one of those ones where I really need to do it.
Like my vice is I doomscroll late at night,
and my husband totally calls me out on it,
and so I need to start charging the phone
in the kitchen, probably.
The final thing I kind of want to ask you about
is this conflict or tension between immediacy.
Like we need solutions.
Like, as I opened up this conversation with this idea
that if we don't fix this,
tens of millions of people are gonna die, your own words.
There is a sense of urgency around solving these problems
while also simultaneously like appreciating long-termism.
Like we're in for the long haul here.
We gotta get through this uncomfortable adolescent phase
and learn how to do better.
Both those things are true and important.
How do you kind of, as an advocate now,
who's in the public spotlight,
talking about this stuff all the time,
what is the message that you wanna put out there
in terms of what we need to do now
to address the immediacy aspect of this?
I totally get, like, I think I frustrate people
because I don't come out and demand more sooner, right?
So like, I know when I first came out,
a friend of mine who is very active in this space, you know, he was like, you need to pick like one thing that you want them to do.
Like maybe it's like cut the reshare chains.
You need to pick one thing and push to get that one thing because you're not going to get anything more than that.
And part of why I have resisted that so hard is like I think the problem of social media is there weren't enough people sitting at the table, right?
Like, there are all these tradeoffs and all these externalities, these social costs, and there weren't enough people who actually got to have a say in the process.
Like, it was made internally by a very small number of people who were working under very constrained incentives.
And we have to start with transparency because I really believe having a democratic conversation
about how to move forward is so essential.
And so it's a little hard
because like I fully acknowledge the house is on fire
and like coming in and saying,
we should have smoke detectors.
Like I totally get that's not satisfying.
But right now there are so few
people in the world who get to participate in a conversation about how to move forward.
I think you'd be swapping out one dictator, you know, Mark, for another
dictator, me, if it were just like, Frances, fix it for us. Right. And so we really need to
immediately get transparency. We're going to keep working at Beyond the Screen on things
around like how do we onboard and up-level concerned parents, litigators, regulators,
investors, because all of those actors together are how safety happens, right? It's investors
saying we need to manage for long-term risk. It's litigators that say we need to create the right set of incentives.
And so I think all in, you know, we start with transparency
and then we begin the hard work of actually figuring out where do we go from here.
What is the floor?
Are you doing your own kind of like youth education,
red teaming with social media projects where you get kids in
and you're like, design your
own social. Okay. What happens when that person says that? What do you do? Like, how would you
fix that? Like creating literacy in like a generation of young people who are obviously
going to be inheriting all of this at the earliest phases of their development seems to be, I mean,
that's a longer term solution to this, but I really like that
idea of injecting that awareness in as early as possible. I think there's two kinds of education
that are really urgent. So one is that, so did you do high school journalism? No. So there exist
in the world high school newspapers. There even exist junior high school newspapers.
So hot take,
do you think there are any good
junior high newspapers in the world?
I have no idea, probably not.
I think almost certainly not.
But I mean, like, I'm not trying to rain on them.
Like, that's not why we do it.
Like, we're not funding those.
You're exposing them to the principles around journalism.
We're saying journalism plays such an important role
in a democracy, right?
So, the first pass at history,
digesting information,
helping people have a chance to form opinions,
that we believe we need to expose
the broadest number of people possible
to the process of journalism.
We are shifting over our information environment
from one where it's all about a structured process with journalistic traditions and ethics and protocols to one that's decentralized or is run by algorithms.
But we don't have a way of actually being able to expose people to those choices.
And so one of the things that we want to do is build a lab bench, kind of like, did you take chemistry in high school? So, the chemistry
lab bench you used in high school is remarkably similar to what you would use in college or if
you're a graduate student or if you're a full-blown chemist. That lab bench facilitates a lot of
levels of understanding. You know, if we had a simulated social network, we could have, you know,
high school data science clubs where kids could argue over, you know, should you have to click on a link to reshare it?
And you could play the ads team and I could play the new user team and they could play the non-English user team or the kids team.
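For instance, the "should you have to click on a link to reshare it?" debate could run on exactly that kind of lab bench. Here is a toy cascade model; every probability is invented, purely so a data science club would have numbers to argue over.

```python
# Toy cascade model for the lab-bench debate "should you have to click
# a link before resharing it?" All probabilities are invented; the
# point is that students could argue over them with data in hand.
import random

def average_reshares(p_reshare: float, branching: int = 5,
                     generations: int = 8, trials: int = 2000) -> float:
    """Mean reshare count when each viewer reshares with probability
    p_reshare and each reshare reaches `branching` fresh viewers."""
    total = 0
    for _ in range(trials):
        viewers, reshares = 1, 0
        for _ in range(generations):
            new = sum(1 for _ in range(viewers * branching)
                      if random.random() < p_reshare)
            reshares += new
            viewers = new
            if viewers == 0:
                break
        total += reshares
    return total / trials

# Hypothetical effect of click-before-reshare friction: fewer impulse shares.
print("frictionless reshares:", average_reshares(p_reshare=0.05))
print("click-first reshares: ", average_reshares(p_reshare=0.03))
```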
So you're bringing the debate ecosystem into this as well. Or the Model UN, you know, or mock trial, you know, like, like how do we make kids say,
hey, actually these products I use can be built in a very large number of ways.
You know, there are choices being made.
There's consequences of these choices.
And to be real clear, every single time we change them, it's not like it's either good
or it's bad.
Some of those stakeholder groups are gonna benefit,
some are not.
Do we still ship?
So that's one kind of knowledge that I really want.
Like we're very actively building towards
and we'd love support if anyone wants to give us support.
Yeah, so what is the website for that?
Oh, so our website is beyondthescreen.org
and please reach out.
Also my email is frances at franceshaugen.com.
But the second kind of knowledge,
I like to think of it as like community governance, right?
Or like in addressing the idea
that we are actually very rapidly stripping
social capital from our children.
So I'll give you an example.
I was told by a member of either like the American Academy of Pediatrics
or the Academy of Pediatric Psychologists
that there now exists nonprofits
that teach 14 year olds how to play pickup basketball.
So take a step back for a second.
Like when you and I were kids, there were not
startups that taught kids how to play pickup basketball. No, but there wasn't a need for that.
So why wasn't there a need for it? Right? Because we weren't at home staring at screens. We were
out at the playground and somebody's older brother taught us how to do it. And so I think what
happened was it used to be, if you wanted to play, you just showed up. And sometimes there
weren't enough kids. And so like, you know, the 14 year olds
had to let the 12 year old play.
They're like, he's smaller, she's smaller, not very good,
but we need one more person.
Today, those 14 year olds all coordinate on their phones.
They show up at 4 p.m. on Tuesday.
They always have enough people
and they never let the 12 year old play.
And so now you have to figure out how to do intentionality
around propagating that knowledge.
We are facing, I think a really serious crisis
amongst teenagers because for two years,
we shut the schools down, right?
The juniors and seniors that were supposed to socialize
the freshmen and sophomores, they didn't have those moments.
And now the juniors and seniors
are those old freshmen and sophomores, right?
Like we're cutting-
They weren't indoctrinated into that mentorship mentality.
Yeah.
Yeah.
And I really felt this in college, right?
Like I was part of the first graduating class at a college.
And so we never had upperclassmen
that taught us how to be adults, right?
Like we had to go and be seniors for four years.
And I don't think it was good for any of us.
So there's this interesting question of like,
let's say we went into high schools and we said,
hey, we're gonna teach you about network effects.
Like the idea of you guys are locked in right now.
And we're gonna teach you about the business model.
And we're gonna teach you about the idea that like,
if you want to decide to spend more time in person
with your friends,
they have to make that same decision at the same time
or you don't get to, right?
Okay, now that you all know that you're not,
you know, just leaves floating in the river,
like you are people who can make choices,
let's fill an assembly and say,
hey, let's just start out. How does this make you feel? Like have the kids vote and just put it up
on the screen. How does it make you feel when you use social media? Because most of you say
it doesn't make you happy. How much time do you spend on social media? Like let's check on our
phones together. How much time do you spend? Okay, if you could set a rule for the whole school on how much time you're gonna spend online,
what would you choose?
Are you a three hour a day school?
Are you a four hour a day school?
Are you a one hour a day school?
Because every hour you're on there,
you're not with your friends.
What would happen?
And like, that's like an area I really wanna play out in
where like this question of,
can kids intervene and start teaching
each other social skills?
Can they start, can we intervene at the level of a community
instead of a single kid or a single family?
Well, that speaks to agency
and also this idea of self-regulation,
because ultimately it is,
as much as it shouldn't be on the individual,
it is on the individual
and we have to make these choices ourselves
about how we're gonna engage with these things.
And the younger you can get to somebody
and give them that sense of agency where,
listen, you're a sentient human being and I trust you.
And you have to figure out
how to be responsible for yourself.
What do you want?
And engaging with a young person at that level about these things,
I think is really, can be a powerful thing
and should be part of like elementary school curriculum.
You have a hopefulness in how you're approaching this.
And I think that hopefulness is rooted
in an understanding that change takes time.
And so this is not, hey, I came out and I testified
and I went on 60 Minutes and my work here is done.
This is perhaps work that has no end,
but is worthy of devoting your time,
attention, energy, and life to.
And when you think of the whistleblowers over the years,
you mentioned auto safety, obviously Ralph Nader,
he's sort of a talisman for you.
There aren't that many.
I mean, there's Edward Snowden in the NSA,
there's Daniel Ellsberg, and we all saw Russell Crowe
portray Jeffrey Wigand in The Insider.
And there's you, like there's these sort of iconic figures
that don't pop up that often
and create kind of a bit of a lightning bolt moment.
But the groundswell of change in the aftermath of that
is something that we need to appreciate does take time.
I would say the transformative moment for me
that was kind of like the moment my life pivoted
was, so, I'm a big nerd.
When I travel,
I love maritime museums and railroad museums.
I'm odd.
I really like museums about cities,
like places that tell their own stories.
And I really like museums about individual people,
because you really get to understand them,
to really connect with them.
And, you know, I've been to ones about artists,
political figures, architects, like, you know,
I just enjoy people.
And one of my favorite single person museums
is the museum for Indira Gandhi in Delhi.
So I was in Delhi for a wedding
and it's a really cool museum
because they tell her life story from like letters
and newspaper clippings that are in context to that moment. So it's all contemporaneous documents
and like artifacts. And they had this article about her when she was five. So she's the only
child of Nehru, who was like one of the founders of India.
And this article is about her leading a march of children
through the streets of Delhi when she was five.
So like think thousands of children,
it was called the monkey army.
And one, it shows you how different parenting standards
were a long time ago,
because who would let their child go
in a mass of like thousands of five-year-olds today.
But two, it was like a catalyst for me
because I think she was in her 40s
when India became free, right?
And I had never,
like I had watched the Gandhi movie or whatever
and been like, oh, like, you know,
Gandhi made salt in the ocean
and India became free or whatever.
But I hadn't thought about the idea.
It took them decades.
Like it took them like 70 years to become free. And if you had
asked anyone in the world in 1875, will India, will the British ever leave India? No one in the world,
except for like, you know, the hundred people gathered from the Indian National Congress would
have said, there's a chance. And I think today, like I have a line in the book, you know, fatalism
is a sign that someone is trying to steal your power.
Like when we say there's no chance we can do something, like so much innovation comes from like some crazy person saying, what if it was different?
And like I look at it as, you know, we are going through a transformational moment.
Like the idea that our economy could go from being transparent to opaque.
Like we will wrestle with this for decades.
And so it's easy for me to get up every day
because it takes time.
Any last thoughts?
Like, is there anything else
that you think people get wrong about you?
You've done a million interviews.
You've got this book out now.
Interesting.
Obviously, you know,
a lot of people are talking about the ideas
that you're talking about,
but they're also talking about you.
Is there anything that frustrates you
around maybe some narratives that get out there?
I've really liked this conversation.
Like one of the things that's nice about podcasts
is you get to have so much more depth.
Like you get to talk about such a broader set of things. A lot of the press coverage
in the United States has been, you know, in seven minute or 15 minute chunks. And so we
ask the same questions over and over again.
Probably the thing that is most frustrating for me in terms of narratives is,
like, my nonprofits have really struggled to fundraise
because there's a perception
that people have given us lots and lots of money.
And if people realized how little we had been given
and like I've had to pay like out of pocket
for even basic staff for things,
I think they would be really shocked
because there's this perception that like-
Well, Pierre Omidyar is funding you.
Yeah, yeah, yeah.
So his nonprofit, Luminate,
has never given me anything
other than like logistical support.
Oh yeah, that was one of those conspiracy theories
that there was like the billionaire in the background.
For people that don't know,
he's the founder of eBay,
made a bunch of money and is a big philanthropist now.
But his nonprofit gave you aid,
but it wasn't like he had any personal involvement.
They paid for plane tickets to Europe and for people to introduce me
to, like, politicians and stuff in Europe.
But like, it's one of these things where it's like,
you know, my salary is paid by public speaking.
So like, if you wanna invite me to a conference,
that's always lovely.
Cause that currently is like the thing that pays
for my assistant and things like that.
And so I think that's probably the thing
that frustrates me the most because like-
Or people think if you have a book out
that you're a millionaire.
Yeah, yeah, yeah, yeah. I'm not sure there's anything else
that I haven't gotten asked.
But you got that sweet crypto cash, right?
What's going on there?
I love that meme.
I would say I was raised on Vanguard,
you know, like Vanguard mutual funds.
Like you should have a diversified portfolio.
And I totally believe you should have
like five to 10% in crypto,
but that is not a
life-changing amount of crypto. But you moved to Puerto Rico. You're part of like the whole,
you know, the bleeding edge of the crypto movement. Yeah. No, I, well, I lived in San
Francisco during COVID and it was really cold and really lonely, right? Like I talk about in the
book, I had to relearn to walk and I still have very bad pain in my legs.
The neuropathy from the celiac.
It got bad enough
that I was paralyzed beneath my knees.
And I can hike all day now.
I can walk on my toes, but like whenever I'm cold
and LA today is cold.
I know you guys don't believe that, but 70 is cold.
In Puerto Rico, I can leave my AC off in my office and it just stays 85.
And like that costs a thousand dollars a month in San Francisco.
Right.
And so like, I can't speak to any of the reasons why my other friends moved there.
But like I moved there because like a bunch of people were like, oh, we're going to go on an adventure.
And now I'm there.
Yeah. You've been there for a while too.
You like it down there?
I love it.
In Puerto Rico, you can live on the beach.
Like you can have a condo on the beach
where you open your windows and you hear the water.
And like my husband and I,
we go swimming four to five days a week in the ocean.
And we have this thing we call
bodies of water of descending size,
where we go in the ocean
and at some point we get a little cold
and we go to the pool at our condo
and it's like one or two degrees warmer.
And then we have an inflatable hot tub,
which sounds like a ludicrous object,
but for $250, we got a Coleman hot tub on sale
and it only goes up to like 104,
which turns out is hot enough, right?
But like, I don't know,
like I can't imagine living anywhere else.
Like we get to do that, you know, four days a week.
Good for you.
And we could never afford to do that in like Los Angeles.
Yeah.
That's great.
Yeah.
I appreciate you coming.
Puerto Rico welcomes you.
Yeah, I've been there.
I love it. I like it there.
It's super nice.
I get the appeal.
I get the appeal.
It was great to talk to you.
Thank you. My pleasure.
I think that you're courageous
and the example that you're setting is inspirational.
And I think the work that you're doing is vitally important.
So I appreciate you coming here to share the book.
The Power of One is out everywhere.
Easy to find.
We'll link it up in the show notes and all of that.
And I'm at your service
if there's anything I can ever do for you.
Yeah, it's really amazing what you're doing
and I'm here to support you.
So appreciate it.
Thank you.
That's it for today.
Thank you for listening.
I truly hope you enjoyed the conversation.
To learn more about today's guest,
including links and resources related to everything discussed today, visit the episode page at richroll.com, where you can find the entire podcast archive, as well as podcast merch, my books Finding Ultra, Voicing Change, and The Plant Power Way, as well as the Plant Power Meal Planner at meals.richroll.com.
If you'd like to support the podcast, the easiest and most impactful thing you can do
is to subscribe to the show on Apple Podcasts, on Spotify, and on YouTube,
and leave a review and or comment.
Supporting the sponsors who support the show is also important and appreciated,
and sharing the show or your favorite episode with friends or on social media is, of course, awesome and very helpful.
And finally, for podcast updates, special offers on books, the meal planner, and other subjects, please subscribe to our newsletter, which you can find on the footer of any page at richroll.com.
Today's show was produced and engineered by Jason Camiolo,
with additional audio engineering by Cale Curtis. The video edition of the podcast was created by Blake Curtis, with assistance by our creative director, Dan Drake. Portraits by Davy Greenberg,
graphic and social media assets courtesy of Daniel Solis, as well as Dan Drake. Thank you, Georgia Whaley,
for copywriting and website management. And of course, our theme music was created by Tyler
Pyatt, Trapper Pyatt, and Harry Mathis. Appreciate the love, love the support. See you back here soon.
Peace. Plants. Namaste. Thank you.