Your Undivided Attention - Real Social Media Solutions, Now — with Frances Haugen
Episode Date: November 23, 2022
When it comes to social media risk, there is reason to hope for consensus. Center for Humane Technology co-founder Tristan Harris recently helped launch a new initiative called the Council for Responsible Social Media (CRSM) in Washington, D.C. It's a coalition of religious leaders, public health experts, national security leaders, and former political representatives from both sides, people who just care about making our democracy work. During this event, Tristan sat down with Facebook whistleblower Frances Haugen, a friend of the Center for Humane Technology, to discuss the harm caused to our mental health and global democracy when platforms lack accountability and transparency. The CRSM is bipartisan, and its kickoff serves to boost the solutions Frances and Tristan identify going into 2023.
RECOMMENDED MEDIA
Council for Responsible Social Media (CRSM)
A project of Issue One, CRSM is a cross-partisan group of leaders addressing the negative mental, civic, and public health impacts of social media in America.
Twitter Whistleblower Testifies on Security Issues
Peiter "Mudge" Zatko, a former Twitter security executive, testified on privacy and security issues relating to the social media company before the Senate Judiciary Committee.
Beyond the Screen
Beyond the Screen is a coalition of technologists, designers, and thinkers fighting against online harms, led by the Facebook whistleblower Frances Haugen.
#OneClickSafer Campaign
Our campaign to pressure Facebook to make one immediate change — join us!
RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen
https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
A Facebook Whistleblower: Sophie Zhang
https://www.humanetech.com/podcast/episode-37-a-facebook-whistleblower
Mr. Harris Zooms to Washington
https://www.humanetech.com/podcast/episode-35-mr-harris-zooms-to-washington
With Great Power Comes… No Responsibility?
https://www.humanetech.com/podcast/3-with-great-power-comes-no-responsibility
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
Imagine if U.S. politics worked like this today.
I used to put 60 members in a room every night at 5 o'clock, 60 Democrats,
to talk about the issues that we had to figure out.
And I had people from far left to far right, everything in between,
all kinds of views, all kinds of disagreements on every issue.
So that's how the discussion started.
I used to say to members, none of us knows everything.
We have to listen to one another.
We have to learn from one another.
And after we had had 20 meetings, people would come up to me and say, you know, when we started this, I just knew what our policy should be.
And I listened to him and her and I changed my mind.
That's human interaction.
We can come to agreement, compromise, consensus on how to solve controversial problems.
That's former congressman and House Majority Leader Dick Gephardt, who served in the U.S. Congress for 28 years.
And he's referring to a time just two decades ago that seems so far away now, given how hard it is to imagine American elected officials seeing eye to eye.
I'm Tristan Harris, and this is Your Undivided Attention, the podcast from the Center for Humane Technology.
If that's what it took to build consensus in Congress 20 years ago when things were far less polarized,
imagine what kind of things it'll take to build consensus now.
But when it comes to social media risk,
there is reason to hope for consensus,
because in a way, we should be able to agree on the very thing
that's preventing us from agreeing on anything.
Division is a unifying issue, as we just said on 60 Minutes.
And back in October of this year,
I went to the launch of a new initiative
called the Council for Responsible Social Media in Washington, D.C.,
and Congressman Gephardt was there.
And it left me feeling really excited
because this was a coalition of public health experts, national security leaders,
religious leaders, people who study these issues, former representatives in Congress from both sides, people who just care about making our democracy work.
And the changes that will come out of this will make a major difference for users of these
platforms around the world, not just in the United States.
During the event, I sat down with Facebook whistleblower Frances Haugen,
who obviously many of you know is a friend of the Center for Humane Technology,
and we wanted to release that conversation today on Your Undivided Attention, because Frances and I really got to unpack some of the issues we both care about in a timely way, and we think it's very relevant as we go into 2023.
So with that, here we go.
I'm very honored to introduce Tristan Harris and Frances Haugen.
You all hear us?
Yes.
You hear us?
Yeah.
I'm a little worried about the people in the back.
Yeah, go, good.
So Frances, let's start with you. I think people underestimate the amount of risk that you took on to come out. I mean, you risked your life, going to prison, having a trillion-dollar company throw billions of dollars in legal fees at you. Could you say
why you were willing to take that risk?
So I was working on how Facebook was influencing the information environment in some of the most vulnerable places in the world. Most people are not aware that Facebook went into countries around Africa, around South America, around Southeast Asia, and said, if you use our products, the internet is free, and Facebook became the internet for at least a billion or a billion and a half people. The version of Facebook that we use, of Instagram that we use, is the safest, cleanest version of Facebook in the world. And I genuinely, genuinely fear that there are tens of millions of lives on the line in places like African countries and Southeast Asia. We've seen two genocides so far, Myanmar and Ethiopia. And I think those are opening chapters of a dystopian novel that I desperately don't want to read the end of.
In talking to you, I know that you were with the Civic Integrity team at Facebook. You would be looking at the 10 most popular posts in Ethiopia and seeing it was beheading videos, things like that. Just to give a taste of the kinds of things that you were seeing:
Because I think you open up a Facebook feed here.
You say, okay, occasionally I see some political content.
I see some weird divisive conspiracy theory going viral.
But what were you seeing in those countries?
So Facebook had a team that was called the Social Cohesion Team, which was the team that was responsible for fighting ethnic violence. And we had a thing called Virality Review every week or every other week, where they'd bring in translators and explain to us the 10 most popular posts in each of the countries that were at risk, our tier-one at-risk countries.
And every single post would be a beheading video.
It would be an accusation that the opposition was molesting children,
that this child was kidnapped and tortured, like truly horrific misinformation.
So there's this question of why was this the most popular content in those countries when we don't see that here.
Facebook was spending 87% of its operational budget for misinformation on English,
even though only 9% of users spoke English.
And I want to remind you, the United States is not a country of only English speakers.
It is a country of many, many Spanish speakers, many, many other languages.
And they are not going to use as safe a version of Facebook as English speakers.
Facebook wanted us not to demand more, and that's why they hid the reality from us.
One of the things that's so interesting about the perception of you publicly is that you're known as the Midwesterner who cares about the kids. And what you and I first started talking about when we got to know each other was actually these global harms, breakdowns of democracy and ethnic violence, but you're known for the teen mental health harms. What do you think is the reason why
that's true?
They're interconnected. So the reason why we see the most viral content in those countries being these horrific pieces of content is that the algorithms push us towards extremism. When you have algorithms that say content is better if it gets more clicks, they ignore the fact that the shortest path to a click is hate, anger, division. And when it comes to kids, you can start with a blank Instagram account, no friends, no interests, and just do some very innocuous searches like healthy eating, healthy recipes, and just by clicking on the first five pieces of content each day, within a couple of weeks you'll be taken to pro-anorexia content. If you're mildly depressed, it can push you towards self-harm content. These algorithms escalate, an escalator that goes up evermore.
And so in other words, as you said, the harms are interconnected. So what seems like
we have one set of harms for teen mental health happening over here. So we should like,
you know, bite that off and chew it. Let's like solve that problem. But then we have this other
set of harms called ethnic violence or breakdown of democracy or conspiracy theories. Let's
bite that problem off and chew it. But what you're saying is they're interconnected. And so I think what
we should do now is just quickly talk about what the drivers are behind all of these problems.
So just to share a little bit of our perspective: starting in 2013, when I was studying the attention economy at Google, it was pretty clear that there were these key drivers that were going to drive the world culturally in specific directions, because you have a certain set of base facts. Okay, we have a private company worth a trillion dollars whose business model means they have to get more attention every year.
Their business model is ads. So the more attention you give, the more ads they give.
And each of these companies has an EGO, an embedded growth obligation. They have to grow every year and every quarter. With that EGO also comes the question: growing what? Do they grow just the data that they can collect? Well, yes. But they also have to grow the amount of attention that they get from humanity.
And if I don't get that attention, the other one will.
And so if Instagram doesn't add a beautification filter to match TikTok in the arms race
for teenagers' mental health, Instagram's just going to lose the arms race.
And so it's pretty simple game theory.
But when you then say, okay, if I don't do the three-second videos versus the 30-second videos,
I'm going to lose to the guy that does the three-second videos.
So when you play that out, this race for attention, starting in 2013, the reason that I came out, my version of Frances's story, is that we can predict the future. I can tell you exactly what society is going to look like if you let this race continue. Population-centric
information warfare, weakening teenage mental health, shortening attention spans, beautification filters,
unrealistic standards of beauty for teenagers, more polarizing extreme content, more conspiracy theories.
These are all predictable phrases that describe a future that if you allow this to continue,
I can tell you exactly what the world's going to look like. And part of the reason we're all here today is to not just talk about those problems. We want to solve them, because we know that this leads to a total dystopian catastrophe novel that unfortunately is playing out every day.
And I want to unpack that a little bit. We've heard people say things like, they're intentionally designing these systems for anger, they're intentionally designing them for division. One of the things that I was really struck by when I went to Facebook was how kind and conscientious the people who work there are. You know, the kind of people who work at social media companies are people who value connection, right? They're not, you know, shadowy figures. But what Tristan's
talking about here is the market incentives, the fact that these are private companies that we're asking to run critical public infrastructure in a completely non-transparent way. We're asking them to maintain public safety, to maintain national security, when those are cost centers, not profit centers. And so you end up in a situation where they want to do better, but because they have
to meet these market incentives each year, it's hard for them to get there.
So I guess the question I have for you, Tristan, is, like, what conversation should we be having then?
So in preparation for answering that question, I think one thing we have to notice is that, per the E.O. Wilson quote that we always go back to, the fundamental problem of humanity is that we have paleolithic emotions and brains, medieval institutions, and accelerating godlike technology. And I've repeated it thousands and thousands of times because of how true it is and how deep it is as an insight for how we then go about solving the problem.
And part of the medieval institutions is that law always lags the new tech.
We didn't need a new conception of privacy until ubiquitous cameras started getting rolled out in the 1900s.
We didn't need a right to be forgotten until new 21st-century technology could remember you forever.
So one of the problems is that we have technology moving so fast, and in the current regulatory environment the thinking is, okay, well, I have these existing moral philosophies of privacy and data protection, and these are good. We want these things. But notice that, you know, the breakdown of teenage mental health, or extremism in Ethiopia, or this arms race for attention and engagement, is an adjacent and slightly different set of areas. And we don't have laws or moral conceptions for those areas.
And so often when we write laws, we write them about externalities, right? That when we have a system operating in isolation, there are incentives where these... well, what is an externality?
So an externality is when there is a cost. So let's say Facebook's going ahead. They're getting you to pay attention. You're going to click on ads. They get money for those ads. They are offloading onto you, though, the anxiety that's building in your heart, the child that took their own life, the political division at Thanksgiving.
Those don't show up on the balance sheet of Facebook. They don't have to deal with the Thanksgiving conversations that don't work anymore.
And the thing we want to emphasize is that there are really good, really simple, practical solutions that would reduce a lot of these problems.
It's things like that escalator where you keep going for more and more extreme content.
When we talk to pediatricians, when we talk to child psychologists, they say, kids get that this is happening.
You know, they get that when they go on there, they feel more anxious.
They get that it's making their eating disorder worse.
But they're being forced to choose between their past and their future, right?
They can give up their account, but they have to give up all their friends and connections.
They have to give up all their past memories.
And kids aren't willing to give up their past for their future.
You know, they should be allowed to reset the model anytime they want to.
Any of you should be allowed to reset your model.
You should have that right, even if it's going to make Facebook less money.
It's things like saying, how do you put mindfulness in the sharing process?
Do you require people to click a link before they share it?
Or things as simple as what level of hypervirality do we want to endorse?
You know, when something gets beyond friends of friends, imagine a world where instead of having a little reshare button where you can keep spreading the misinformation, we said, we value choice, we value intentionality. You can say whatever you want, but once it gets beyond friends of friends, you have to copy and paste if you want to spread it further. That change sounds pedantic. You're like, Frances, why are you asking me about colors on share boxes or share buttons? The reality is that simple change has the same impact on misinformation as the entire third-party fact-checking program. Only now, no individual is saying this is a good idea or a bad idea.
So let's actually break that down, because this is a profound statement that we were both... you said it first, it came out of Frances's disclosures to the Wall Street Journal, that in Facebook's own research, simply taking away the share button and having you say, I can still copy and paste the text manually and share it again, but adding that one piece of friction in, where I have to share manually, I have to intentionally do it, not mindlessly. We're talking about a tiny change, something that a JavaScript engineer can spend a day on and it's done.
And that would be more effective than I think you said in the documents, a billion dollars spent on content moderation and all the other sort of trust and safety.
I don't know about all that, but the third-party fact-checking program where they pay journalists to go out there and write articles and say, this link, this concept is no longer allowed on our platform.
So I think this gets to the point then.
So why wouldn't a trivial change that an engineer could make in one day?
Why isn't that happening?
So this comes back to this question around externalities and incentives.
This is the reason why we have to push for things like platform transparency, right? So PATA, the Platform Accountability and Transparency Act, would allow us to see inside those companies. You know, what would it look like in terms of what people would be willing to stand up and demand if they could see that data for themselves, instead of just taking my word for it or looking at the documents that I brought out?
So the Platform Accountability and Transparency Act, or PATA, has not yet been introduced in Congress,
but it's a bill with bipartisan support
that obliges tech companies like Facebook
to open their data to researchers
so that we can actually study the effects of these platforms
in a meaningful way.
The bill came about in direct response
to disclosures from Frances and other social media whistleblowers.
We don't want to live in a world
where we have to wait for the next Frances Haugen
or the next whistleblower
to know what's going on inside these platforms.
We have to have the ability to have those countervailing incentives
because otherwise the profit motive
will just keep pushing away from these really simple, sensible changes.
And the great irony, and I've had to repeat this in every interview since then, one of the core parts
of my testimony was the idea that when we focus on content moderation, solving these problems
after the fact, it doesn't just distract us from real solutions. It leaves behind everyone who doesn't
speak one of the 20 biggest languages in the world. And that's what causes that ethnic violence
that I talked about in the beginning.
So one of the things that I find so ironic is when Frances came out, I was actually in a hotel gym in New York, on the elliptical, working out, and I was watching live as she gave her Senate testimony. And I started crying because I was so moved by the fact that she had done something that I had never seen happen, which was she got bipartisan consensus. And you had senators from the left and the right saying, we don't disagree, we're going to solve this problem. And then the next day, stories went viral on Facebook and Twitter saying that Frances was some kind of operative, or that she was actually a plant by the government to justify more censorship, or that Facebook planted her, it was like more and more bizarre stuff.
And she had a meeting the next day with one of the major senators who the day before had agreed and said, we've got to do something about this. And, I won't say the name, but this is someone who, like, doesn't typically agree on these issues.
And the senator canceled the meeting.
And one of the reasons I wanted to bring this story up is that when people so distrust your motivations as a person... the point is that our society runs on trust, and Facebook and Twitter are trust-degrading machines, because they reward those who are more and more innovative at coming up with new species of cynicism and distrust. You will be paid 100x more likes, followers, rewards, and reach than if you do not do that. And that's what happens when you set a society up like that. I think one of the things that's actually hopeful about that is we've been living
in this 24-7, I was a magician as a kid, we've been living in this 24-7 magic trick that everyone's
cynical about everything when really there's a handful of people who are cynical and we gave them
a hundred-x reach. And so there actually are trivial solutions, like the things that Frances talked about.
She just mentioned one. There's a hundred things that we could do and we could do them tomorrow.
And we'd agree on them. And the disagreement is an illusion. It's a magic trick.
So I want to, I think we're running out of time, so let's hit on just one or two more quick things.
First, national security.
So one of the big themes for this council is on national security.
And people need to understand what we're facing now is a national security problem.
We had a major whistleblower come out in the last couple of months.
So Peiter Zatko, I always mispronounce his name.
He came out and he said, Twitter is running on duct tape and gum.
Their data centers are running hundreds of thousands of computers on out-of-date software,
and they can't staff the people to update them.
But if you took out one or two data centers at certain times,
all of Twitter could have been lost permanently.
Here's a clip from Twitter whistleblower Peiter Zatko's testimony
before the Senate Judiciary Committee this past September.
And we'll include a link to the whole thing in our show notes.
I'm here today because Twitter leadership is misleading the public,
lawmakers, regulators, and even its own Board of Directors.
What I discovered when I joined Twitter was that this enormously influential company
was over a decade behind industry security standards.
The company's cybersecurity failures make it vulnerable to exploitation,
causing real harm to real people.
And when an influential media platform can be compromised by teenagers, thieves, and spies,
and the company repeatedly creates security problems on their own,
this is a big deal for all of us.
When I saw this testimony, a giant red light went off for me. Because the thing that we don't understand is that right now, like, we're going to Mars with SpaceX because there is a pipeline, a supply chain, of aeronautical engineers and PhDs. Every security professional that worked at Twitter learned it on the job. They learned it at Twitter, they learned it at Facebook and went to Twitter, they learned it at a handful of other companies. And Twitter is not profitable or sexy enough to get enough really basic security people. We have a huge supply chain problem, and it is endangering our national security.
One of the things you told me, and then I want to switch to optimism, is when you were working
at Facebook, you left, I think, after the 2020 election. And you said up until that point,
we had these teams that were staffing election integrity, election safety. And then you said,
after the election, everyone was so burnt out. It was like we lost the senior neurotechnicians for the brain implant, and now we've got the team that's left to try to keep handling our systems.
And because of this talent issue you're talking about, people also get burnt out. They're aware
of the problems that we're talking about. And so there's this kind of moral internal issue,
where it's very hard to actually get people who can solve these problems, especially through the
growing success of this kind of movement that's revealing that there's a problem. And so we do
need to have more moral courage and more people working at these companies. That's why Frances said at the beginning, I want more talented people working at Facebook, not less, as a result of me coming out.
One of the reasons why the disclosure is so large, it's 22,000 pages, like, I literally wore a back brace at points because I was hunched over my computer to take these photos, it was that hard. The reason I brought them out was that top tier. The only reason why they were able to ask the questions they asked, to have the intuition they had, was because a lot of these people were veterans that had been at Facebook for seven, eight years.
And they were all retiring.
They dissolved Civic Integrity,
and something like two-thirds of our team just left the company.
And a lot of those people didn't go into security in other companies.
They went into other things because they couldn't do it anymore.
Of course, Facebook's recent announcement
that it's laying off 11,000 of its employees
will almost certainly push the company in the opposite direction,
toward less security and more risk,
because they have to let go of many of the people
who work on the trust and integrity teams that exist to help keep the platforms safe.
We have military academies because we understand that national security is so complicated
and nuanced that we have to train leaders who understand the complexity of how to keep the nation
safe.
So I think these problems can feel very overwhelming.
They're massive.
We need scaffolding to fix them.
And so we need a scaffolding model in our minds:
like, how are we going to build up towards solutions?
There's triage solutions, which are things that existing companies can do today, sometimes in the span of a day, an hour, a week, or a month. When, for example, Apple makes changes to its privacy features, that can take $10 billion a year off of Facebook's revenue and the entire surveillance capitalism kind of model, with Apple kind of being a short-term regulator and interim solution. There's solutions on teen mental health where we could turn them on in the next two weeks and probably cut the suicide rate the next year by 25%. Like, that sounds ridiculous. I could put a whiteboard up here and just lay it out for you.
Yeah.
So there's triage solutions, and we should do all those.
We should make lists of them.
And frankly, we shouldn't have just a handful of people in this room.
We should have a collective intelligence, almost like an incentive prize.
We want as many people in the world saying, within the boundaries of the current incentive model, what are the immediate things that would make the biggest difference, that, for the sake of getting more good-faith returns from the public, these companies can do?
Then there's transition, which is how do we actually set things in motion to catalyze a shift in the incentives and the structure, you know, things like passing the Platform Accountability and Transparency Act. Just like we have cybersecurity requirements now growing on companies, can we have different kinds of requirements, you know, on national security from government?
And then we also have long-term solutions. In the long-term category, I think what we need to do is get from a thinking model of how do we get to less toxic social media, to instead tech plus democracy equals stronger democracy.
Because right now, China is employing, and the CCP is employing the full suite of exponential
technologies to make a new form of 21st century authoritarianism.
When we look at democracies, you know, we're not employing the full suite of 21st-century technologies to make new and upgraded forms of democracy.
We're instead allowing private business models to profit from the degradation of democracies.
So the long term has to be, what is a vision for a digital democratic future?
And to end on an optimistic note, because, you know, I talk about things like ethnic violence a lot,
and I have noticed it can be a downer.
So to end on an optimistic note,
because we can't change if we don't have hope,
I want to remind everyone,
our democracy has gone through crisis before.
I have only had the right to vote as a woman
for about 100 years.
Contemplate that.
That's insane.
African Americans basically didn't have the right
to vote in large parts of this country
as recently as the 60s or even more recently.
We have had crises before.
And the thing about democracy is that it's resilient,
that democracy is a tree that grows because we tend to it, but it can heal, it can grow back.
When it comes to new forms of media, every single time before when we have invented a new form of media,
it has been incredibly disruptive. It might get substantially worse before it gets better.
But every single time before, we have figured out how to respond. You know, when we invented the cheap printing press, newspapers, we literally had wars over misinformation. But we invented journalism schools and journalistic ethics, and transparency rules over, like, who owns what and how much concentration you can have.
When we invented the telegraph, and this sounds crazy, the telegraph did not cause the Civil War, but historians believe it influenced when the Civil War happened, because suddenly people could see that they disagreed with Washington, that they disagreed with Maine, that they disagreed with, you know, Minnesota. Suddenly we had one nation and we had to work through those problems together.
Radio. Radio gave us personal relationships with our leaders. And it led to a lot of the movements
that caused World War II.
This has happened over and over and over again.
And the reason why it feels overwhelming right now,
but it feels like we don't know how to get out of it,
is because this is our time to shine, right?
This is our moment to stand up and do something.
And we are going to figure out how to get out of this,
but it takes work, and we have to do it now,
but I have faith that we're going to make it through this.
So thank you.
Let's solve it together.
The Council for Responsible Social Media will be working to pass the Platform Accountability
and Transparency Act in the next Congress, as well as social media safety regulations like
the Kids Online Safety Act, or COSA, and its next campaigns will advocate for other concrete
actions that social media platforms can take to ensure that they're not undermining U.S. national
security. To learn more about the Council for Responsible Social Media, which is a project of Issue One, you can follow the link in the show notes, and we'll also post a link to the interview we did with Frances Haugen back in October 2021.
Your Undivided Attention is produced by the Center for Humane Technology, a non-profit organization working to catalyze a humane future.
Our senior producer is Julia Scott. Mixing on this episode by Jeff Sudakin.
Original music and sound design by Ryan and Hays Holladay, and a special thanks to the whole Center for Humane Technology team for making this podcast possible.
You can find show notes, transcripts, and much more at HumaneTech.com.
A very special thanks to our generous lead supporters, including the Omidyar Network, Craig Newmark Philanthropies,
and the Evolve Foundation, among many others.
And if you made it all the way here,
let me just give one more thank you to you
for giving us your undivided attention.