Pivot - Frances Haugen at Code 2022
Episode Date: September 14, 2022. Frances Haugen, the former Facebook employee who turned over a trove of documents known as the Facebook Files, speaks with Kara Swisher and Casey Newton at Code 2022. What was the impact of Haugen's... disclosure, and what does she still wish to see? And, ultimately, was it worth it? Recorded on September 6 in Los Angeles.
Transcript
Support for Pivot comes from Virgin Atlantic.
Too many of us are so focused on getting to our destination that we forget to embrace the journey.
Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in.
On board, you'll find everything you need to relax, recharge, or carry on working.
Lie-flat, private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you.
Check out virginatlantic.com for your next trip to London and beyond,
and see for yourself how traveling for business can always be a pleasure.
Hi, everyone. This is Kara Swisher, and today we've got a bonus episode for you.
It's my conversation with Facebook whistleblower Frances Haugen from this year's Code Conference. Casey Newton
and I spoke with Frances about the impacts
of her disclosure and what changes she'd still
like to see. I interviewed her first on
CNN Plus. Do you know I had a show there?
A show on CNN Plus? No, I didn't know.
I heard it got canceled. Anyway,
a little unexpected reminder during the conversation
about the crucial role of technology.
Everyone's phones went off with an alert about the California power grid potentially going offline.
You'll hear it. Enjoy.
It's now been basically a year since you made your revelations.
What practical effect do you think came about from your coming forward?
And I think one of the biggest things that happened was there is a major generational piece of legislation
that passed in Europe called the Digital Services Act,
which I think is a landmark bill
in terms of being focused on process and on transparency
versus, say, having prohibited types of content.
And I think the thing that I'm most grateful for
is Europe didn't have a whistleblower law
prior to last year. And they
said explicitly that the information in my disclosures showed them the importance, especially
at a time when more and more critical parts of our economy are driven by these opaque systems,
of having protections for whistleblowers. So that's probably the thing I'm most happy about.
Yeah. And what hasn't happened that you wish might have happened in the past year?
Oh, I think the stories around the mental health impacts on kids in the United States have really landed. But the core thing that brought me forward originally was we forget here in the United
States where we have good sources of independent journalism, like The Verge, that in lots of places, particularly on non-English
parts of the internet, Facebook is the internet still. And Facebook's underinvestment in safety
and security plays out with really catastrophic consequences in places like African countries
or in Southeast Asia. And I think there's a huge discussion that really hasn't come to fruition yet
that we have to keep drawing
attention to about that. Well, one of the interesting things is the way I started to
pay attention in writing about all these problems, sort of screaming about what was happening at
Facebook back in 2015, is when Maria Ressa came to me. She had been trying to get the attention
of Sheryl Sandberg and Mark Zuckerberg on these issues. And she called me and said,
they seem to listen to you. So could
you tell them this, this, and this? And she sent me a bunch of data. And she said, we're the canary
in the coal mine. And I have to say, it changed. I made a call right away to Charles Emmerich saying,
this is, you need to see this woman. You need to talk about it. And it was the Philippines,
obviously, where she is now under trial, possibly will be jailed again and again and again.
So when you think about what
Casey was asking, do you feel like you made a difference? It got a lot of attention. I often
feel like I get a lot of attention. And you, of course, did it in a quantum way with all the
documents and everything else. But not a lot has changed from my perspective. How do you look at it?
So the way I look at it is,
you know, when I went to business school,
one of the biggest things that changed for me
was how I viewed time, right?
So in Silicon Valley,
we look at things and say,
if something doesn't happen in two years,
it won't happen, right?
If it doesn't happen maybe four years,
it won't happen.
But the reality when it comes to things like regulation
is the arc of history is much longer.
And when it comes to social platforms, our entire oversight process has been cut short
because the public muscle of accountability never got built.
So the most important thing about the DSA is it demands access to data from the public
for the first time.
And we are starting 15 years behind or 20 years behind. And so it's
going to look slow for a little while. But ideally, you know, starting January 1, when data access
begins, we're going to begin being able to have the public have their own theories on how to reform
these things, be able to ask our own questions. And so I think that's the most important catalyst.
Speaking of things that take time, you filed a bunch of documents with the SEC
in hopes that they would investigate. What can you tell us about what you know that the SEC may or may
not be doing, and have they granted you whistleblower protections? So by the act of filing,
I have whistleblower protections. The SEC is a federal agency. These are federal investigatory processes.
They have a history of being incredibly opaque
until they bring sanctions.
And so I can't comment on my interactions with the government,
but the fact that it took 18 months
from when Cambridge Analytica came out
to when they did their first sanction,
which was a lightning-fast process.
So the soonest we could even imagine seeing something
on that front would be another six months.
Yeah.
Meaning investigation, meaning something.
No, I mean like with Cambridge Analytica
from the day that the story broke
to the SEC announced sanctions, it was 18 months.
So that's like the fastest we can imagine the process going.
So, you know, if we still don't hear anything a year from now,
I would still not be surprised. But six months even would be very, very fast. What would you expect them to do?
So the case that we've made for them is that Facebook's share price was artificially inflated
because people used the product who wouldn't use it had they known the truth. There were advertisers
who would not have advertised had they known the truth. And that Facebook was able to spend less on safety systems than they would had they told the truth.
When you look back over the five years before my disclosures came out in the Wall Street
Journal, there had only been 27 instances where Facebook's stock price declined more than
5% versus the NASDAQ over those five years. And something like 60 or 70% of the time, it was
because they announced they were going to spend more on safety, or their user numbers had gone down.
And so I think there's a case to be made that the share price is artificially inflated,
because it's gone down more than any of the other FAANG stocks except for Netflix. And it went down
substantially more than the other social media companies did. Right. So assuming that this
complaint moves forward, what do you hope comes
out of that specifically? What specific changes do you hope might be made at Facebook or a lot of
these other platforms that have all of the same issues? So my personal fantasy, and I have no idea
how realistic this is, involves the SEC's disgorgement powers. Right now,
Mark Zuckerberg has 56% of all the votes at Facebook, to the point where he actually introduced a whole other class of shares
just so he could sell his shares without losing control.
So I think it would be amazing if he was required to sell some of his shares
and put them in trust, because then we could actually have
the normal corporate oversight processes take effect.
The shareholders have been voting for one share, one vote for years.
This might be a way to do it.
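The voting math behind that 56% figure comes from a dual-class share structure, where one class carries 10 votes per share. A minimal sketch with hypothetical round numbers (not actual filings) shows how a roughly 13% economic stake can still carry a majority of the votes:

```python
# Illustrative only: share counts are hypothetical round numbers,
# not taken from Facebook's actual filings.
# Class A shares carry 1 vote each; Class B (super-voting) carry 10.
class_a = 2_300_000_000    # hypothetical Class A shares outstanding
class_b = 440_000_000      # hypothetical Class B shares outstanding
founder_b = 365_000_000    # hypothetical Class B held by the founder

total_votes = class_a * 1 + class_b * 10
founder_votes = founder_b * 10

economic_stake = founder_b / (class_a + class_b)  # share of the company owned
voting_power = founder_votes / total_votes        # share of the votes controlled

print(f"economic stake: {economic_stake:.1%}")  # minority ownership...
print(f"voting power:   {voting_power:.1%}")    # ...majority control
```

This is why "one share, one vote" proposals target the super-voting class: collapsing the 10:1 ratio would make voting power track economic ownership.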
I also hope for things like actual accountability, you know, having to work with the American Academy of Pediatrics around mental health, kind of the kinds of things
that happened after tobacco. Right. Yeah. You mentioned mental health. Obviously, the regulators
seem to seize on that more than I think anything else in your revelations and have proposed a variety of regulations around the world. In response,
the company said like, this was taken out of context. This was a very small survey.
What did you make of the company's response and how has our understanding of those issues,
do you think, advanced over the past year? What I find fascinating about some of those
responses is, so in that first round of documents the Wall Street Journal published, they published six of a large number of documents that dealt with teenage mental health.
Facebook knew what those documents were.
They promised the Wall Street Journal they wouldn't try to front run it.
They wouldn't release it on their own.
The day before they released the documents, Facebook put up two of the six documents.
And those were the two with the smallest sample sizes.
The document
that had 150,000 participants was not published, right? So it's not that all the documents have small
sample sizes. It's the ones that Facebook is trying to draw our attention to. But the focus
on that versus the things happening in other countries, you yourself even said,
this is an issue, but not like this. Talk about the rollout and how you looked at
how the press covered it, how the politicians reacted. You were very fêted. You got a lot
of attention, but a lot of it was on that issue. Did you mind that being the issue?
So it's interesting. I actually care a lot more about the teenage mental health
issue now than I did a year ago. And part of that is I have had the honor of getting to like meet with the parents of kids who've died, things like that. And I don't personally have children. I hope to
one day. There is no pain as bad as like losing a child. And I appreciate a lot more now how severe
the harm is to our nation's children. I think one of the reasons why it has resonated so much is
if you talk to a parent of a 13-year-old, almost all of them are deeply, deeply concerned.
And it's one thing to worry about large-scale communal violence thousands of miles away. It's
another thing to watch your niece suffer, watch your daughter suffer, watch your neighbor's child
suffer. And I think
that's why people move so much quicker on kids in the United States. And you yourself, when you were
talking about it, one of the things they did was attack you. Either you weren't in the room.
Oh, I was a low-level employee that never was in a C-suite meeting. Yeah, you weren't in the room.
That's essentially one of the favorite moves of Silicon Valley. Everybody does that
in Silicon Valley.
The other was you didn't know, it wasn't your area of expertise, which you said it wasn't
several times, and that it was skewed, as he said.
Talk about the control on you in terms of doing this and the impetus.
They also tried to imply you had all kinds of plots.
Who were you working for?
And so my partner really likes conspiracy theories. He more freely forages on the internet
than I do. And he loved, so one theory was I was a crisis actor. My favorite one was there's no way
I could have been that good. And for context, so I did National Circuit Debate in high school.
I taught it for four years.
My coach was so good, he became the head of the National Debate Association.
And these little details don't come out in those contexts.
But I think the larger one was, I never intended to come out.
And that's why I did things like, I didn't just photograph the documents, I photographed all the comments on the documents, because I wanted
Facebook's employees to be the ones saying this is real. And I want to say contemporaneously,
when people were discussing the contents of these documents, no one was questioning whether they were
true. You know, you also took another leap that I thought was interesting. So I was, you know,
one of the journalists that was able to get access to these files after the journal reported. And
something that I found remarkable was how little control you attempted to exert over what journalists
did with any of it, right? It was just sort of like, here are the documents, you know, let us
know if you have any questions. Talk about making that leap and how you felt about the aftermath.
Do you feel like you got what you wanted out of putting that kind of trust in journalism? I was really, so part of the reason
why we made them more accessible, so I always intended at least to have non-English journalists
get access. There were certain issues with the rollout in terms of I had a very chaotic summer
last year, which one day if you read my memoir, you'll see why.
But I really believe in journalism as a critical component of democracy.
And if you try to control journalists, you are undermining that immune system function of journalism in democracy.
And so I was incredibly, incredibly honored at how hard you guys worked.
I saw how seriously you took the process.
And I am incredibly grateful for the amount of resources, the number of publications invested.
Let me ask you, in that regard, Casey likes that.
I wasn't fishing for a compliment. It's just like, if I were in your position, it would be kind of scary to just, you know,
because theoretically, journalists could reach opposite conclusions and say, like, I don't know, there's not a lot here, right?
Well, that's part of the reason why I wanted these documents
outside of Facebook, though, right?
Which is that when we talk about, like,
we put journalism programs in junior highs
because we believe the general public should understand
the process of journalism.
We don't, like, one of the things that I'm working on now
is a simulated social network because I think every student should have a chance to play with how we structure social
media platforms, because little choices radically change how information flows.
Right. So on that, I want to talk about impact. Do you think it changed
their minds? They just changed their name. Oh, so there's little...
And their stock ticker.
Don't forget about that.
Their stock ticker, right.
That's a big thing, too.
Which has not been doing well.
So there's some things that we've seen from them
that do look like meaningful changes.
So they had years to roll out
any parental controls on Instagram, right?
They could have done this 10 years ago.
And the first parental controls they rolled out were a couple months after my disclosures. So there are things where it sounds like they've been listening. There are other
things that are really sad. Like they've further dissolved various parts of election integrity.
They've invested less and less in, you know, responsible AI. So I think there are things where
they haven't really learned the right lesson,
which is, I know,
because I've talked to these consultants,
people were warning Facebook for years,
if you hide the dirty laundry,
eventually the dirty laundry will get aired
and it'll be worse than it will be
if you just fess up to it now.
And I think instead of coming back and saying,
hey, we're running critical,
vital infrastructure, like we are the internet for at least a billion people online today,
people who live in societies that make up three or four billion people,
maybe we should be more participatory. I don't think they've learned that lesson yet. And why is that? I know when I talk to them, they feel like victims continually. I mean, I'm done, I'm finished with them.
But you were dishonest, you were taking advantage of them,
which I'm sort of like, how in the world do you take advantage
of a multi-billion dollar corporation?
But okay, if you're going to believe that.
But where is the problem of that?
Because at some point, too much beating up, absolutely, no question, and unfairly.
At the same time, there is a moment where companies, I'm thinking Airbnb, go, oh, I really need to fix it.
We have to do something different.
Right.
So the thing people talk about hitting bottom is, in fact, you can keep going down, right?
You think you've hit bottom.
There's actually more.
I think the issue with Facebook is they haven't yet admitted that the way that they were doing business is what caused their problems, right? That they have this echo chamber internally,
where because Mark is unaccountable, he can surround himself with people who tell stories
like that, who say like, no, no, no, no, you're the victim here. And reality never has to weigh in with consequences. And so I think it's one of
these things where until the incentives change, we should not expect behavior to change.
How would that change? Sorry. Oh, sure. How would it change? So there's a bubble around him. You're
saying a bubble of everyone who works for him, being paid by him, is telling him he's very pretty.
Yeah. And therefore he's...
He's so smart and misunderstood.
He's so smart.
Yeah.
Oops.
Uh-oh.
Is everything okay?
Flash flood warning.
Flash flood.
Electrical.
Uh-oh.
Well, if everything goes off, here we are.
Let's blame Facebook.
It's kind of like how Facebook went down the day after my 60 Minutes interview.
Yeah, they did.
They did it.
So how do you get...
So you have to get to Mark, right?
Correct?
This is really funny.
I don't know, turn off their phones.
So actually, while we're riffing: when it went down, the day after my 60 Minutes interview,
per people who I will not disclose, all the A records disappeared for Facebook.
They disappeared in a way that...
The A, I'm sorry.
Sorry, your DNS records route you around the internet,
like the phone book of the internet,
and Facebook's entry in that phone book disappeared.
And it's very odd because to do that inside of Facebook,
like three different employees would have had
to all sign off on the same change.
So either there was a conspiracy inside of Facebook
or Facebook brought itself down.
And so I just, I wanna know what happened so badly. Like someone one day is going to leak that one. It'll be delicious.
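Her "phone book of the internet" description can be illustrated with a minimal client-side sketch using Python's standard library. The function name is mine, and this only shows the lookup side; the actual October 2021 outage also involved withdrawn network routes, which is beyond this snippet:

```python
import socket

def resolve_a_records(domain: str) -> list[str]:
    """Return the IPv4 addresses a domain currently resolves to.

    When a site's A records vanish -- as Facebook's effectively did
    during its October 2021 outage -- this lookup fails, which is what
    "disappearing from the phone book of the internet" looks like
    from a client's point of view.
    """
    try:
        infos = socket.getaddrinfo(domain, 80, family=socket.AF_INET,
                                   type=socket.SOCK_STREAM)
        # Each entry's sockaddr is (ip, port); collect the unique IPs.
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return []  # no A records resolvable: unreachable by name

# Example (requires network access):
# print(resolve_a_records("example.com"))
```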
We'll be back in a moment with more from Frances Haugen at Code.
Fox Creative. This is advertiser content from Zelle.
When you picture an online scammer, what do you see?
For the longest time, we have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night.
And honestly, that's not what it is anymore.
That's Ian Mitchell, a banker turned fraud fighter.
These days, online scams look more like crime syndicates
than individual con artists, and they're making bank. Last year, scammers made off with more than
$10 billion. It's mind-blowing to see the kind of infrastructure that's been built to facilitate
scamming at scale. There are hundreds, if not thousands, of scam centers all around the world.
These are very savvy business people.
These are organized criminal rings.
And so once we understand the magnitude of this problem, we can protect people better.
One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them.
But Ian says one of our best defenses is simple.
We need to talk to each other. We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send
information that's more sensitive? Even my own father fell victim to a, thank goodness, a smaller
dollar scam, but he fell victim. And we have these conversations all
the time. So we are all at risk and we all need to work together to protect each other.
Learn more about how to protect yourself at vox.com slash Zelle. And when using digital
payment platforms, remember to only send money to people you know and trust.
We're back.
Now, more from Code.
Let's talk a little bit
about what you're doing now.
You mentioned that you're working
on a simulated social network,
which sounds like a sort of
very grand project.
Like, what does that mean
and what will be done with it?
So right now,
the way we teach data science,
I think, is really reductive, right?
So we teach data science
with little toy problems where I think the biggest problem with the methodology is they all presume there's
answers. So when we do real industrial machine learning, it's not like there's a clean yes or
no on do we ship. It's that there's 20 or 30 stakeholders on every change, and some win and
some lose. And we still have to decide, do we go forward? And right now we wait for students
to land at Facebook or at Google or Pinterest
to learn how to think about those trade-offs.
And so there's a number of academics
who have made simple simulations of social media.
We want to build on that history
and be able to teach big, sprawling,
messy data science classes
where we put students in seats where, you know,
if we talk about, should you have to click on a link before you reshare it? It sounds obvious.
It's like 15% less misinformation right there. You didn't have to censor anyone. You just put
a little human in the loop. But Twitter did it and Facebook didn't. So there must be something
more interesting there. I want to have students sit in those seats and argue about it because
we need 100,000 people who understand the physics of social media because that's how we'll design safe social media.
Meaning that they could make different choices. Instead of engagement, virality, and speed,
you would have context, accuracy.
Or even really simple things, right?
Like if you say, okay, so let's say you write something.
It lands in Casey's news feed.
He shares it.
It lands in mine.
Now it lands in some rando's.
That person doesn't know you.
You don't know them.
Imagine if they could still say whatever they wanted,
but they had to copy and paste to do it.
We just put a little bit of a speed bump.
That little bit of friction, that chance to contemplate,
has the same impact on misinformation as the entire third-party fact-checking program.
And it's not obvious.
It's not obvious that a change that small would have that big an impact.
But that's why we need more people who understand that.
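The effect she describes, a small sharing speed bump shrinking how far content spreads, can be sketched as a toy branching-process simulation. Every number here is an illustrative assumption, not a Facebook figure:

```python
import random

def cascade_size(reshare_prob, depth=8, fanout=4, rng=None):
    """Toy spread model: each viewer reshares with probability
    reshare_prob, and each reshare reaches `fanout` new viewers.
    Returns the total number of views in the cascade."""
    rng = rng or random.Random(0)
    frontier, total = 1, 1
    for _ in range(depth):
        reshares = sum(rng.random() < reshare_prob for _ in range(frontier))
        frontier = reshares * fanout
        total += frontier
        if frontier == 0:
            break
    return total

rng = random.Random(42)
# Baseline: frictionless one-click resharing (assumed 30% reshare rate).
frictionless = sum(cascade_size(0.30, rng=rng) for _ in range(200)) / 200
# Friction (e.g. copy-and-paste): assume it cuts resharing by a fifth.
with_friction = sum(cascade_size(0.24, rng=rng) for _ in range(200)) / 200

print(f"avg views without friction: {frictionless:.1f}")
print(f"avg views with friction:    {with_friction:.1f}")
```

The interesting part is the nonlinearity: at a fanout of 4, dropping the reshare rate from 30% to 24% moves the cascade from supercritical (each generation grows) to subcritical (each generation shrinks), so a small per-user speed bump produces a large drop in total spread.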
So do you imagine there will be – I talked to the lawyer for Google.
Oh, I'm totally blanking.
David Drummond?
No, no.
Kent Walker.
It's a woman.
Okay.
Of course, it was a woman because it was smart. She was talking about doing slow internet,
the idea of slowing everything down. And her premise, this was 10 years ago, we did a podcast on this, was that engagement,
virality, and speed were the design. It's about design, really. And if you change it to context and
accuracy, speed is always there, people want speed, it changes the entire experience. It's how you could make something
different, a building, essentially. Is there any impetus to make it different? Why would there be if it's just to do
good? I mean, that's not something
you hear a lot. It's like, let's sell ads, let's make money, let's do this.
So let's imagine we rolled back in time to the Facebook of 2008. So one of the things that makes
me feel kind of old is I go to college campuses now and I'll get up on my soapbox and be like,
do you remember the Facebook of 2008? And these like 18 year olds will look at me with big eyes
and be like, no, I don't. But the Facebook of 2008 was
about our family and friends. It was the Facebook that if you stopped a random person on the street
and said, what is Facebook? It was that version of Facebook. I think Facebook would not be seeing
its users bleed away today if it had optimized a little bit more on slower growth, on actually
giving people that connection instead of just seeing how many ads
could they get them to view each day. And so I think there's a lot of investors who want to
figure out how to help these companies be more long-term successful. I think there's a lot of
litigators that are all kind of, they can smell there's blood in the water. All those things are
incentives that can push companies towards
more responsible decisions. And I think building out that ecosystem of accountability, that's what
gives us safe cars. You know, Facebook loves to say, cars kill people. We don't get rid of cars.
But we have a really good federal transportation agency that checks those cars. And we have
litigators, investors, advocates who understand how to hold them accountable.
Yeah, they love that argument. And then when you say that, they're like,
well, they walk off in some fashion. Go ahead, Casey.
Another thing that's interesting is a path you're taking that I think other whistleblowers have
not. Some whistleblowers drop the complaint and then disappear. This is an active set of
campaigns that you're working on, with many projects associated with it.
Is this your life's work?
I think there's this question.
He's also asking, can you ever be hired by a tech company again?
Probably not. That's a no. You know, there's a lot of tech companies out there.
And it turns out good data science managers are hard to find.
No, but I genuinely believe that if we don't figure out
how to do this in a responsible,
sustainable way, like we've seen Myanmar where hundreds of thousands of people died because of
a social media fueled act of communal violence. We saw Ethiopia over the last couple of years.
We're starting to get glimmers of it in Kenya. I think there's tens of millions of lives that
are genuinely on the line in the next couple of decades.
And so, you know, I know there are very few people who understand the human side and understand the algorithms and can organize people.
And so I am going to push on this as long as I think I'm making some difference.
Support for this show comes from Indeed.
If you need to hire, you may need Indeed.
Indeed is a matching and hiring platform with over 350 million global monthly visitors, according to Indeed data.
And a matching engine that helps you find quality candidates fast.
Listeners of this show can get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash podcast.
Just go to Indeed.com slash podcast right now and say you heard about Indeed on this podcast.
Indeed.com slash podcast.
Terms and conditions apply.
Need to hire? You need Indeed.
Thumbtack presents the ins and outs of caring for your home.
Out. Procrastination, putting it off, kicking the can down the road.
In. Plans and guides that make it easy to get home projects done.
Out. Carpet in the bathroom. Like, why?
In. Knowing what to do, when to do it, and who to hire.
Start caring for your home with confidence.
Download Thumbtack today.
How do you think the other social media companies behave, too?
I mean, right now, there's another whistleblower, Mudge.
It's Peiter Zatko?
Peiter, yeah.
Zatko.
He's made claims about the platform's security issues.
Speaking of conspiracy theories,
the timing seems rather interesting.
He's being represented by Whistleblower Aid,
same group that represented you.
Talk about that, what's happening there.
So for those who are not tracking
the latest whistleblower
hijinks out there, Mudge came forward and said Twitter is chronically underinvested in very
basic safety systems. It's not even covering, you know, engineering practices 101. Like, you
could take out a couple data centers and you might not ever be able to bring Twitter back.
There are known foreign operatives working in the company. I mean, the list of problems is quite long. And the thing
that really stood out for me is I think most people aren't aware that there are professionals
in software who work on safety and integrity. And this is just like how we have front-end designers.
This is like how we have ML engineers. This is like a profession. And right now, the whole industry is dependent on people who are trained in-house to do this job.
And the reason why I want to build a simulated social network is we need to educate 10,000 or 100,000 of these people a year.
Because right now, Twitter can't hire even the skeleton crew to take care of things.
Because there's such a talent war for them.
And so I don't think Twitter's problems are unique. I think it's a question of there's extreme scarcity and companies
that don't have, you know, the balance sheet that Facebook does, they can't even afford it. Yeah, I
think someone said that Facebook was able to hire hundreds of people when they were starting to
see problems, and Twitter can hire three people and a cat essentially, you
know, and not very good ones. It was a very cute cat. Yeah, whatever. But not very talented. But when you face a thing
like that, if it's not safe and there's no accountability and you talk to Senator Klobuchar,
this bill might not pass. Many people don't think it will, even though she's absolutely determined.
How do you then change that if there's been no legislation on technology regulation,
really significant regulation, except in Europe? How do you then, you can do your thing where you've
got a ton of attention, but it flames out. And I think someone at Facebook said, oh, she'll go away.
I remember, and I'm like, again, another person I wanted to punch in the face.
I feel like Will Smith right now. But I think it was here,
actually. No, it wasn't here. How do you deal with that, which is the long game, the long rope-a-dope,
they'll go away, etc.? Well, I'm a little anomaly for a tech person in that I have a history minor,
and I focused on Cold War studies. And you look at the study of the 20th century, and there
were a lot of seemingly impossible things that took place because people worked on them for
decades. So if you had asked anyone in the world in 1870, will Britain ever leave India? No one but
like a thousand crazy Indians in India thought that was ever going to happen, or the fall of the
Soviet Union, or the end of apartheid. The way these things will change is people will realize they could change. And a year ago,
Facebook's narrative was the only narrative. They had spent hundreds of millions of dollars to say,
you have to choose between safety and freedom of speech. That's the only choice. It's like,
oh, you want some safety? It's going to cost you. The reality is there are product
choices, things like that. Should you be able to just reshare infinitely? Should you have to copy
and paste at some point? Should you have to click on a link before you reshare? Which have nothing
to do with freedom of speech. Well, they want it to be about freedom of speech,
don't they? Because they know. Tom Cotton can yell. And Facebook, some of the documents on my
disclosures were of studies about how angry
people get when they get censored. They knew that as long as we argued about censorship, we would
never do anything differently. And so the reason I believe we will find some kind of new social
contract for social media is that people know there's more options now. And that's the first
thing for change. I have one more that I want to ask, and then should we move to audience questions? You mentioned changes to the products,
and there's this huge one unfolding now at Facebook, which is this shift to this discovery
engine and AI-powered feed, basically turning the app into TikTok. Years ago, Mark said that
they had found it was really bad when people are just passively consuming video by thumbing,
and now the entire product is becoming that. Based on what you saw in your documents and what you know from your time
there, what are the risks of a world moving toward these sorts of purely AI-recommended feeds?
I think it is an incredibly important and incredibly pressing question. So Facebook
looks to TikTok and says, oh, TikTok's growing. They have the solution. The thing that we in this
room really need to understand is TikTok can
operate the way it does because it is designed to be censored, right? That algorithmic feed
hyper-concentrates content onto a small enough number of items each day that they literally
have humans scan each one, look at it, and approve it before it goes out. And this has been known for
years. There was a scandal a couple years ago where if you were visibly gay, if you were visibly
disabled, they took your content down.
We're in trouble.
We're in trouble.
To protect you from being bullied.
They really love you, that's why they did it.
But Facebook has internal studies that say if they give you more stuff from your family
and your friends, from pages you actually follow, groups you actually joined, you get
less violence,
less hate speech, less nudity for free. And so the world that they're choosing to go towards
is one where you have to do censorship to be safe. And I think that's a bad idea.
Right, because they will make choices, which they do now. And to be fair, they do it behind
the scenes. They just don't talk about it. They make choices every day with every design they
make. So one of the things, so your point is social media doesn't have to be bad.
It doesn't have to be despair, denial, destruction, dysfunction, everything else.
Do you still own Facebook stock?
You know, I think I do.
Oh, wow.
Because, like, I wasn't supposed to sell it for a while because I was an insider, I guess.
I thought you were too low down for that.
Well, it turns out if you can crash the stock price 50%, maybe you shouldn't sell. But my lawyers told me, don't sell, and so I didn't sell. I was good. SEC, you can hear me. But I think I do still have, like, a small amount. Not a huge amount.
It's like a souvenir.
I have one last question. We talked in 2021, and I asked what you would say to Mark Zuckerberg. You said a lot. Has that message changed? Or, if you want to answer, even better: if you could be Mark Zuckerberg for a day, besides wrestling, what would you do?
I think you look great.
So one of the things that he said recently in one of those interviews... you know, I've been saying this since the beginning: my heart always goes out to Mark Zuckerberg,
because like, I think the reason why he thinks we're all going to spend like all day in the
metaverse is because like he spends all day in the metaverse, right? Like when he walks into a
restaurant, people glare at him, right? If he goes to the grocery store, people glare at him.
He said in a recent interview that every day when he opens his phone, it's like getting punched in the gut. And I think the thing I would say to him is: Mark, you are so smart, you are so capable, you have functionally infinite resources. There's all sorts of things you could do. Do you really want every morning to wake up and feel like you got punched in the gut?
Yeah.
Is there something else you could do with your life?
Yeah.
Because I think you might be happier.
You're so nice.
I'm like, stop, get out of the way of my foot, my friend.
Or something else.
So you really think he can do that?
I think at some point, the body keeps the score.
He can keep being in denial and he can keep hurting himself.
And as long as he keeps hurting himself, he'll keep hurting us.
And so I think the reason why I can be compassionate to him
is I don't think people change because we yell at them.
I think they change because the cost of changing
is less than the cost of not changing.
On that note, wow, you're a nice person.
Thank you.
Questions from the audience.
Questions, please, for Frances.
Hi, Frances.
Lauren Good from Wired.
Hi, Casey and Kara.
So I wrote a story for Wired last year
that was about our relationship with memories online,
and a lot of it was focused on what was Facebook then, not meta,
and how Facebook was trying to keep us super engaged in its app
by constantly resurfacing things from the past
that in some cases we don't necessarily want to see.
So in Europe, there's obviously the right to be forgotten.
And I've wondered if there's potential for something around the right to forget
and what that looks like.
So I'm wondering if you can put into context
what you think Facebook is really doing and other social networks
with basically
monetizing our memories and what a better version of that looks like in the future.
I mean, the reason why they have those features is really simple. It's retention, right? That
they can remind you, oh, remember that effort you went to in the past of putting stuff on Facebook?
Like we have very complicated ML algorithms to guess what are the things that if we show them to you again, you'll reshare them,
right? And so they can tell, like, what tugs at your heartstrings. Is it your new baby? Is it that
moment when you were really happy? And, you know, these systems are very nuanced and they're very
good at emotionally manipulating us. And I think the question is,
you know, if we were to design to respect autonomy and dignity, what kinds of different choices would
we make? And I don't know the answer to that question yet, but I think having some accountability
for the public in that conversation, I think is really important. So that's not something that
you are specifically focused on as part of? So I am an algorithmic specialist.
And like the thing that I feel like I can give to the public conversation is the physics of
social software is very counterintuitive. You know, it's driven by things like power laws,
like very small changes on the margins can have very different consequences. And that's the kind
of thing that I can make a difference in terms of educating people, like making them realize they have many more options. And I'm not a social
psychologist. I'm not even a UI designer. Like the products I build are invisible, right? And so
that's the place I can help. Great. Thank you. Are there any other? Okay. I was going to say,
I would keep going, but go ahead. I'm sorry. Thank you, Lauren. Frances, thank you for being here.
I'm curious, and not to minimize the importance of design
in creating outcomes here,
but can any design ever be more important
than the business model of the network?
Yeah, great question.
And as long as the business model doesn't change,
won't it always force design to the same results?
No.
So I think there's a couple different things there. So unquestionably, if the incentives do not change, we should expect no change from these companies. But if we were to look at, say, an auto
company, some of the tools that are available to us for any physical product are things like we can
buy the product, we can buy a car, we can take that apart, we can put sensors on it.
If Volkswagen wants to lie about their emissions,
we can challenge those.
And because we can form our own knowledge,
we can form our own theories,
we have a public muscle of accountability.
We build that.
We have never gotten to do that with social media,
and that's why litigators don't know how to bring
product safety
or product liability lawsuits. It's why even regulators don't know how to write effective
regulation, right? The only places you learn these skills, you learn things that I learned,
is by working at these companies right down in the trenches. And so we need to be able to bring
way more people to the table so we can begin talking about what are the right incentives
to move these companies towards the public. That's a great answer. Was it worth it, Frances?
A hundred percent. Like, there's nothing in the world that feels as good as giving another person hope, right? And I meet so many people who... like, I'll start the conversation off and they'll be angry and they'll be frustrated, and I'll talk to them for, like, 20... I'm on planes. I'm a plane talker, I'm willing to confess that.
Don't sit next to me.
Yeah. Um, but I'll talk to people and they'll feel hopeless. Like, oh, we're just doomed, right? They buy into the Facebook narrative of, you know, freedom of speech or safety. And I love that I get to watch people go out into the world believing that the world could be different. And so, I don't know, it was totally worth it.
Yeah, you're from the Midwest.
Thank you, Frances.
We'll have more conversations from Code in the feed.
Stay tuned.