A Bit of Optimism - The Asymmetry of Power with Tristan Harris
Episode Date: November 30, 2021
A small number of tech companies have amassed enormous power in our modern world. Do they wield that power responsibly, or do they abuse it to benefit themselves? I talked to Tristan Harris, "the closest thing Silicon Valley has to a conscience," to better understand what we can do about our social dilemma. This is... A Bit of Optimism.
If you want to know more about Tristan and his work, check out:
About | TRISTAN HARRIS
Center for Humane Technology
https://www.thesocialdilemma.com/
Transcript
If you saw the documentary The Social Dilemma on Netflix, then you heard a lot from Tristan Harris.
He used to work in big tech, and now he's on a crusade to tell us all about the dangers of social media and what we can do to protect ourselves.
This is A Bit of Optimism.
Hey Tristan, thanks for joining me. You've already had an incredibly intense year.
Can we start with your experience with the California wildfires?
Yeah, I had been based in San Francisco, and then when the pandemic hit, I actually moved into my family home that I grew up in, up in Santa Rosa, which is in Sonoma County.
And two weeks after The Social Dilemma came out, a huge wildfire came through Calistoga and Sonoma
County and basically overnight incinerated everything that I know and grew up with.
And we had to evacuate. We had 30 minutes
to leave. It's crazy to go through something like that, you know, waking up the next day and getting a
text message from your neighbor saying, it's gone. Two other houses in the neighborhood also burned.
Many were spared. We were one of the unlucky ones. But as is true in existentialism,
the meaningfulness of something is what happens
afterwards and the meaning you continue to make. And so I think I'm trying to turn it into the
sweetest thing that's ever happened by rebuilding a better life and living situation afterwards. So
it's been a crazy year. Did you have a chance to mourn? And what about your family? I mean,
this was the house you grew up in. So this was the house that I grew up in with my mother.
And actually, strange fact, the house burned down exactly two years after her passing.
Wow.
Which is, it's this very strange twist of fate.
I don't think I really mourned until November after getting through a huge amount of the press and media tour.
And I think there was a big release of kind of just recognizing what had happened.
And yeah, it's been a crazy journey.
Well, talking about The Social Dilemma, you know, that was one of those shows on Netflix that had everybody talking.
And really, for the first time, made people realize the danger and feel fear for the way that social media is in our lives.
And more importantly, the way that companies manipulate our lives. How did you get on that journey in the first place?
How did you become the spokesperson waving the flag at all of us? Look here, look here.
How did that journey begin for you?
I think it's easy to look at a film like The Social Dilemma or my background and assume that
I'm kind of this Luddite anti-tech critic
or something like that, and that we should go back to the Stone Age or that I'm an anti-technology
person. I actually went to Stanford, studied computer science, and was classmates with the
founders of Instagram. I was part of a group with one of the founders, Kevin Systrom, who became the CEO
of Instagram. And we were part of a social-impact-oriented group about how we could use computer
science to make the most positive social impact in the world. So that was the upbringing that
I had at Stanford. And you worked in big tech. Yeah, yeah, I worked at Apple when I was 18 years
old as an intern. I worked at Jimmy Wales' Wikipedia spinoff called Wikia. And then I
started my own technology company and raised venture capital and went through that whole,
you know, round. So I really saw the pressures that are on entrepreneurs when you raise venture capital and you have to grow a
certain way. There's a pressure to grow to X million users as fast as possible. And if you
don't get that flywheel turning, you know, your thing is dead, you can't raise more
money. So in my case, the company that I'd started, called Apture, is actually, you know,
it's an important story because it sets up the
fallacy that I think is at the root of everything the social dilemma is calling out. So back in
2007, I was at Stanford, a young, naive, 21, 22-year-old kid with my friends thinking, hey,
we could make the world a better place if we could get people to care about things that they didn't
know about. And my naive thinking when I was 21 years old, with our friends brainstorming this was what
if when you're in the internet, and there's, you know, someone's talking about something like,
you know, a specific food shortage in Ethiopia in a certain region. You know, you read that over
that you've never been to Ethiopia, you don't even know the names of the cultural tribes there,
the history, how could you expect someone to care about it? So Apsure was this service that we
worked with web publishers, like The Economist, like Washington Post. Originally, the publisher could actually link
up a phrase like, this is the region of Ethiopia that has the food shortage. And you'd hover over
it and boom, you'd see an immediate video. You were transported there and you saw the people
rummaging around in some refugee camp and they didn't have food. And it was like, say, some
video from the BBC. And then it had right there next to it, a map of exactly where that was. And maybe you could link a map of where the reader
was from or something like that. You could associate these things and make it visible
right there on the Washington Post or right there on The Economist or the BBC without leaving the
page. So, at the risk of leading the witness here, because I know where we're going,
what was so naive about that?
If we're jumping to the punchline, the core thing took
me, by the way, six and a half years to learn, which was really not learning
a lesson, but honestly getting out of denial as the CEO and founder of the company.
You're so good at telling yourself the story.
We're getting people to care about things.
You can't see through your own story, which by the way, we're going to get to when we
talk about the Silicon Valley leaders that are telling themselves stories about what
their products are doing.
Right.
But the core point you're bringing up was, what did The Economist or The Washington Post
care about when we did that deal with them to put this on their website?
And what really mattered to them, Simon, was, are you increasing the amount of time on my website?
Are you increasing engagement? Do you get five more page views for every page view someone clicks
on? Do you get more ad revenue? And if we didn't keep engagement high, they would take our product
off their website. So it was existential to my business whether or not we were delivering on the
engagement proposition. And you think about Facebook and Twitter and YouTube and you say, what are those services doing? Well, if I'm running YouTube,
I could tell myself a story. We're helping people learn about things. We're helping people watch
those videos that pump you up and get motivated with that pounding music. And you wake up in the
morning, we're helping people look at music videos and artists, and we're helping creators,
you know, have new businesses. There's all these young people who can create whole new $15,000-a-month
businesses off of reviewing products, and we're enabling new economies. You can tell all those
stories. But at the end of the day, what is YouTube's business model? It's got to drive
more time on site from more users and getting people to share more of those videos to more
people every single month. And their stock price is directly connected to that set of facts. Can you sum up what you believe the danger
of social media is? When you have three or four technology companies that are kind of the brain
implant for human civilization, meaning they think about you being at home for the last year,
and you're staring outside your window as I'm doing right now. Do I know what's happening in that world through my own eyeballs,
or do I have to reference my brain implant of social media to say what's going on in Oregon
right now? Is it a war zone? Is there a heat wave happening in the Northwest? That's all dictated
by social media. So the first thing to recognize is that social media has become
the reality constructing infrastructure for our minds. That by itself is not a problem,
but it's important to understand. Let's say that again. Social media constructs our reality for us.
Correct. It is not looking out a window. It is looking out their screen.
Their screen, their window of reality. So are my friends all happy and basically on beaches in
Tulum? You know,
according to Instagram, that's the reality of my friends' lives. Is everyone angry all the time,
or is Twitter basically casting this huge net for everything everyone was angry about over the last
week and then saying, in case you missed it in the last 72 hours, here's a list of outrage.
So these things have basically sorted for a very particular set of things that are good for them,
which is getting attention. In the case of Instagram, things that inspire anorexia are really good
at getting teen girls' attention. In the case of Instagram for young boys, basically soft-porn
rabbit holes of just infinite photos of hot girls are what keep those young boys' attention.
In the case of YouTube, the conspiracy theories and kind of rabbit holing on Holocaust
denial and flat earth conspiracy videos were really good at getting attention. In the case
of Facebook, extreme groups recommended to people were really good at getting people's attention,
and that radicalized the world. In the case of Facebook and Twitter and TikTok, personalizing
people's newsfeeds is better at getting attention than not personalizing their newsfeeds, which means that affirmation rather than information is what dominates that reality
constructing infrastructure. We have been deranged as a society inadvertently, not intentionally,
but by these brain implants, which are these three or four tech companies that have dominated,
especially in a COVID world, the way that we see the world. And I think most people can relate to
they have friends or family members that feel like they can't talk to anymore,
because they have some other different set of views than them. And we've all been kind of
radicalized in some more extreme direction. We don't know it because we only see, well,
the other side looks like they're really crazy, while they say the same thing about ours.
You know, there's a study that came out in the Wall Street
Journal; it basically cited a Facebook internal document finding that 64% of extremist group joins are due to Facebook's own recommendation tools. So it's like moms who want to make their own baby
food. Okay. So it sounds great. And this is, by the way, back in maybe 2016, 2017.
What do you think was the top recommended group for new moms who are in this organic baby food
group? I shudder to think. You shudder to think. Well, it's obviously been politicized now, but
this was basically moms against vaccines. And what she found in her research is once you join
one of those groups, the vaccine groups, then it said, hey, by the way, do you want QAnon? Or hey,
by the way, do you want Flat Earth or chemtrails? And it would show people all of these other
groups. Now, regardless of the truth of some of the content on those other topics, just notice that this is how the machine worked for about 10 years, right?
So now you rewind the clock and you say, okay, our society has gone insane. And you realize that
we've got this brain implant stuck in our society. And you rewind the clock and say,
we've been through 10 years of this unregulated derangement process from social media recommending
certain things and not others and selecting for the things that would most hijack our limbic system. What's going through my mind
as you're talking about this is that it's a little bit like Oppenheimer and the bomb, right? Nobel
and dynamite as well. Same thing. At the time Alfred Nobel was developing dynamite, he believed he was going
to create the most powerful weapon ever so that he could bring about peace in the world.
That didn't happen.
He regretted his work.
Ironically, he then became the architect of the Nobel Peace Prize.
Oppenheimer is working on the nuclear bomb, the Manhattan Project, thinking that if he can develop this weapon, it'll bring peace in the world because it'll be more powerful than anything else.
And it turns out he was thrown off the project as soon as he started quoting the Bhagavad Gita. And I think what ends up happening
with extremely powerful things like nuclear bombs or social media is that though they may have been
founded with idealistic and positive intentions, there's always balance. Social media does a lot
of good and it also does a lot of evil. It's balanced. But if it did less good, it would do less evil. And so the question I have is, is the whole thing
to rein it in? And one cannot help but look at authoritarian regimes, whether it's Russia
or the more restrictive application of social media in China, that they just don't have
the troubles that we seem to have with an open access social media. In other words,
they don't have the benefits, but they also don't have the destruction, or am I completely
missing something here? The world is trending towards these two different attractors right now,
that if you sort of take your hand off the steering wheel, you know, we're heading into
climate change, we're heading into these extreme volatile periods of history.
And the two ways of relating to that are governments trending towards authoritarianism
and oppression, which is the Orwellian world, which is like, let's just have one big master
state that just changes the climate timelines, that just, you know, locks up human behavior,
that coerces people and is against
freedom. That's one dystopia we don't want to head towards, but it also has certain benefits in the ways
that it will maybe deal with climate change, because it can just exert that top-down
governance structure. The other attractor is chaos, which is basically just, let's maximize
the sort of freedom, which is basically maximizing polarization, disagreement, chaos.
This is all getting very philosophical, right?
And the trouble I have with this whole discussion is that it's all very lofty.
It also abdicates any kind of personal responsibility.
Big tech needs to do this.
Government needs to do this.
Other people need to do this.
And I disagree with the idea that it's either freedom or chaos. The reality is
all of these organizations, all of our government entities, all policy starts with
an individual. If we go into Facebook, where are the employees? We've seen it happen. We've seen
employees exert tremendous pressure on a company.
Like, where is the moral outrage?
Where is the ethical standard?
All of this is solved with personal accountability because one of the things we neglected about looking out the window is who's in your house?
How are you treating the people in your house?
Forget about what's out the window.
That's where I think the solution lies.
It's not chaos or freedom.
It's me to you. It's how I treat my friends and how I think about myself in this system,
not as a victim, but as somebody who can do something at a very small scale,
which, it turns out, lots of people working at a small scale adds up to big scale.
What I hear you saying is the third attractor beyond oppression or chaos is responsibility.
Yeah. Like a parent who restricts their kid on Facebook. A couple where one says to the other,
honey, let's leave our phones at home and go out for a walk. Do a jigsaw puzzle,
read a book, watch a movie, you know, whatever it is. You and I have both been victim to it.
I'm victim to it, where you're with a friend.
One of you picks up the phone.
The other one feels stupid.
So the other one picks up the phone.
And before you know it, you've both gone down a rabbit hole.
And 20 minutes later, one looks up and goes, we should probably stop, shouldn't we?
I literally have probably more encyclopedic knowledge on the various persuasive mechanisms
and manipulation techniques than, I don't know,
99% of people on the planet. And I literally get influenced and sucked into these things all the
time because the point is we are all trapped in that meat. And I want to say one thing,
which is to just acknowledge asymmetries of power. So we can understand when we're being
victimized or being manipulated without identifying as a victim and staying in that place. I'll give
you the example of a teenager who sees The Social Dilemma or listens to you and I right now
and says, you know what? Today's the day I'm actually going to right after listening to this,
turn off my notifications from all these damn things. I'm going to uninstall these five apps
from my phone and I'm going to make a commitment about what things in my embodied life in the real
world I'm going to choose to do. For a teenager, and I just want to go into this teenager example,
all of their friends are still there. All the sexual opportunities, all the homework gossip,
all the gossip about who's doing what to whom, that is all still happening in the Instagram world,
in the TikTok world, in the Snapchat world. And one of the diabolical aspects
of the asymmetry of power here, the victimization, is that social media companies actually, by design,
prey on social exclusion. That if you leave by yourself, it's not an individual choice because
you're actually losing access to what it means to be growing up within that social cohort of your
other friends. And they know this and they manipulate
it. And so just to extend, I don't disagree with the taking responsibility thing. I would just say,
we have to do it as a group. We have to do it as a society. Because if a whole school-
Do it as a society. You can't make a society do something. That's not how societies work.
I think we're missing something here, which is the fact that we are not just a society
of numbers and random individuals, nor are we a society of individuals who just live
a life in our own silos.
But we are both individuals and members of groups at all times.
And what makes us feel worth is not necessarily what we feel about ourselves, but what others
feel about us.
And simply telling someone, do this for yourself so that you feel better is not a good motivator.
Just like it's not a good motivator to say, go to the gym so you'll feel better about yourself, because as I've learned, I'm totally fine disappointing myself. Like sitting in bed
and not going to the gym and not working out, I'm fine. But if I have someone that I'm going to
work out with, I do not want to let them down. And that's different. And I think the thing we're
missing is when you go down these rabbit holes, you are making your friends feel excluded. You're
making your friends feel lonely. And if you want your friends to feel loved, if you want the people
you love to feel loved, then put it away or turn it off.
And I think we've forgotten the service element of this.
This whole journey began out of selfishness.
How can I make more money using this thing?
How am I going to make people watch?
And then the people who go down the rabbit hole, it's all very selfishly driven.
And I think the part that's missing here is service.
What makes you optimistic?
This is, I mean, like, the podcast is A Bit of Optimism and, you know, a lot of this is very dark,
and The Social Dilemma is kind of dark, you know? I mean, what makes you feel that all of this work
is not simply idealism and trying to do the right thing, but actually is working?
What's going in the right direction
that makes us feel like we should continue fighting this because there actually is progress?
If you told me back in 2013 that we'd have Chris Hughes, the Facebook co-founder,
saying it's time to break up Facebook, that we'd have internal employees writing letters and
basically joining whistleblowing programs to say we have to change things from the inside,
that you'd have 130 million people watch The Social Dilemma, that you would have governments
from the EU to New Zealand, to Canada, to the US that are actively working on dealing with these
issues. The structure of this machine has been very hard to penetrate. I don't want to give
people false hope about that. But the pace at which people are turning around their worldviews on
all of this stuff, whether it's actually frankly on climate or on social media, I think that the
pace of that has increased dramatically. And you have to use that to extrapolate forward saying,
just like five years ago, eight years ago, I wouldn't have believed you
that the things that are happening now that are positive would have ever happened. I have to sort
of anchor right now and say, given the pace of those things, that there are
going to be future things I would never have believed that are going to happen.
You've made some personal sacrifices to do this. I mean, some of the stuff that you designed
back as an 18-year-old working at Apple still ships in the Macintosh to this day.
You worked at Google, you had big jobs, you had the startups,
you made the money, you could have been one of the billionaires. And yet you walked away from
that life to do something that you thought was on a moral and ethical higher ground.
At great personal sacrifice, you are not the friend of the big companies with huge amounts
of money and tons and tons of lawyers. What makes it all worth it?
If the pace of change is slow, even though there is some change,
you could just go write some nice technology and make an okay living and your work will be good
work. Why should you keep doing it? At the very end of The Social Dilemma, there's a quote from me where it says,
it seems crazy to think that we have to change the whole thing, the whole system,
the way that it's working. But when you see where it's going, you realize that we don't have any other choice. And then the director asked me, do you think we're ever going to get there?
And my answer is, we have to.
Thanks, Tristan. I appreciate you taking the time. It's a hard subject. And when I invited you on,
like, I know what the podcast is called. And I was like, okay,
I know what we're going to talk about. But given the world that we're in, I think it's important to talk about it. The summary is that technology will
not solve a technology problem, and technology will not solve a human problem. We have to stop using
technology to solve human problems. Give people a hug. Say, I love you. Say, do you want to go
for a coffee with me? Go for a walk. That's what solves all of this stuff. All of it. All of it.
Not an app, not social media, not a regulation, although we need all those things because it's run amok. To truly get back to being human, we actually have to be human.
Couldn't agree more.
Tristan, thanks so much for joining me. Please keep fighting the good fight.
If you enjoyed this podcast and would like to hear more, please subscribe wherever you like to listen to podcasts.
Until then, take care of yourself.
Take care of each other.