Your Undivided Attention - Here’s Our Plan And We Don’t Know — with Tristan Harris, Aza Raskin and Stephanie Lepp
Episode Date: February 3, 2022

Renowned quantum physicist Richard Feynman once wrote, "It is our capacity to doubt that will determine the future of civilization." In that spirit, this episode is a little different – because we're talking openly about our doubts, with you, our listeners. It's also different because it's hosted by our Executive Producer Stephanie Lepp, with Tristan Harris and Aza Raskin in the hot seats. How have we evolved our understanding of our social media predicament? How has that evolution inspired us to question the work we do at the Center for Humane Technology? Join us as we say those three magic words — I don't know — and yet pursue our mission to the best of our ability.

RECOMMENDED MEDIA

Leverage Points: Places to Intervene in a System
Systems theorist Donella Meadows' seminal article, articulating a framework for thinking about how to change complex systems.

Winning Humanity's Existential Game
The Future Thinkers podcast with Daniel Schmachtenberger, where he explores how to mitigate natural and human-caused existential risks and design post-capitalist systems.

Ledger of Harms of Social Media
The Center for Humane Technology's research elaborating the many externalities of our technology platforms' race for human attention.

Foundations of Humane Technology Course
CHT's forthcoming course on how to build technology that protects our well-being, minimizes unforeseen consequences, and builds our collective capacity to address humanity's urgent challenges.

RECOMMENDED YUA EPISODES

36 - A Problem Well-Stated Is Half-Solved: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
42 - A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
43 - Behind the Curtain on The Social Dilemma: https://www.humanetech.com/podcast/43-behind-the-curtain-on-the-social-dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
Transcript
So I, at the age of 22 or 23 years old, had started this company called Apture ("capture" without the "c").
We raised venture capital, we did the startup thing, we had employees, we were building a product that made it easier for people to learn more about anything without leaving the website that they were on.
And I had to come up with this whole narrative to recruit people.
You have to say, we're going to change the world, you have to have a purpose, so it's not just going to make money, it's going to be a positive, impactful, mission-aligned objective in the world.
And our friend Joe Edelman started this thing called Doubt Club,
which gathered a group of startup founders together once a month to express our doubts.
And one of the insights behind this thing called Doubt Club
was that as a technology founder, when you're building technology,
there's nowhere you can really go to express doubts about the entirety of what you're doing.
There are certain things that you can tell your co-founders about,
but maybe you don't tell your investors about.
And there are certain things that maybe you can tell your investors
about your co-founder, but you won't tell your co-founder about.
And then there's certain things you can tell your employees,
but there's no really safe place to turn
if you just want to question the entire thing.
And so much of the things that people work on in the tech industry
are based on these narratives that we persuade ourselves of,
that we are changing the world,
we are making the world a better place,
Facebook is making the world more open and connected.
Twitter is enabling everyone without a voice to speak.
And those narratives are convincing.
They're partial truths.
But Doubt Club was a place that both Aza and I went to
that was where we questioned whether or not
this was even a good thing to do in the world.
Renowned quantum physicist Richard Feynman once wrote, "It is our capacity to doubt that will determine the future of civilization."
That's Stephanie Lepp, the executive producer of Your Undivided Attention.
And this episode is going to be a little different.
It's going to be different because Aza and I are going to be in the hot seat and Stephanie
is going to be hosting.
And it's going to be different because we are going to be talking openly about our doubts
with you, our listeners.
I'm Stephanie Lepp.
I'm Aza Raskin.
And I'm Tristan Harris.
And this is Your Undivided Attention.
In 2018, Tristan, Aza, and our third co-founder, Randima Fernando, launched the Center for Humane Technology, or CHT.
And in launching CHT, we went from privately questioning the startups that we were building
to publicly challenging what the entire social media industry was creating.
And we continued to evolve our understanding of what is generating our social media problems in the first place.
We continued to doubt whether what we were building was up to the task, which is what we'd love to share more of on this show with you, starting today.
Hi, Tristan. Hi, Aza.
Hey, Stephanie.
Hi, Steph.
Okay, Tristan, I'm going to start with you.
So, I know there have been many influences on your thinking and on inspiring you to ask
big questions, but I'm going to ask you about one in particular.
So in 2017, you were in the midst of launching the Center for Humane Technology, this new organization to address the implications of social media.
And you heard an episode on the Future Thinkers podcast featuring Daniel Schmachtenberger.
And I actually remember that episode coming up on my Facebook feed from you.
And so to kick us off, can you talk about your experience hearing that episode?
Sure. Well, so here we were working on the Center for Humane Technology. We're a new nonprofit, we think we're going to somehow change these companies, specific actors, and we're going to change their business models, or we're going to change how the designers think, and we're going to get new designers to think in a different way.
And somehow, we didn't know how we were hoping we could kind of change tech, looking at that as a narrow system, as a narrow set of actors.
And I was driving in my car one day, I was driving, I think, down to the Hewlett Foundation for an event on disinformation.
And I remember crossing 280
and I was listening to this podcast
someone had sent me with Daniel Schmachtenberger.
And I think it was called Winning Humanity's Existential Game.
And it was really about
the broader predicament
that we're in
in the way that our economic system
organizes incentives.
He brought up the point,
you know, can you have world peace
with a for-profit military industrial complex
where you have to have growth
in arms sales every year?
Maybe, but notice that you have to have growth in a for-profit arms sector every year. Can you have health in a world where you have
a for-profit pharma industry that wants to sell people more drugs every single year and create
in the U.S. one of the most medicated countries on earth? Can you have humane technology that's
oriented around people's offline lives and experiences when that's not nearly as profitable
as social media that organizes people's behavior
for as many online experiences as possible,
virtualized identity, virtualized human interaction.
I saw that the problems that we're seeing around the world,
whether it's deforestation or plastics or climate change
or social media driving these negative trends,
were all part of a kind of a win-lose game
where if I don't do it, the other guy will,
with a perverse incentive.
So we don't have a thousand different environmental problems or pharma problems or food problems or tech problems.
We just have what he calls these generator functions of existential risk.
And once you see that, you start seeing how there's a part of it that's disenchanting, right?
Because one part of it is, oh my God, how do we stop that?
And then there's the other part of it, though, that's clarifying and empowering, which is you start seeing that there's a core set of reasons why we keep producing plastic even though we know it's ruining the planet, or why we keep deforesting even though we know we should stop, and why, you know, we shouldn't create social media engagement that goes that low in the brainstem, that manipulates people and personalizes so much that it causes polarization, but we have to have a growth paradigm on top of social media engagement. So there's a kind of a clarifying aspect to that realization. And specifically what I remember is, I knew about externalities, but Daniel talked a lot about how
as different agents are racing to do their extraction,
whether it's extraction of generating plastics
or racing to pull attention out of human beings
with more and more efficiency,
across the board, you get more efficient profits
by socializing the costs.
So you privately profit,
and then the costs show up on society's balance sheets.
You extract the attention with personalized AI,
but then the cost to society of a broken shared reality,
that doesn't show up on Facebook's balance sheet
as a trillion-dollar cost of breaking democracy,
that shows up on society's balance sheet
as democracies that don't work anymore.
And when I saw that,
and just even the notion of externalities,
that's actually where, at the Center for Humane Technology,
we came up with a project called the Ledger of Harms,
which was meant to accumulate the ledger of unaccounted-for costs
onto society's balance sheet from technology.
And that included mental health, alienation,
loneliness, conspiracy thinking, polarization.
And that was just a big insight
of basically this is the system.
This is the collective system
that produces kind of our existential game
for whether we as a species make it.
And frankly, we had had those conversations
back in 2017.
I remember, Aza, rushing back into the Common Sense Media office
and telling you and Randy,
like I had just listened to this mind-blowing podcast
and I was like, we have to start the Ledger of Harms
and you all have to listen to this
and we have to figure out a way
that we're going to deal with the existential game
and how do we actually narrowly focus on social media
when this is a set of problems that's consistent
across the entirety of how our society is structured.
Aza, do you remember Tristan sharing that episode with you?
I do remember because I was driving up to Grass Valley for my birthday.
And there was one line in particular that really struck me.
And that was, there was a line about what is the value of a tree?
And Daniel, in this episode, is walking through a tree's value. Well, it's really hard to measure, but, you know, a tree is providing shade for you and other animals. It's providing an ecosystem; there's a whole root system, and there are bugs and insects in there. It's providing nutrients to the other trees that are connected to it. There is an incredible wealth of things a tree provides. But how do we value a tree? We value it just as lumber, with one number. And that one number hides all of the complexity of that tree. And later I ran into Donella Meadows' work, who talks about systems, thinking in systems, how to change systems. And, you know, she had this
wonderful quote, which is, "The world is a complex, interconnected, finite, ecological, social, psychological, economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch."
And it was a kind of waking up,
realizing that you couldn't possibly have a flourishing world
if you value things based on a single number, right?
We do the exact same thing of over-optimization
in our startup world
when we have a metric like engagement
and we just game everything to make that metric go up.
And, of course, everything else gets swept into externalities.
Yeah, if I recall correctly, I mean, Daniel was actually borrowing from Forrest Landry's work, one of his brilliant collaborators down in San Diego, California, and I think the framework is called abstraction, extraction, and then commodification.
So abstraction is: that's not a tree, let's abstract it into a bunch of two-by-fours, right?
So we go from a living tree, it's a complex system, versus, no, there's just this many two-by-fours I can get out of that tree. So you're abstracting the value of the tree into something more predictable. Then you extract it into that form. And then you get depletion of how many trees there are, because you're cutting down trees into two-by-fours. And then you get pollution, which is that the ways in which that tree was creating kind of ecologically balanced environments for its surrounding ecosystem, that balance goes away and turns into either pollution or different...
And that notion of abstraction, extraction, depletion, and pollution is what's happening everywhere.
So now if you look at attention, you know, here's a person and there's, like imagine seeing a baby,
you see the potential of this child and all the things that they can become and the living being
that is each of us that when you open your eyes in the morning and just for a moment before you look
at your phone, the kind of infinite possibility of what could emerge from that consciousness
in that day. That's like the tree. You're like this living, complex, you know, theoretically free (if we talk about free will, which is its own conversation) person.
But of course, that free person isn't worth nearly as much that way as they are as finite units of attention.
The two-by-fours of a tree is the same as the predictable time slices of you scrolling with your finger over and over again
as one second of attention, two seconds of attention.
Because Facebook and social media companies also need to abstract you into predictable forms of attention,
extract those forms of attention, those units, those commodifiable units of a person's eyeball
staring at a thing that can be sold to an ad and monetized at a predictable rate of exactly 0.67, and then that, at scale, times 3 billion people, is billions of dollars.
And then depletion and pollution.
So the depletion in this case of people's well-being, because you're sucking people's life force
out of them and making them spend more time with their esophagus compressed at 45 degrees
staring at a screen in the morning for an hour, than the kind of freedom of that person
if they were to wake up and do yoga or express themselves or go dancing or do something
totally free and open.
And then you get depletion, then you get pollution, which is: what is the pollution that comes in society from people's freedom being commodified into these predictable slabs of not just scrolling, but, again, outrage-generating content? So the more people are posting highly engaging material on Facebook as opposed to lowly engaging material, well, highly engaging is going to be more profitable. And highly engaging produces these negative societal effects, because you end up with the most crazy town of the attention economy becoming the mainstream town of our world.
So I think there's both a kind of terror that comes with seeing that big interconnected
system operating that way, but there's also a kind of clarity that comes from that.
But then what do you do about it?
And I remember actually having these moments where, you know, this is classic and developmental
theory and learning where like you kind of learn a new thing, but then you don't know how
to stabilize that new thing.
And, you know, that's also what society was wanting at that time from us. Like, they wanted to know, how is this affecting our children? How addictive is it? And we would list all these stats, and here's how it's affecting attention span. And so it actually kind of felt like
not being honest with the world in a way about our true understanding of the broader
situation. And that's kind of why I think, Stephanie, we wanted to have this podcast actually
was so we could talk about that transition so people could understand, I think, the scale and
scope of what we really need to address. It's not just social media. It's how all these
systems interconnect and what's at the root of it all.
I think one of the things that made it really difficult was this awakening realization
that if I cared about tech or was making tech and was not aware of these underlying runaway systems,
then I was part of the problem.
And yet still not knowing what to do.
So, you know, in that period when what would become CHT was gaining steam, I remember, Tristan, you and I sitting in a cafe in West Oakland having a conversation,
and you were critiquing some of the interfaces I was making
and I was critiquing some of your approach
because it was like, hey, I see what you're saying
about the attention economy,
but I don't see how you're going to do anything about it.
I don't see what the form you could be choosing will do.
You could talk about it, but is that going to really change
any of the economic incentives?
Startups are still the most capital-efficient way of changing the world.
Those are the words I remember saying, word for word.
So I'm going to continue working on this product
and I'll have bigger effect than you sort of like complaining from the sidelines.
I remember that. I'm ramping it up a little bit, but that's sort of what I remember my mind being like back then.
Well, this is actually really, really important.
It really gets to when we talk about in systems change,
changing the paradigm from which people are thinking,
because what you're speaking to is it is the paradigm that you and I were born and raised into,
basically.
You know, I went to Stanford, which is conveniently situated right between Sand Hill Road, where all the venture capitalists are, and Page Mill Road, where all the lawyers are, and that makes it the most efficient sort of venture-capital-startup-creating ecosystem.
And I remember, I was a Mayfield Fellow, which was a program at Stanford.
Many different tech companies actually were part of this program.
And they train you, basically.
If you want to be a socially acceptable human being as a graduate from Stanford, you need to
start a tech company.
And that's the only form of success.
And when you tell your friends that you're thinking about building a nonprofit or thinking
about learning how to cook, you don't get any reaction.
But if you tell people you're starting a tech company, then that's how you get your
social validation.
So when you go back to that conversation you and I were having at that cafe, you know,
I remember that too.
And I remember thinking very cynically about my own thing that I was doing.
I mean, what are you going to do when you see this problem?
You're going to think about it.
You're going to talk about it.
It's this massive amorphous thing, which is just the system inside of which we are living.
How can knowing about it or talking about it change anything?
Great question.
And I know that that has been a big source of tension for both of you. And so I would love to lean into that for a second. So you created the Center for
Humane Technology in order to address the doubts that you were discussing in Doubt Club. But then
you're confronted with these deeper and more amorphous doubts about what's generating the issues
with technology and climate change and inequality, and, and, and. So indeed, as you asked, Tristan, what do you do with that? You know, what do you do with that? Meanwhile, in the early days of CHT, you were going to D.C. and talking with legislators
and just kind of getting drawn into this burgeoning but somewhat narrower conversation about tech
reform. And so I would love to hear a little more about the tension there, you know, between
the doubts or the awakening that you were having on one hand and the actual work that you were doing
and that the world was asking you to do on the other.
Well, yeah, I think that's such a great prompt.
I distinctly remember actually flying down to Mexico
for this conference called La Ciudad de las Ideas.
It's the City of Ideas Festival in Mexico City
and giving a talk on tech reform.
And I remember going back to my hotel room
after giving the talk about all the issues of technology,
but I was really first waking up to this bigger meta-crisis.
And I felt physical pain in my stomach, and Randy will remember this actually, because I just didn't know how to square
this specific issue of technology, which we are obviously experts in and had some capacity
to advance, with this bigger issue that even if you got the tech issue addressed, even if you
perfectly addressed that, you still have abstraction, extraction, depletion, and pollution
running on a finite planet and a finite ecosystem across every other domain. And so the
sense of it doesn't add up. You could quote unquote solve some narrow aspect of some of these
problems, but you wouldn't be addressing the bigger risks that determine whether we make it or not.
And that physically and emotionally and spiritually, I think, inside of me was really, really, really
hard. And in fact, especially the kind of gap between the two, because the world wasn't asking us,
well, what do you do about abstraction, extraction, pollution, and depletion, and existential risk,
and how do we make it through the bottleneck of the systems that we've built into some different paradigm.
They were asking us, here's what Facebook did yesterday.
Would you please comment on it?
And maybe that'll help advance the conversation.
You know, the gap between what we were holding and what we were known for was painful.
Now, the good news, though, I would also say, and you should jump in, is that the world really wanted to know about what was wrong with technology.
It was a new issue.
You know, we had climate change.
We had environmental issues.
We had inequality on the agenda. We didn't have technology on the agenda.
And so there was this opportunity
to speak to how these systems were breaking
in a way that reflected some of the bigger trends
that we were looking at.
Aza, do you want to say anything about that?
I just wanted to add in: to hold these sorts of frames was very isolating.
One of the things that really helped me
is there are two types of ways
of talking about the world's problems.
The first kind of way is that you just sort of say, okay, here's the world, and then there are a whole bunch of problems just sort of emerging, happening to it.
But if instead you talk about what the mechanisms are, like, describe the machine that, when you pedal it, as an outcome, a predictable outcome, you get climate change, pollution, inequality, exploitation, then your mind can only reach for solutions that change the fundamental workings of that machine.
And while that doesn't mean, like, wow, that's so easy, it really changes at least my felt sense of agency for what we can do, because now what we can do can fit the scale of the problem. Like, we're describing the problem well, which means the problem now becomes at least partially tractable.
So within the last year
I think there has been a bit of a breakthrough for you
in terms of speaking publicly
and explicitly from this
generator frame. You know, last June on Your Undivided Attention, we had for the first time
Daniel Schmachtenberger in dialogue with you, Tristan. And so I just want to ask you, how has that
felt for you to explicitly inhabit this generator frame in public? Yeah. When you understand those
generator functions, that there's a growth imperative tied to abstraction, extraction,
depletion and pollution, it really does create this split.
And the reason I want to double down on this feeling of alienation or isolation is because
it may be something that you listeners out there might be feeling listening to this podcast.
I mean, I definitely have heard from people who say, you know, wow, like this podcast is
so informative, it's so nice, so great, but then I feel like alone because then I'm understanding
these things about the world that other people around me don't understand.
And one of the reasons we wanted to bring these worlds together is first I think we do
need more community, right? People need to understand that other people feel this too and understand
this too. And that walking worried sense that something's not right does have grounding.
Your mammalian system is not lying to you. But it's not that it's a pedophile QAnon elite that's kind of running the world and that's the big crisis that we're facing. It's not that it's just one political party taking power. It's not that, hey, there's just these one or two evil oil company executives who aren't doing the right thing, because if one oil company didn't do it, the other one would. Again, it's that we're caught in these races that produce
these negative outcomes, the runaway systems beneath. And so Stephanie, to your question about how
does it feel getting to kind of go out publicly and talk about these things, it feels really,
really good and important to me because these have been two parts of my own identity and my own
psychology that I've been holding and they felt separate for too long. And it hurts to
hold them separately. And that won't go away because the world does still perceive technology
as just an issue. How do we deal with what Facebook did yesterday? They changed their policy
on X, or Adam Mosseri from Instagram testified to Congress about why. What do you think? Microphone
in your face. You're commenting on these narrow moments or choices or design events instead of
how these systems are operating. And I do feel a lot better communicating more from these
generator functions, and we're hoping with this podcast also to take you listeners out there
along with us for this ride, because that's what it's going to take to address these problems.
They're not separate issues. They're connected by these fundamental runaway systems that we mentioned
earlier. In Donella Meadows' framework, there are 12 leverage points in a system, going from constants and stocks and flows at number 12, all the way up to transcending paradigms at number
one. And it's not like you just then focus on paradigmatic change. You have to do all of them from
times and delays to changing the incentives of the system to the stocks and the flows. And so I think
there's a way in which if we all as individuals and communities and societies share a similar
diagnosis, if we can articulate the problem, then we naturally, in a decentralized way, all start
pushing in the same direction.
And when we all push in the same direction,
like no system can stand up to that.
And it's just the acknowledgement
that there is room in here
for hope.
So to bring it back to systems theorist Donella Meadows, one thing I love about Donella's leverage points framework, and just, like, her general M.O., is that she totally and unabashedly acknowledges: we don't know.
You know, she had humility.
And so in that spirit, I wanted to read to you the last paragraph of her seminal piece, "Leverage Points: Places to Intervene in a System." She writes: "Magical leverage points are not easily accessible, even if we know where they are
and which direction to push on them.
There are no cheap tickets to mastery.
You have to work hard at it, whether that means rigorously analyzing a system,
or rigorously casting off your own paradigms
and throwing yourself into the humility of not knowing.
In the end, it seems that mastery has less to do with pushing leverage points than it does with strategically, profoundly, madly letting go."
We're trying to change a system.
And there are a couple theories for how you change systems.
Donella Meadows has these 12 leverage points on how to do it, but even for people who are immersed in it, like, there is no magic bullet.
And these are complex systems, which means you do not know how they're going to react because there are so many subcomponents until you start to act on them and sense how it changes.
And so it's this sort of continual dance to try to change a system.
And you don't know whether it's going to work before you try.
Which is to say also, like, it's not like the Center for Humane Technology, this podcast, has some kind of master plan, some master purpose or narrative story, that we know exactly what's going to fix all this and we're going to tell you what it is. We're all figuring it out together. But I think we want our audience to come with us in
thinking about this ecosystem of change as a system. And so with that, I would love to bring us back
to Doubt Club and ask you, how can this show be a kind of public doubt club? So startup
founders don't necessarily share their doubts with their co-founders or teams or investors,
but how can your undivided attention be a venue for us to share our doubts,
you know, our biggest questions and unknowns and uncertainties with our listeners,
you know, in a way that serves us all?
Yeah, I mean, I think we would like on this show to be, you know,
coming with questions and expressing our doubts about what is the right thing to do given the situation.
You know, I just did a big interview this morning, and people are always asking, you know, what should we do about this? What can I do? And the truth is that I don't know
the answer to how this whole thing is going to change, right? We have been working on this for now.
I think me personally, I'm clocking up to eight or nine years. And that's a long time.
And I still don't know how this is going to change. But I will say that I didn't know back
then either. And so many unexpected things, per Donella Meadows, have continued to happen.
And I don't know where all the change is going to come from. Who would have predicted a Frances Haugen five years ago? Who would have predicted attorney general lawsuits years ago?
Who would have predicted that the co-founder of WhatsApp would come out and say, hashtag, delete
Facebook? There's a million things that are happening now that we would have never anticipated.
Yeah, one of the things I'd love for us to get better at is figuring out how to ask our guests
on Your Undivided Attention to voice their own doubts. And it's hard, right? Even when I think
about answering that question, Steph,
I find it difficult because there are days
when I think The Social Dilemma,
the movie that I know has been seen
by 100 million, 150 million people.
I'm like, did that really matter?
Did it do anything?
Like, what did it push on?
And I'm out in the world
and people reference it
and that'll make me feel good in the moment.
But, you know, Facebook's market cap
continues to go up.
And none of the underlying
generator functions have really shifted. So did it matter? And I go back and forth. Yeah, and it's hard
because sometimes people look to you for certainty. So if you're going to be forthcoming about your
doubt, how do you kind of do that in a responsible way? It's kind of a delicate dance.
Yeah, I mean, imagine if you're running for president and someone says, like, so what's your plan?
And you're like, well, I'm not really sure. Here's some things that I'm thinking about might help.
But like, obviously none of us really know. Who can say that?
Here's my plan, and I don't know. I mean, that's my plan: I don't know.
That's also what's amazing about Donella, though. Here's my framework and...
I don't know.
I don't know.
Eza, can you talk a little bit about the work that you're doing outside of the Center for Humane Technology and how it enables you to lean into or engage with your doubt?
Yeah.
So the other work that I do is as a co-founder of a project called the Earth Species Project, or ESP, and its goal is to use the latest in machine learning to decode the languages of non-human species.
And the goal is actually very much inspired by Donella Meadows, which is, you know, a paradigm change.
Can we shift the way that we think about ourselves and our relationship to the rest of the planet?
And I don't know if it'll work.
Roger Payne, a whale biologist, in the 1960s released this record, Songs of the Humpback Whale, with his wife at the time, Katy Payne. And it inspired Star Trek IV: go back in time and save the whales. It goes on Voyager 1 as the first track on the Golden Record, representing not just humanity, but all of Earth. It was distributed 100,000 times back then, which is huge.
And it's played in front of the UN General Assembly,
and it's sort of like the galvanizing artifact for banning deep sea whaling,
which is why we still have humpbacks today.
There's humanity going to the moon,
those images of seeing ourselves from the outside, Earthrise, and Blue Marble
are still the most viewed photos in world history.
And when there were human beings standing on the moon,
that's when the EPA came into existence.
NOAA was born, the modern environmental movement was born,
Earth Day was started, the Clean Air Act was passed in the Nixon era.
And so I don't know whether Earth Species will create the moment,
which then galvanizes a shift towards radical sustainability and empathy
and shifting our own self-image.
But I have hope that it could.
Quantum physicist Richard Feynman, who won the Nobel Prize in Physics in 1965 and is considered the founding father of nanotechnology, said the secret to his scientific success was his ability to embrace doubt and uncertainty.
And so to close, the question I want to ask you is what one action can our listeners take
to strengthen their capacity to doubt?
It is really hard to doubt on your own, because you don't know when those doubts are true doubts that you should be listening to, and when those doubts are the doubts that you should, like, continue past.
And so one thing that you can do is at whatever level you are,
whatever your peer group is,
pull together two, three of them and make yourself a doubt club.
I think it's a really powerful way to find the places
that you have been deceiving yourself
and get to see yourself from the outside.
Yeah, I totally agree. I do think starting a doubt club with your friends. Don't listen to this stuff alone. Don't think about this stuff alone. Be in community. I do think that's one of the right answers, but I'm not sure about it.
The other thing, which, despite all the things we just said about doubt, in fact, maybe not despite: we don't have all the answers, but, by the way, here are some answers that you can check out. We have answers on our website, with 100% certainty. It's all in the spirit of our new mantra: here's my plan, and I don't know. But one of my life mottos, just so you know, is: things went exactly as unplanned.
Great. But something we can recommend with a hundred percent certainty, except for the doubt that comes along with it, is: we're releasing a course shortly that talks about a lot of fundamental principles of humane technology, and that can help arm you to make your own good decisions locally. Because it's very hard to answer the question, what can you do? But we can give you a good set of questions that help you answer the question: what can you do? All right, well, Tristan, Aza, thank you for being willing to put out a course, put out a plan, put things out, and say, I don't know.
Thank you, Steph.
Thank you so much, Steph.
Tristan Harris started his career as a magician.
He studied persuasive technology at Stanford
and used what he learned to build a company called Apture
that was acquired by Google.
It was at Google where Tristan first sounded the alarm
on the harms posed by technology
that manipulates attention for profit.
Today, Tristan is the president and co-founder of the Center for Humane Technology.
Aza Raskin was trained as a mathematician and dark matter physicist.
He took three companies from founding to acquisition before co-founding the Center for Humane Technology
with Tristan and Randima Fernando.
Aza is also a co-founder of the Earth Species Project, an open-source collaborative nonprofit
dedicated to decoding animal communication.
And if you work in technology,
one way that you can start thinking and problem solving
from an understanding of the generator functions of existential risk
is through a new free course that we're launching.
The course is called Foundations of Humane Technology,
and it'll prepare you and your product team
to build technology that protects well-being
and help strengthen our collective capacity
to address the most urgent challenges facing humanity.
You can sign up for updates at HumaneTech.com.
Your undivided attention is produced by the Center for Humane Technology,
a non-profit organization working to catalyze a humane future.
Our executive producer is Stephanie Lepp.
Our senior producer is Julia Scott.
Engineering on this episode by Jeff Sudaken.
Dan Kedmi is our editor at large,
original music and sound design by Ryan and Hays Holladay,
and a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
You can find show notes, transcripts, and much more at humanetech.com.
A very special thanks goes to our generous lead supporters, including the Omidyar Network, Craig Newmark Philanthropies, and the Evolve Foundation, among many others. And if you made it all the way here,
let me just give one more thank you to you for giving us your undivided attention.
Well, is this the end, Steph?
I mean, that... I don't know.
We're going to do... I don't know. It's my plan.
