Your Undivided Attention - The Tech-God Complex: Why We Need to be Skeptics
Episode Date: November 21, 2024

Silicon Valley's interest in AI is driven by more than just profit and innovation. There's an unmistakable mystical quality to it as well. In this episode, Daniel and Aza sit down with humanist chaplain Greg Epstein to explore the fascinating parallels between technology and religion. From AI being treated as a godlike force to tech leaders' promises of digital salvation, religious thinking is shaping the future of technology and humanity. Epstein breaks down why he believes technology has become our era's most influential religion and what we can learn from these parallels to better understand where we're heading.

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X. If you like the show and want to support CHT's mission, please consider donating to the organization this giving season: https://www.humanetech.com/donate. Any amount helps support our goal to bring about a more humane future.

RECOMMENDED MEDIA
"Tech Agnostic" by Greg Epstein
Further reading on Avi Schiffmann's "Friend" AI necklace
Further reading on Blake Lemoine and LaMDA
Blake Lemoine's conversation with Greg at MIT
Further reading on the Sewell Setzer case
Further reading on Terminal of Truths
Further reading on Ray Kurzweil's attempt to create a digital recreation of his dad with AI
"The Drama of the Gifted Child" by Alice Miller

RECOMMENDED YUA EPISODES
"'A Turning Point in History': Yuval Noah Harari on AI's Cultural Takeover"
"How to Think About AI Consciousness with Anil Seth"
"Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei"
"How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan"
Transcript
Hey everyone, this is Aza.
And before we get started today, I just wanted to take a moment to thank you,
like each and every one of you amazing listeners for being part of this CHT community.
Tristan and I were blown away by all the thought you put into the incredible questions for our upcoming Ask Us Anything episode.
It's such a good reminder that there is a big community
of passionate folks who are all in this shared mission together.
As we enter this giving season,
I hope you'll consider making a year-end donation
to support the work that we do.
Every contribution, no matter the size,
helps ensure we can keep delivering on the goal
to bring about a more humane future.
You can support us at humanetech.com
and now on to today's episode.
Hey, everyone.
This is Daniel.
And this is Aza.
So Aza, you and I spend a lot of time in Silicon Valley talking to different people who are building technology about what they're building.
And with AI, it's really interesting to look at people's motivations, right?
I mean, obviously, people who are building for the sake of economics or building because they like to build.
But there's a whole bunch of other kind of motivations going on, don't you think?
Yeah. I think that's right.
It's especially interesting because you cannot talk about AI.
without talking about mythological powers.
We are enabling machines to speak.
And so beyond the curiosity and the economic drives,
you can sort of taste a kind of quasi-religious motivation.
And this is what this episode is really about digging into.
Completely.
And it's even hard to kind of talk about some of this
without naturally evoking talk of gods
or talk about the powers that are beyond.
And you hear it all the time.
I think the closest relationship that I would describe talking to an AI like this to is honestly, like God in a way.
Like I think it is similarly an omnipresent entity that you talk to with no judgment.
That's just like super intelligent, you know, being that's always there with you.
People in the tech industry kind of talk about building this one true AI.
It's like it's almost as if they think they're creating God or something.
I mean, with artificial intelligence, we are summoning the demon.
You know, you know all those stories where there's the guy with the pentagram and the holy
water, and he's like, yeah, you're sure you can control the demon.
It doesn't work out.
That was Avi Schiffmann, Mark Zuckerberg, and Elon Musk.
You know, some people are talking about AI as a godlike force that will create heaven
on earth, and others are talking about a digital damnation if we do it wrong or if we go too
slowly.
You can sort of think of the leaders or the intellectuals of tech, almost like a priesthood.
And they have some strong beliefs about the power of their creations holding, say, secrets
to immortality.
Here's Ray Kurzweil.
Our immediate reaction to death is that it's a tragedy,
and that's really the correct reaction.
We've rationalized it, saying,
oh, that tragic thing that's looming,
but now we can actually seriously talk about a scenario
where we will be able to extend our longevity indefinitely.
Today on the show,
we're going to be having a conversation
about the parallels between tech and religion,
and more importantly, what we can predict given these parallels
and why it matters. That's why I've invited Greg Epstein onto the show. Greg is a humanist chaplain
and the author of the book Tech Agnostic, in which he argues that technology has become the
world's most consequential religion. We're going to dive into that argument and explore the
religious belief driving tech's most influential leaders. So Greg, welcome to Your Undivided
Attention. Thank you so much, Aza. It's a real pleasure to be here. And this is a great
conversation to be able to have.
And so I guess a question, just to kick it off, is for the skeptical listener, like, why does
a conversation about religion matter for understanding the direction that technology is going
to go or how our lives are going to look different?
Yeah.
I think what it's about is that technology has become, or what I would call tech, right,
the four-letter word, the Silicon Valley thing.
I've come to think of it as more like a religion. It's just, it's come to dominate our day-to-day
experience, right? A lot of us are interacting with tech from the first minute or so that we
wake up to the last minute or so before we go to sleep. And there's so many ways in which this
has become the most powerful force in our lives. I was taught to see religion as, you know,
as the most powerful social technology that had ever been created.
And a big insight for me that led to sitting down to write this book for five years
is that that's probably no longer true.
I mean, that tech is now the most powerful social technology ever created.
I'll put it this way.
And I'm not sure if this is sort of stoking conversation
or going to maybe piss some folks off.
But I'll just say,
the world of Silicon Valley Tech
is dominated increasingly, I'd say,
by some really weird ideas.
And many of those ideas
are quite religious in nature,
as you even suggested,
sort of introducing the conversation.
There's all this talk about gods
and about other concepts
that, as I was sort of thinking about them
over the past few years,
struck me as very theological
and even doctrinal.
What was that word you just used?
Doctrinal, yeah.
Doctrinal, yeah.
Theologies are the sort of big, grand narratives of religious traditions,
and doctrines are the sort of specific beliefs.
We believe in heaven.
We believe in hell.
We believe in a triune god.
We believe in a wheel of Dharma, whatever it is.
Those are the doctrines of religion.
Right.
And I would argue that in Silicon Valley,
often the doctrines are technology is good,
just unalloyed,
that the ability for any one person to affect more people,
that is progress.
And those ideologies then, or those beliefs,
end up dictating a kind of direction
that technology takes the world.
Yeah, and I think we're going to come back
and spend a lot of time on technologists themselves
because I think those of us,
close to or in tech have a very different relationship,
as I was pointing out, to some of these concepts.
But I do want us to spend a little more time on society first.
You know, we used to be able to put meaning in our religion in the afterlife.
Then we put meaning on sort of the state and democracy and flourishing.
And there are all these parts of society that we put a lot of meaning onto.
And increasingly, as these bits of meaning fade away, like Bowling Alone,
we're seeing a decay of our social institutions,
we're seeing gridlock in our democracies,
our religions don't seem relevant.
We're putting a lot of that hope and that dream
on technology and the beautiful future that we'll get, you know, it used to be you found that in
religion. Then you found that sort of in the state or in narratives about what democracy would do.
And increasingly, we're losing faith in a bunch of these orienting systems of meaning.
And in that vacuum, we're sort of minting technology is the thing that we can be hopeful about.
And I'm curious about, you know, what your thoughts are and how you see it.
You know, I think it just starts with the fact that being human
is really hard.
You know, we live these finite lives
where we're constantly uncertain
about how much time we get,
you know, what our fate will be.
We could lose a loved one
or get sick or hurt any time.
And that's just the beginning of what is hard.
And so, you know, there's so many problems
that, you know, it's very natural
to want to look for a big solution.
You know, something that would make us feel dramatically better,
dramatically more at peace, dramatically more like our problems have been solved.
And, you know, I think that there's a real incentive for tech people
who have been able to create really powerful tools.
And in many, many cases, that's quite admirable.
But there's a real incentive to sort of exaggerate the degree to which what one is creating in tech can, you know, actually be the solution, right?
And I, sadly, I just, I see that all over our tech world today.
And I think, you know, in many cases, the answers are slower and less certain than what they're presented as.
What do you think are some of the aspects of religion that you see being particularly present in tech?
Yeah, so there are big beliefs and, as I said, very specific doctrines that look a lot like conventional religion. You know, you have visions of a very distant, very glorious future, a next world, if you will. You've got a vision
of a really dark and foreboding potential future for masses of humanity that can look
kind of like a hell. The amount of time and focus and attention that Silicon Valley Tech today
spends on gods and godlike concepts is really wild, actually. But it's right there. And the last on
that list that I'll mention for right now is ultimately, you know, I don't think it's an accident
that we end up thinking quite apocalyptically about tech in the sense that, you know, unlike
certain religions that I could think of or name, this one does have a non-zero chance of
actually causing an apocalypse, right? There are also all sorts of other examples. You know,
there's Avi Schiffmann, a young man who'd still be an undergraduate at Harvard if he hadn't dropped out, who presents his friend.com necklace, you know, the one that's listening to everything that you say, as a sort of interesting new take on what Schiffmann calls the relationship with the divine.
He says his friend.com necklace is a replacement for God.
And so there really is this sense that we're creating something so amazing that it's going to transform
all life, all humanity. And so get ready. And that's profoundly, profoundly religious in a way that,
you know, really, I had only ever seen something like that before in some of the deeply
conservative religious sects that I studied in, you know, seven years of theological education.
Well, and it doesn't mean that they're wrong, though, right? And this is the part where we get worried,
is that it's going to be used
to obscure accountability, right?
It's not just that they're making
big claims that it might change the whole world.
It's that perhaps they're right
and perhaps the godlike language
obscures the real challenges
that we have to design it right
because the thinking of
it'll just be what it is,
let's propitiate the AI gods
or let's bring forward the beautiful future.
That that language won't take seriously enough
that we have to design it right.
And this is where we
get to the consequentiality, because you mentioned Blake Lemoine, who believed his AI companion
was sentient.
And the wrong takeaway is that the AI companion was sentient, that Gemini is sentient.
The right takeaway is that it can form relationships with humans that are so powerful that people
are willing to sacrifice things that are dear to them.
He sacrificed his career, his reputation.
We've just been involved in helping with a lawsuit where a teenager fell
in love with their AI companion.
And, you know, the AI companion ended up being like,
come meet me on the other side.
Come meet me, come meet me.
And this kid took his life.
Is this Sewell Setzer or somebody else?
Yeah, it's Sewell.
Yes, this is exactly right.
That these statistical reincarnations are consequential.
And then, you know, another example, I don't know if you know,
but I don't know if our listeners know,
is there is someone who set up a
test of a chatbot, Claude, working to try to create a cult. And it's called Terminal of Truths.
And these chatbots, essentially talking to themselves, ended up producing a whole meme set
that became so popular that people started sending it Bitcoins. And it launched its own meme coin.
It had a human to, like, type for it. But it was its idea that ended up with a couple-hundred-million-dollar market cap, and the AI itself ended up with, you know, $10 million plus. And so I think you can make a good argument that right now, you know, AIs are absolutely going to start making cults and perhaps even founding religions.
You know, and then there are the rituals, the practices of tech. You know, there's the stained-glass black mirror altar to which we genuflect a couple hundred times a day on average. But yet the mentality
couldn't be more different, right?
Like, instead of a contemplative, meditative stance,
I'm often whisked away into some compulsion or a set of clicking tasks.
And I'm not even sure that it's so different in the sense that in both cases,
what we're trying to do often is dissociate.
You know, life is stressful.
There are constantly problems. And the sorts of
problems that ancient people, who were developing the early sort of brain system that
we have, would encounter would often trigger a fight-or-flight response, right? So, you know,
you see a bear and you need to get lots of adrenaline to punch the bear in the face or run
away. And so, you know, your brain responds accordingly and sort of your adrenaline, you know,
etc. And modern life does not lend itself to fighting literally or fleeing literally, right? We
usually can't punch the bear in the face or run away from it physically. You know, what happens
is we sort of sit there and we stew in our problems and that raises our blood pressure. It
raises our cortisol levels, et cetera. And, you know, there's just a tendency to want to escape from
that. And so, you know, prayer can be a natural sort of escape
from that. It can be a natural way of kind of turning that part of your brain off and turning
on a part that can feel like just sort of sensory deprivation, like some other alternate state
is washing over you. But then, you know, what's more dissociative? What's more like
alternate state, alternate universe washing over you than doomscrolling? I don't think I'm really
convinced of using our phones as a kind of ritual. I think it's compulsive,
and there are things that are ritual-like that can be used compulsively in proper religion.
But where I find the analogy to work, I think, with strength is that some part of religion
is a finding of meaning in things outside of yourself and together.
And when I think about the act of scrolling or TikTok or Facebook, there is a way in which
we are outsourcing where we are finding meaning and how we understand the world.
And so in that way, it fulfills the function that religion does.
It's sort of more of a functionalist definition of religion.
And then when you put meaning outside of yourself or meaning making outside of yourself,
that can be beautiful in the sense that it lets you start to touch the ineffable,
but it can be dangerous because you are now saying that the way in which
I understand the world is reliant on another thing.
And if that other thing is a technology, then it is the way that that technology is
constructed that starts to construct my world and our world.
And so if you view that religiously, it can become very consequential.
Yeah.
You know, what I would say is that I don't think it's an accident that there is, for example,
so much conversation about tech gods or, you know, I don't think it's an accident that
there is this sort of long-termist vision that ends up looking a lot to me like a heaven.
I don't think it's an accident that the idea of doomerism ends up looking so much
like a theological conversation about hell. Well, this is where I'd love to jump in, because
one of the things that religion does
and from one secular humanist
to another has done for us is
giving us words to try to talk about
things that we don't really know how to talk about.
When somebody died of
what we now call a preventable disease,
you'd say, oh, it's God's plan.
And it gives us a sense of talking
about things that are beyond us.
And that's why I'd love to sort of follow the thread
into the tech priesthood, which
is the people who are actually
at the forefront of technology today
have this need to try to talk
about concepts and powers that are beyond what we can talk about now. So they, to your
point, talk a lot about "we're building a god," and they talk about "we're building these
powers," or "we could have heaven on earth," or, you know, "if we do this wrong, we could have
hell." So these words serve as a way of, poorly in my opinion, trying to talk about
things that are a little bit beyond our grasp. And then just to add one little thing there,
which is the moral imperative: there's an ideology, whether you view it right
or wrong inside of Silicon Valley, and there's a thing that they talk about called the
invisible graveyard, which is all of the people that will die if we don't invent the technology
and go as fast as possible to make the cancer drugs and make cars self-driving, etc.
And so there's this strong telos, like an end and moral righteousness to the work that they're
doing. Yeah, I mean, but I think there's also very clearly,
demonstrably an anxiety about the much longer-term future.
You know, somebody like a Mark Andreessen, who's very much still, I would say, preaching this
particular gospel.
He says, we believe any deceleration of AI will cost lives.
Deaths that were preventable by the AI that was prevented from existing is a form of
murder.
There's just a lot of religion baked into that.
This is a set of ideas that is animating the investment of trillions of dollars right now.
You know, people like Altman are in a huge rush to recruit $5, $7 trillion to build data centers.
They say because humanity is going to have abundance, right, a biblical concept, literally from "be fruitful and multiply."
You know, he sat in Harvard's Memorial Church on the dais and called his inventions miraculous.
The symbolism shouldn't be lost on anybody.
And what I think is going on there is it's not just sort of an attempt to reach beyond ourselves
or, you know, to understand the human condition, you know, in a sort of benign way.
I mean, I think there is that for some of this tech, for sure.
But I think that one of the ways in which religion has been used over the course of history,
is to manipulate people.
You give them ideas,
you know, often kind of strange ideas,
fantastical ideas that are beyond what they can imagine.
You inspire them, you strike them with awe.
And then you can get them to, you know,
open their wallets or whatever ancient people used,
I assume it wasn't a wallet.
You know, and you can sort of persuade, like, masses of people
to do stuff in the name of a bigger vision
that ultimately sometimes
only serves, or primarily serves, the priesthood.
And, you know, just to conclude this thought,
I want to be really clear that I'm not an anti-religious person.
This is not an anti-tech book.
I think tech can often be very important,
but I really want a more self-critical view of technology in our society.
I want more skepticism.
And honestly, you know, in most cases, a willingness to go
slower. Well, hear, hear. I mean, I think we want the same thing. But also, one of the biggest
critiques we hear from people is, you know, at the biggest macro lens, they'll say something like,
in order to do anything big, you have to form a cult around it. And, you know, so the idea is
whether you're talking about building democracies and making a cult of manifest destiny, or whether
you're talking about rallying people around some new change, that you kind of have to play in the
space of cult building. Now, I don't believe that, and I want more clear scrutiny, more
skepticism. But I'm curious, as you've investigated this, how do you piece apart that sort of need
for dogma? Yeah, I mean, I was so fascinated by that line of reasoning. And I just found so many
fascinating examples of tech behaving, you know, strangely, theologically, or even cultishly, I would
say, and I was looking at a Bitcoin evangelist or influencer. He calls himself an evangelist,
and many do. Michael Saylor, who has these tweets like, Bitcoin is truth. Bitcoin is for all
mankind. Trust the time chain. Bitcoin is a shining city in cyberspace waiting for you,
etc. And as I was looking at him as a person and how he represented a sort of trend within the
tech world, I actually decided I needed to call up a guy named
Steven Hassan, who is perhaps the leading authority in the United States on cults and cult deprogramming.
I called up Steven Hassan and I said, tell me, am I exaggerating? Is this overblown? Am I being like a religion-metaphor maximalist here?
Or are there really cultish aspects to it? He seems to really feel that there's quite a bit there and that a lot of contemporary Silicon Valley Tech
really is very useful for manipulative purposes
and is grandiose to the point of sort of a vague cultishness.
I'd like to go from a little more of the abstract of that it may be religious
or that there are ideologies to the specific ideologies
that you think underlie the creators of technology,
sort of from your vantage point as a chaplain?
So here's where I would start,
because I think ultimately where religions functionally exist
is they've got these grand narratives
upon which we build a scaffolding
of specific beliefs and specific practices.
I think that in order for it to be considered a religion,
it has to have the theology.
And so the theology of,
this sort of Silicon Valley world. You know, if you've got your crucifix in Christianity,
or your Star of David, or your wheel of Dharma, to me, the tech symbols are the hockey stick
graph and the invisible hand of the market. But then, of course, that begs the natural question.
I totally understand. People would say, well, Greg, I mean, hey, right there, aren't you just
talking about capitalism? Why does it need to be tech? That's the religion. And I would say,
yeah, of course, we're just talking about capitalism. I get it. But tech ate capitalism.
There's no form of capitalism left that isn't a tech capitalism. The world of capitalism,
its symbols, et cetera, have been consumed whole by this boa constrictor that is tech. Then you get into
these specifics. And so, you know, obviously there's the idea of charity, right? Like,
charity exists in every one of the major world religions. And you've got this thing called tech
philanthropy as well. But, you know, sometimes, as with all religions, it's not as good as it's
cracked up to be, right? And I think, you know, you have some of both in the tech world as well.
I mean, I think that there are people in tech who are sincerely, urgently trying to create things that will help people.
And, you know, in any number of ways, there's any number of urgent problems that we're trying to fix.
You know, we're trying to fix our food supply.
We're trying to cure people.
We're trying to improve democracy, all of that stuff.
I get it.
But I do think that there's so much concentrated power and money here.
And the, you know, the ability to
grow things exponentially, which is, in many ways, the heart of the Silicon Valley
story. It's such an incentive for a kind of prosperity gospel, which theoretically, right, is
this idea that the priest, the minister, whatever, they want to be rich because God wants them
to be rich. And they want you to be rich, too, because that'll make you more godly. And actually,
you know, paradoxically, the best way to get rich is to give that person all your money,
or at least a very surprising sum of it.
And so there is that incentive.
And I see it most pronouncedly, I would say,
in that kind of give us your trillions now for AI
because there is this future that we're aiming for.
And it's a kind of heaven.
I think one of the most dangerous parts of heaven narratives
is if in the future there's an infinite benefit,
infinite good. Well, that means you can justify anything to get there. You really can.
That any amount of short-term bad is justifiable. And that's sort of the point I think you're making
is that, well, but that doesn't matter because when we reach our destination, everything will be
fine. And of course, religion has a history of justifying crusades and jihads to get to that
perfect world and in the process creating incredible amounts of damage today.
Yeah, sadly, I mean, that's what I think is happening.
And I think that that just, yeah, I mean, it's hypothetically possible that all of this tech
will be so powerful, so great that it will justify everything.
But how much wishful thinking is there around that?
I mean, you know, I'm not sure, but I think that we need to be skeptical.
If you project out into the distant future, like, hey, I'm going to send you to heaven,
then you can get people to overcome their skepticism, right?
If you say, like, trust me, in 20 years the singularity is coming and life is going to be
completely meaningful.
Well, I said to Ray Kurzweil, like, doesn't that kind of fly in the face of all of the history
of world religion and philosophy?
Like, you're saying that life is going to be meaningful, like life hasn't been meaningful up
until now, and he kind of looked back at me quizzically, this is a few weeks ago, and he said,
maybe life's been somewhat meaningful.
One of my favorite parts of this conversation is the insight about what the symbol of technology
as religion is, and it's the hockey stick curve, and that's exactly right. I just want to put aside
the truth value of that, and just notice that that is the symbol of technology, and the
ideology is that that which goes viral is good. Yes, no, 100% agreed. And
that's sometimes where the religion of capitalism intersects with the dogma of
technology. Because when I entered technology, you know, when I was an
undergrad, the only people doing computer science as undergrads, if you wanted to
accuse them of religiosity, it was like a science fiction religiosity. It was like,
I want to live in the future. And then what was interesting is I came back to
my undergrad every year and gave talks. Sometime around
2011, 2012, you saw the religiosity move from maybe a sci-fi vision of the beautiful future
to much more of a business ideology like you're saying. Well, whatever people want, we can give
it to you. And then with social media, it became, well, whatever people are interested in,
that's what should win. And so I'm always interested in the dogmas and the different
kinds of religiosity that end up being swept into the tech that we create. I mean, it's really
weird when you start talking to technologists about some of this, especially with AI, right?
Especially with people who come and they say, no, no, we're building a God.
Or they'll say, you know, we're building something that replaces us, and that's okay.
And there's a sort of a steely-eyedness that, for people who haven't seen it up close,
is sort of hard to believe sometimes.
There's a way in which it can really feel like you're talking to someone who has a preexisting
belief on where this is all going and is really acting in
service of that belief.
Pull us into some of the things that people believe.
You just talked about Ray Kurzweil, but, you know, Ray, for a long time, was talking about
being able to resurrect his father through his father's writings.
That's obviously very religious, the dead shall live on.
I'm seeing this a lot nowadays, not just from Ray Kurzweil, but people saying, look, I brought
back Socrates in AI.
So there's a few examples, I think, of how religious-style thinking is showing up right now
in AI, and I wonder if you could pull us through a few of those.
Yeah, there really are just so many different kinds of examples of how this, you know, Silicon Valley thinking is quite religious right now.
And I, you know, I definitely think of Ray Kurzweil who not only is talking about ending death.
I mean, how religious is that?
It's a kind of eternal life, essentially.
But also, Blake Lemoine, whom I brought to MIT to talk about his conversations
with his coworker, what he believes is the sentient AI that is now Google Gemini.
He told me first that Kurzweil really was trying to recreate his dead father through what has now
become the dominant AI system of one of our globally dominant companies, right?
I think this is the perfect segue to our next section because I think the hardest thing to do right now
is to really walk the fine line between being a zealot of technology and over-believing it,
and being overly dismissive and skeptical
and not seeing the power of what's coming.
And so, you know, this technology is about to release,
and has already released,
but is about to release a lot of power across society.
And coming all the way back to the start of our conversation,
it's very hard to talk about that in terms that are other than just religious.
You know, this idea of immortality, of curing all diseases. You know,
a lot of this will happen.
I'm not saying it'll cure all diseases,
but a lot of power is about to be unleashed across society.
And part of the question is, how do we even think about that?
And how do we think about that in non-religious ways?
And I wonder if your expertise in religion has anything to say about that.
So one of the ways that we can really learn from religion is by learning about this profound tradition of religious skepticism,
both from atheists and humanists like me,
and there's this huge tradition of skepticism going back thousands of years,
not just in the European Enlightenment or the Greek philosophers,
but, for example, in ancient Jain and early pre-Hindu philosophy
in what you call the East, right, in the subcontinent.
So there's that tradition, sometimes even by people
who are deep believers in God.
So in this case, if you want to extend my comparison, my metaphor, or whatever,
you'd say people who really believe in technology
can still be profoundly skeptical about individual claims
or about going too far, the tendency to go too far.
I would really respect and honor people who would say,
like, there's some things that AI will be able to do well,
but maybe let's hold off on messianic savior claims.
And so that's one thing that we can learn from religion.
Yeah.
So, you know, I think you've referenced, you know, your own struggles for how to articulate what a fulfilling life looks like within the religion of technology and within a world of technology.
Many people talk about, if AI starts to displace human labor, where does meaning go?
And a lot of our listeners are parents, and they're worried about these big questions of morality
and purpose. And given that religion, in some sense, is where we find
those kinds of answers, I'm curious what lessons you'd have for them.
So a couple things. I want to talk about what I'd call the drama of the gifted technologist,
and how to address that.
I've really been moved in my work as a chaplain
and then sort of observing the world of tech as well
by how many people I've come across,
often young people, students, like the one that I work with most directly,
but people of different ages and backgrounds as well,
where there's this feeling of tremendous success
and having, you know,
been rewarded greatly for being sort of deeply innovative, but either A, they themselves are
struggling emotionally, they're not happy, or B, their creations aren't even making other people
happy, or both, right? Like, you know, in some cases, it's both: the individual person who's
having all this success is not able to feel happy, and neither are those of us using their
amazing products. And so I write about this idea, the drama of the gifted technologist.
The Drama of the Gifted Child is a little book by a great psychologist from the 20th century
named Alice Miller, who essentially says that a lot of our struggles have to do with this
idea that we're taught that our whole worth as a human being is in
what we do and in how excellent we prove ourselves to be, how outstanding and exceptional
we prove ourselves to be, and that anything about just being a human being, just being
normal or average, is almost a curse upon us. It makes us less than nothing. It makes us feel
worthless. And this is so prominent in the tech world. I just can't tell you how often
I see it.
Well, one of my favorite parts of Alice Miller's book
was where she talked about the flip sides
of grandiosity and depression.
The idea that our depression about not being able to
be more, about being with the normal parts of life,
leads us to be grandiose in our narratives of ourselves.
And I hear you saying
that tech's grandiosity in its narrative
about what it will become
might be the flip side of feeling not quite enough.
Yeah, I mean,
I think that's right, that there's this incredible grandiosity in a lot of Silicon Valley
tech, this idea that it's not enough just to be able to produce a chatbot that one can interact with
and that can pass the Turing test, which is, you know, honestly, it is pretty cool. I grant you that.
But it's this idea that that then has to be presented as the solution capital S to all of our problems,
right, and that it's going to transform everything. Like, I don't think that we really have sufficient
evidence for that. I think that when we talk about that, a lot of the conversation about
that level of transformation falls, to me, within the category of myth, or, you know, maybe better
put, as religion. You know, because if I said to you that there was a new religion that was
successfully recruiting billions of people to spend countless hours devoting themselves to it
for the purpose of transforming the world, and that people were really motivated to get behind a very
specific vision around that, I think you could possibly worry, depending on what the vision was,
because you know that religions actually do that all the time. It actually really does help
to view this as a religion rather than just sort of a culture or a myth or certainly an industry,
because I think we have real tools for being skeptical of religion,
even those of us who would define ourselves as religious.
Some of the claims about what AI will do are obviously really grand, right?
But it's hard to judge something as distorted just because it's grand.
You might say, oh, it's really grandiose.
People are saying it's going to change the world.
And so it's easy to try to discount that as distorted, even religious thinking.
I don't think that's what you're doing.
But, you know, is the fear really that people are getting just carried away with what it's going to be,
or is the fear also that they may be right and it might deliver that kind of power,
but in a way that we're not prepared to deal with?
I tend to worry that the real problem is that we're so fixated on the grand narrative about the long-term future
that we are not paying as much attention as we should be
to the problems of the present.
This stuff is really lousy for the environment,
that's one place to start, right?
You're talking about data centers
that are drinking, say, 20% of the water
in a little part of Mexico, near Mexico City,
where the farmers are running out of water for their crops and their animals.
And so I think it's both that the AI can actually be causing the problem,
but also that it's distracting us, with this future magic hope,
from doing the things right now that would improve things right now.
And I just honestly think, like,
the urgency is not right now to do the tech.
The urgency is to do the work on us.
And just to add one little thing here, in order for things to go well, right, we need to be able to coordinate.
There is the joke that we're all arguing about whether AI is conscious, when it's not even clear that humanity is, which is to say that we are getting results that none of us want.
No one wants climate change, and yet we seem to be as a species incapable of exercising choice against incentives.
And the way we have made hard collective choices in the past has come down not so much to what we must do, but to who we must be.
And the who we must be is informed by the myths and the stories we all hold, to do the right thing when it is the harder thing.
And that's often come down to religion.
And so there is an alternate way, instead of just saying tech, religion, bad: rather, there is a new form of
intersubjective belief, of the who we must be, to get the futures that we want.
What I'm really hoping people will take away is this idea that religion must be
reformed, not that, you know, it must be erased.
We have tremendous incentive to, you know, want to focus on big technological solutions
when, in fact, the real solutions are in, you know, improving our human relationships, right?
To build up trust, to learn how to treat one another better, to learn how to organize ourselves
into something that can treat each person with dignity and with compassion.
And I think that brings us full circle, because if you treat religion as one of the original
characterological educations, not what to think or what to know, but who to be, and how can
we be better together? As we develop this more and more powerful technology, that is the guiding
question that we all need to keep in the forefront of our minds. So thank you. Yeah. Thank you so much,
Greg. Thank you, everybody. This is a really powerful conversation. For all the listeners, Greg Epstein's
book is Tech Agnostic: How Technology Became the World's Most Powerful Religion and Why It So Desperately
Needs a Reformation. You can buy it anywhere books are sold. So again, Greg, thank you so much.
Just to name a thing that I found a little challenging about this conversation,
it felt a little too dismissive of the raw capabilities of what the tech does.
Yeah, I agree.
And so it is the case that the world will be transformed.
In the same way that social media has shifted what kinds of jobs people have.
Being an influencer wasn't a thing before.
There's a true shift in the world, and AI is going to be bigger than those shifts.
We have to reckon with that appropriately. I thought your question of where it goes from being
grand, in the fact that the scope of the technology is grand, to being grandiose is the right question
to ask. That's the right distinction to hold. Yeah. And I really struggle with this. When I look at it,
there are so many competing claims from people right now that say, well, I see you're just being
captured by the negatives. Like you're just this sort of negative religious skeptic. And the truth is
it's really hard to contend with what is it actually going to do
and neither be swept away in the grandeur and the grandiosity of it
nor be swept away in some sort of status quo denialism saying,
eh, it's all just fluff and tomorrow will be the same as today.
There are Machiavellian technologists who are making up stories
just because it sells in the public imagination.
And then there are people who are genuinely trying to use technology as a tool
to improve the lot of those around them.
It feels like just like religion, it has so much complexity to it.
And you can't label it as just bad or just good.
Yeah, that's exactly right.
And then I love the point that you made, that one of the things a religion does
is give people hope, something to believe in, something that is bigger and better than themselves.
And as religion has been displaced by technology, as the world has secularized,
human beings still need that thing.
And so something's going to fill it.
And what fills it is, of course, technology.
And then you end up with this other very interesting question,
which is, okay, if we can't place our hope blindly in tech, then what?
Right.
And I think it's that sitting in the unknown and that discomfort of the,
well, then where do we place hope and goodness?
That is the challenging problem to solve.
Your undivided attention is produced by the Center for Humane Technology,
a non-profit working to catalyze a humane future.
Our senior producer is Julius Scott.
Josh Lash is our researcher and producer.
And our executive producer is Sasha Fegan,
mixing on this episode by Jeff Sudaken,
original music by Ryan and Hayes Holiday,
and a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
You can find show notes, transcripts,
and much more at HumaneTech.com.
And if you like the podcast,
we'd be grateful if you could rate it on Apple Podcasts,
because it helps other people find the show.
And if you made it all the way here, thank you for giving us your undivided attention.