StarTalk Radio - AGI, Immortality, & Visions of the Future with Adam Becker
Episode Date: November 28, 2025

Are we making science fiction a reality? Is that a good thing? Neil deGrasse Tyson, Chuck Nice, & Gary O'Reilly and guest Adam Becker, science communicator and author of More Everything Forever, discuss sci-fi dreams, tech-bro promises, and the real science shaping our tomorrow.

NOTE: StarTalk+ Patrons can listen to this entire episode commercial-free.

Thanks to our Patrons Jeremiah Washington, Lawrence Burr, PAscal, Melissa Lange, Noah Naccarato, christian lopez, Matthew Thomas Dunn, thalian, Morten Leirkjaer, Jonathan Markey, Expo, Heather, Brandon G, S Gr, carwingo, Neil, Micheal Rogerson, Torgeir Sundet, Nina (aka HelloDaydream) Scott Polaske, Christopher Branch, Matthew Tarter, Jeff Dasteel, Matthew Light, Dj Stuffin, Virginia Walters, Pablo Rojo, Don T, Jacob Searcy, Jeffery Marraccini, Madam Power, Bartosz Jaworski, Jonathan Amir, Brandon D, Zdeněk Bučko, Mason, Benedikt Hopf, L4NK, Susan Baumgartner, Austin Browning, Kari Windham, How to Poe, Richard C, Margie Baker, SubTheGator, Harry W Peters Jr, Sean, Ravi Kapoor, Diego Sanz, Jeremy Malli, Walter Mashman, Arthur Cousland, Jordan Dck, Ryan Kroboth, Daniel Carroll, Bartlomiej Lepka, Christopher M, Starry Dust, Kingfisher9000, Pdub, Mat Cauthon, Leithor, Wesley Wright, MJ Ladiosa, Minty FreSH RandoMness, Gilberto Garza, Daryle Lockhart, Lyric Kite, Sasquatch, Carolyn Duvall, Heather Renn, DavidX, Mr. Thrasher, and Tracy Boomer for supporting us this week.

Subscribe to SiriusXM Podcasts+ to listen to new episodes of StarTalk Radio ad-free and a whole week early. Start a free trial now on Apple Podcasts or by visiting siriusxm.com/podcastsplus.
Transcript
Love me some future talk.
But they spook me, though, with where they take the end of civilization.
But we don't know whose hands are on the steering wheel.
We don't know who's shaping this future, and that's why there's concern.
Well, I know who's saving it, and I'm scared to death.
All right.
It's you.
Let's watch Chuck be scared to death.
As we discuss all the ways tech will be shaping our future.
Coming right up, StarTalk, special edition.
Welcome to StarTalk.
Your place in the universe where science and pop culture collide.
StarTalk begins right now.
This is StarTalk, special edition.
Neil deGrasse Tyson, your personal astrophysicist,
and when I say special edition, it means I turn to my right.
And Gary O'Reilly is sitting there.
I know.
Gary.
Hi.
Where'd you get your British accent?
Stole it.
Just a thief.
Chuck.
You're otherwise known as Lord Nice.
But we can call you Lord of Comedy.
Lord of Comedy.
Can we do that?
Okay.
Let's do that.
So today we're going to explore a vision of the future. And everybody's got their take on the future.
Yes.
Everybody's got one.
But they all have different takes, because they come in from a different place. So you've got to hear it all if you're going to assimilate it into something that you're going to take action on.
True, yeah.
So either make something happen
or prevent something else from happening.
Yeah.
So set us up, Gary.
All right, so what does the future hold for us?
That'll include scientists,
science fiction authors, tech CEOs.
And the so-called futurists.
Everyone has their own idea of future technologies: visions of AGI, nuclear fusion, the singularity, transhumanism, living on Mars.
We've got to get to the moon first.
No, you go straight to Mars.
Do we?
Don't get me started.
Get your ass to Mars.
And there you have it.
Stuff we talk about all the time on StarTalk.
And in the face of new technological developments,
we're quickly going from science fiction to science reality.
But are we headed towards the utopian?
Or are we headed towards dystopia?
We'll get into: are these technologies as close as they claim? Is science fiction always a guiding light, or can it be a blueprint for those in power?
So on that note
So who do we have today?
Adam Becker, Adam, welcome to StarTalk.
Thanks for having me. It's good to be here.
All right, you got a PhD
in computational cosmology.
Love it. That was back in 2012.
Yep.
And you wrote a book in 2018
called What Is Real?
Yep.
That's audacious.
You know, the unfinished quest
for the meaning of quantum
physics.
Sound like there's a little bit of philosophy in there.
Yeah, yeah.
Or a lot of philosophy.
There's some.
Yeah.
And you just came out with a new book because you've been a writing science popularizer
maniac ever since your PhD.
Pretty much.
Yeah, yeah.
So here's the title.
I love this, More Everything Forever.
AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity.
Oh, I got a better title.
We're effed.
You know, you should have had that title.
Yeah, you know, we considered it.
We just didn't think that it would really, you know, sell.
So my researchers told me that we've corresponded before.
Because I've only just met you now.
Yes, that's true.
So what?
So they told you we corresponded, but they didn't tell you.
Wow.
Okay.
So what happened?
So they set me up.
That's what happened.
They're so unlike us.
Oh, God.
Well, what happened was...
What happened was.
I was a snot-nosed kid in grad school
and came to visit the museum
and noticed what I thought was a mistake
on one of the plaques.
And so I emailed just the general astronomy department email
here at the Rose Center.
And then two weeks later, you wrote back.
Oh.
And what did he say?
Well, I'm sure I would have been polite.
Was there a mistake?
And did Neil school you if there wasn't?
Whether there was a mistake or not was a matter of some debate.
Oh, really?
Yeah.
So the question was the size of the universe.
Oh, yeah.
Oh, okay.
There it is.
Yeah.
Well, that's still a debate today.
Well, no, not in the way that he's described it.
Kind of like this.
Yeah, yeah, yeah.
So, I deal in observables, okay, as do many practical scientists.
Right.
So when you speak of what the universe is doing, you speak of what you see it's doing.
Right.
And we can see galaxies whose light has been traveling for 14 billion years, 13.8.
Right, right?
And so we will loosely say, well, that's the age of the universe. But when we speak of the size,
we can be a little sloppy
and say it's
13.8 billion light years to the
edge of the universe. But that's
not strictly accurate.
What you have to do is, since then,
the universe has been expanding.
So where's that galaxy now?
It's like 45 billion light
years away, but you can't see it.
So you have to stick it into a model
of the expansion rate of the universe and come out
with a number that you cannot observe.
So you're being snot-nosed about that.
But that's fine. Tell me I was polite, because I think
I think I'm polite. You're polite. You're polite. We had a little back and forth.
And eventually, eventually, I think you were getting a little impatient.
Oh, really? And you said, uh, why don't you, you know, make a presentation of this to a wide audience in a way that you think suits it best?
I remember that correspondence now. Okay. Okay. I get it. You said that. Okay.
And then I went up and did a podcast about it and sent it to you.
Okay. All right. So in your book, Overlord.
space empires you just go all out yeah and you're coming to it from a physicist with a philosophy flavor yeah
so you're going to see this in ways pure tech people wouldn't or politicians or just regular
everyday folk walking up and down the street so how did you prepare for this book i read a lot of
really bad writing by you know tech CEOs and people defending tech CEOs online you know writing
long essays and books about why the future is inevitably going to be all about super
intelligent AI, why the future is going to inevitably be about...
These are tech people writing all this.
Yeah, exactly.
Some of these things, you know, were ideas that I had the expertise to say, okay, no,
that's not true and here's why.
But some of them were ideas about, like, you know, biology or areas of physics that I don't
have expertise in or, you know, other things.
And so then I went and interviewed a bunch of people who have expertise in those areas.
Now you're playing journalist in that capacity.
Yeah, yeah, yeah, yeah, yeah.
And, like, read books on those subjects and, you know, pulled out what I needed and stuff like that.
And then I tried to interview, you know, the tech CEOs themselves.
And almost all of them said no.
No, right.
Yeah, because they're looking at you as somebody who is intellectually honest with some integrity.
And they're like, we can't talk to you.
Okay.
because they know they're full of crap.
Yeah, well, I think that they just didn't see any reason to, right?
You know, like, I was very honest.
I said, you know, like, this is a book
that's going to take a critical look at you.
That was your problem.
Well, if you had gone in and said,
I'm enamored of the fact that AI
is going to be such an integral part
of the next chapter in human history
and that you guys are the progenitors
of this amazing tech.
They would have been like,
Come on in, let's talk for a second.
No, you don't need an appointment, stop by any time.
That was your first mistake.
Yeah, man.
Well, but, you know, I got journalistic integrity.
All right.
Well, good.
Well, look, let's explore some of the scenarios.
Okay, please.
That are going to be potentially the reality of us as a human race.
Just go right on down the list.
Yeah, the laundry list.
Okay.
Well, let's look at Mars in 2050.
Oh, yeah.
Are we saying maybe? Maybe not.
You're kidding me.
Oh, that's definitely going to happen.
Yeah, Elon Musk.
He has said he wants to put a million people on Mars by 2050
to have a self-sustaining civilization that will survive there
even if the rockets from Earth stop coming
because there's been an asteroid strike or nuclear war
or something here.
That's definitely not happening.
There are a lot of reasons why that's not happening.
Getting anyone to Mars by 2050
and bringing them back alive or just having them live there for a while,
that would be incredibly difficult.
The challenges just to put boots on Mars, the way that we did on the moon, are enormous, right?
Just learning how to keep someone alive in deep space that far away from Earth for as long as it takes to get to Mars, stay on Mars, come back.
We do not know how to do that yet.
Chuck, that's the problem.
They want to put boots on Mars instead of sneakers on Mars.
You get a sneaker contract.
That'll pay the whole way.
Nike would have been there by now.
They just do it.
Yeah, yeah, exactly.
Excellent.
Just do it.
Well played.
It ain't about boots.
It's about sneakers.
What are the biggest challenges of going that far into space?
Is it radiation?
Yeah, there's radiation.
And that's not just when you're in space.
It's also when you're on Mars, right?
You know, the two things that primarily protect us from radiation here on Earth
are the Earth's magnetic field and the thick atmosphere that Earth has.
Mars doesn't have either of those things.
So when you're on the surface of Mars, you're getting pretty much the same radiation dose
you do like out in space and that's not good right you know like the thing that i tell people is
the movie the martian is science fiction one of the things that's science fiction about it is
if mark wotteny really you know had to do all the stuff that he did in that movie he'd come home
and he'd be dead of cancer in a couple of years because he had too much radiation exposure hanging out
in marxam
I'm Brian Futterman, and I support StarTalk on Patreon.
This is StarTalk with Neil deGrasse Tyson.
What about the ISS?
If Scott Kelly could stay up there for a year.
One of the twins.
One staying on Earth and one.
Right?
Yeah.
Why couldn't you just extend that for whatever time necessary to go to Mars?
Even if it's not to live there, if it's just to go there and dig a hole and come back?
Right, so there's a couple of things.
First of all, on the ISS, they're still in the Earth's magnetic field.
They still have a bunch of the shielding.
Oh, wait, and what's that called, Neil?
Wait, the field that goes all the way out like that?
Oh, yeah.
Oh, it's called the...
Magnetic field.
No, it's not the magnetic.
It's the magnetosphere.
Yeah, think of X-Men.
Yes, the magnetosphere.
Yeah, yeah, yeah, yeah, exactly.
It is like the X-Men.
Yeah, they've still got that protection.
Also, if something goes wrong on the ISS, they'll be back on the surface of the Earth.
in a matter of hours.
Like, they can just abort and come back home, right?
At most.
I mean, you come out, you're down within a half hour.
Exactly.
Yeah, yeah, yeah.
The hours is, you want to line up so you don't land in the middle of sharks.
Right, exactly.
Yeah, yeah, yeah, yeah.
So, like, you can get out easy.
And you can also have a real-time conversation
with people on the ground.
Okay.
Because they're, you know, they're not that high up.
And so the speed of light delay
with the conversation doesn't matter.
On Mars, you know,
it's a minimum of something like,
I think eight minutes each way
and a maximum of something like 15 or 20
one way.
And so if you send out a message,
you are waiting at least
15, 20 minutes to get a message back.
Better be a good message.
Yeah.
It better not be like,
so how's it going, over?
Yeah.
Put some content in there.
Or watch out for the cliff.
Yeah, exactly.
Yeah.
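For reference, here's a quick back-of-the-envelope check of the light delay being described. The distances are rounded illustrative values, since the real Earth-Mars separation varies continuously along both orbits; treat this as a sketch, not mission numbers:

```python
# One-way light travel time between Earth and Mars at the extremes of
# their separation (rounded, illustrative distances).
AU_KM = 149_597_870.7      # kilometres in one astronomical unit
C_KM_S = 299_792.458       # speed of light, km/s

distances_au = {
    "closest approach (~0.38 AU)": 0.38,
    "farthest, behind the Sun (~2.5 AU)": 2.5,
}

for label, d_au in distances_au.items():
    delay_min = d_au * AU_KM / C_KM_S / 60
    print(f"{label}: about {delay_min:.0f} minutes one way")
# -> roughly 3 minutes at closest approach and around 21 at the far
#    extreme, so a question-and-answer round trip can exceed 40 minutes.
```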
And the other thing is, like,
if you have a problem on the surface of Mars
and you want to come back,
that's going to take you at least nine months, maybe more, if you happen to be near a launch window where the Earth and Mars are, like, in the right positions. If you're not near a launch window, it could be well over a year before you can come home.
A full-up round-trip mission to Mars with ideal launch and return parameters is multiple years.
Yeah. Right.
But you can get to the moon and back in a week.
Yep.
In like a news cycle.
Yep. Right.
So if we overcome the logistics of getting from Earth to Mars, if, big, big if, where are they going to live? Because they're not going to go out there and start building.
Yeah. And why don't you just build a little sort of half-underground thing that shields you from, sure, from radiation?
Yeah, well, so then you have other problems, right? You know, there's no air. You got to bring in oxygen or, you know, do some sort of reaction to make oxygen on the surface, which, yeah, you can do that, but it's not the easiest thing. You've got to bring in all your food. You can't grow it there. The Martian surface, the dirt on Mars, is filled with toxic chemicals.
You're going to have a hard time getting it out of stuff
because it's very fine. It's not...
That's going to be here on Earth soon, too.
So let's be for real.
But we know you can grow poop potatoes on the...
Right, yes.
We know that. We know that.
Yes, exactly. Yeah, there was a proof of concept
in the movie The Martian.
Duh.
No, but actually, it's funny, though, the guy who wrote the book,
what's his name?
Andy Weir.
Andy Weir, yeah.
In fact, we had him on the show.
Yeah.
He's in our archives.
He has said that, you know, the discovery of these particular poisonous compounds in the Martian surface called perchlorates,
he didn't know about that when he wrote the book because it wasn't widely known.
And so now we know if you tried to, you know, farm poop potatoes on Mars, they'd be poisonous.
Yeah. So that's the unknown unknown.
Yeah. Yeah. Right. Exactly.
And that's still out there. Okay, that's not going to work.
We're not thinking that for some time.
Functional immortality.
Yeah.
And there's a lot of ways we can get there.
I mean, the biological immortality of growing organs in pigs and things
and then transplanting is one thing.
But are we getting towards singularity?
Yeah.
So, I mean, the biological replacing organs thing, you know, you can't replace the brain.
Well, not yet.
Yeah.
I mean, not yet.
You'll never do it with that attitude.
Right.
Sonny.
That was more a British approach to things rather than American.
Someone needs a better attitude about things.
But yeah, this idea of the singularity
that we're going to get to this point
where technology in general and AI in particular
gets faster and faster and smarter and smarter
until it like gains godlike powers.
It's a science fiction story.
What does it have to do with living forever?
Well, so the idea is that then you get
this godlike AI that like grants us immortality.
It has like essentially magic powers.
Or, you know, it finds a way to take human minds.
I mean, it's just smart enough to figure out how to make us live forever.
Right, okay.
It can solve the problem.
Yeah, immortality.
Yes.
I got to.
But whose mortality problem is it going to solve?
Yeah.
Well, it doesn't have one.
So, I'm just saying, are there a select few or is this open for everybody?
Oh, well, we know for a fact that it's going to be for a select few, and it's going to be for the people who are the gatekeepers to AI.
We're already seeing that now, but go ahead.
Well, but the other thing is that the whole idea is kind of, you know, nonsense to begin with. Like, this idea of the singularity is, well, it's based on a few really serious, like, flawed ideas. First, this idea that there is this, like, single thing called intelligence, and you can just ramp it up or down in a computer and it can just make itself more and more intelligent.
That's not really how intelligence works.
Intelligence is like a really complicated thing.
It's not one number.
And also the usual way of talking about the singularity, like the way the main popularizer of the singularity, Ray Kurzweil, has.
Who's been on the show?
Who's been on the show?
In our archives.
I do these little commercials.
I love that.
Okay, you're gone.
But yeah, Kurzweil, you know, he says it's...
And he came out with a second book.
Yes.
The first one was the singularity is near.
Yes.
You know the title of his second book?
Yeah, the singularity is nearer.
Nearer.
Yeah.
That's so funny.
I tell people that they don't believe me.
I'm like, go look at it.
Yeah, there is.
His next book coming out is going to be called Almost There.
Yeah.
No, he thinks that, you know, Moore's Law, this idea that computer chips are just going to get faster
and more powerful and, like, double in speed every 18 months.
He thinks that this is this, you know,
specific instance of a more general
law of accelerating returns
in technology and in nature
and he says he's traced it all the way back
to the beginning of the universe
and that it shows that a singularity is coming
in like 2045 or something like that.
Precisely. Yeah, precisely.
On October 12th.
Oh, man, that sounds to me like the "end is near" people.
Yeah, I know, it is.
To me it's like the people who are like, I don't need a bank account, you know, Jesus is coming back and everything.
You know, it's funny, "is near." The End Is Near. That's what he should have called it.
No, so it's funny that you say that, right? Because he's got a picture of himself in The Singularity Is Near with one of those, like, poster boards on him that says the singularity is near.
Oh, oh.
And there's an AI research group that's inspired by these ideas of a singularity, called the Machine Intelligence Research Institute, MIRI.
They don't give their employees 401(k)s
because they think the end is near.
That's some cheap-ass.
Yeah, I know, right?
Wow.
Yeah.
Well, okay, I still want to get to the immortality thing.
Yeah, yeah.
Because you haven't addressed the fact that right now, with or without AI, there's a lot of research on not just replacing organs, though that might be in the offing, but delaying the aging functions of yourself.
Totally.
That could work out to extend human lifespan or health span a certain amount.
For a long period.
I like the health span.
Yeah, yeah.
And it really is about the health.
That's a good word.
Yeah, yeah, yeah.
What do the Galapagos, what do they live to be, like a hundred and...
The tortoises?
The tortoises?
Yeah, like 200 or something, buck 80 is there.
They have AI, so that's how they?
Remember we spoke with Venki Ramakrishnan?
He was telling us about the Greenland shark.
Yes.
Being about 800 years of age.
Yeah, that's crazy.
Yeah.
Yeah, it is possible that some, you know, biotechnology will be developed
that will radically extend human lifespan or health span.
maybe, yeah.
But what these guys are talking about with singularity,
they're generally not talking about that as the end game.
The end game they have in mind
is not just an extended lifespan,
but real immortality by uploading their consciousness into a computer.
I was going to say that's really where this is going.
That's the immortality.
Immortality of your mind.
Yeah, yeah, yeah.
Forget your body.
Before we get to that, do we visit transhumanism?
Do we get, and how are we defining it?
What is transhumanism?
That's it, what I'm just saying?
Define that.
It's this idea that you can use technology to transcend, like, the limits of human biology and physics.
Aren't we kind of already doing that, which is why we live twice as long as people 150 years ago?
Yeah, no, there's definitely a sense in which...
If we told them what we're doing, what we know about nutrition, vitamins, they'll say, what's a vitamin?
Right, we've got vaccines.
We got this, we got, what's a vaccine?
Yeah.
Right?
Aren't we already transhuman, compared to the age nature would require us to be dead at?
Totally, yeah. Like, look, I think that we have used technology to make many things much better about being alive. Like, that's just true. The question is, does that trend continue indefinitely?
Right.
No, because RFK is going to make sure we go back to when we lived half as long as we do. That's what's happening.
That's RFK Jr.
Yes, RFK Jr.
So if we go to this
uploaded consciousness
and that becomes
reality, that just doesn't
exist without a power source.
Right. The thing
about the singularity
and like Kurzweil's idea
about like this accelerating returns
and Moore's Law just going on forever
and you know this power source thing, right?
The idea that it would need increasing
levels of power as well
And so this leads to this sort of exponential drive for materials and power.
And the thing that Kurzweil forgets is exponential trends are not like laws of nature.
The law of nature about exponential trends is they end.
Right.
They have to end.
And it's because of ultimately limited resources, like energy.
Right.
And so.
Although.
Yeah.
When we talk about how much longer a charge in our computers lasts today compared with the early days of laptops.
Part of that is better batteries, but also part of that is more efficient chips.
And when we get to quantum computing, where much more computing happens in much less time, with much less of an energy draw, it could be that we're coming at it from the other side, where the energy needs are dropping, thereby not requiring the power supplies otherwise necessary.
In my memory, not your memory, I got a few years on you, a room this size was necessary to cool a computer, otherwise a computer would overheat, and the computer is doing like four function mathematics.
So the efficiencies matter, all these tubes that had to be kept cool.
So it's not obvious that it's a linear exponential. Can I say that?
Where the exponential is just going to hit a limit.
Because you can come at it from other directions.
Yes.
However, the other thing is, though, in nature, with exponential acceleration, the law of diminishing returns is more likely than a law of exponential acceleration.
Well, no, that's actually exactly right.
Yeah, because there's, you know, if you look at the history of Moore's law, like how it is that the semiconductor industry.
Named for?
Oh, yeah, named for Gordon Moore.
Co-founder of Intel.
And if you look at how Intel and other, you know, semiconductor companies actually made the chips smaller and faster over that time, you know, it's not a law of nature.
It's a decision, a business decision that these companies made.
And in order to keep that trend going, they had to invest more and more and more money just to keep the same sort of level of doubling to keep that exponential trend going.
And eventually, it did stop, right?
Moore's Law is done.
It's over.
Because you can't make silicon transistors smaller than an atom of silicon.
Right.
Yeah, and what they're doing now is just adding more chips.
Right.
So the more powerful computers are not smaller and denser.
They're just bigger now.
Yep.
Right.
Yeah, and they're putting them on top.
You're stacking them.
Yeah.
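To see why a shrinking-transistor exponential has to end, here is a minimal sketch. The starting feature size and halving cadence are assumed, illustrative round numbers, not industry roadmap data:

```python
# Halve a transistor feature size on a fixed cadence until the next step
# would be smaller than a single silicon atom (illustrative numbers).
feature_nm = 10_000.0   # ~10 micrometres, a rough early-1970s feature size
atom_nm = 0.2           # approximate diameter of a silicon atom
year, years_per_halving = 1971, 4

while feature_nm / 2 >= atom_nm:
    feature_nm /= 2
    year += years_per_halving

print(f"halving stops around {year}: the next step would be below one atom")
# ~15 halvings in total: the trend runs out after a few decades, not forever.
```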
So the solution in the minds of these tech billionaires is to arrive at a superintelligence
to get at an AGI.
Yeah.
Yeah, an artificial general intelligence. And Altman is saying within the next two years, that will be achievable?
Yeah, two, four, I think he said, yeah.
All right, so they're looking at that as being the solution to this problem where we're saying we're not sure if it will be exponential.
We're not sure where the end point is.
They're looking at it as the solution to every problem.
None of the tech bros have a degree in physics the way you do.
So what are you bringing to the table that they don't see?
I mean, they believe that AGI...
I mean, Altman has said that AGI is going to solve every problem, including, like, global warming, which is crazy.
Why is it?
Well, because...
If it's smarter than you, and you can't solve it, why is it crazy to think it could solve it?
Well, first of all, the artificial intelligence systems that they're building now are just drawing more and more and more energy.
If you did build one that could solve global warming, and you turn it on and said, how do you solve global warming?
I'm pretty sure the first thing it would do is say, well, you shouldn't have built me.
Yeah, it turned me off.
Yeah, turn me off.
Yeah, turn me off.
Yeah.
That would be a good test of its own self-preservation.
Right.
You are causing most of our global warming.
What's the best solution?
Does it turn itself off?
But I mean, the other thing is that we don't need AI to tell us how to solve it.
We already know what the solution is.
Wow, yeah.
The issue is not that insufficient intelligence has been thrown at the problem.
The issue is primarily not even a technological problem at all at this point.
aside from carbon capture, the main issue is...
No, it's human behavior.
It's greed.
It's greed.
Yeah, it's greed.
It's greed.
Chuck.
Greed is good.
Oh, God.
Yeah.
But the, yeah, the other thing is just that when Altman talks about, you know, he's talked
about things like, oh, AI means that in 10 years, college graduates are going to be getting, you know,
cool jobs exploring the solar system, right?
And I can just look at that and say, well, that's...
bullshit. Or, you know, he says, AI is going to, you know, discover new laws of physics and that's
going to remove limitations that we have in the world today. And I'm like, well, discovering new
laws of physics, I mean, putting aside whether or not the AI can do that, that does not
always remove limitations. Sometimes new laws of physics, in fact, a lot of times create a limitation.
Create a limitation. Exactly. Einstein, with relativity, discovered a limitation in the speed of light, right?
Newton didn't know that there was any such limitation.
Right.
Yeah.
So is it possible that all of these, I'll call them postulates because they're not, that they're making, right,
are just a means of hyping up what they're doing to keep the revenue stream coming to them?
Like, let's be honest.
If I tell you this thing is going to solve everything, right?
I'll give you my money.
You give me, right.
You give me some money.
Yeah.
I mean, it's kind of like.
Not just me.
The government will give you.
No, that's what I'm saying. Everybody is going to give you money.
It's kind of like...
It's 21st century snake oil.
Is that what you're telling me?
I was about to say something different.
Okay.
It's kind of the evangelical business model of a television evangelist.
Like, the whole idea is, hey, you got problems.
And these problems can be addressed.
They can be solved.
All you got to do is send me some money.
That's all you got to do.
And I'm sending you this little blessing cloth and, you know...
Chuck was a preacher in his earlier career.
Yeah, and if he isn't, he will be in the future.
I'll say, that's the one thing I made the wrong decision on.
Preaching is good money!
Everything we've discussed has been about being somewhere else.
Yeah. About not solving problems here.
Yep.
So are these people looking at Earth and going, you are completely screwed?
Yep. We're out of here, and we're the ones that can afford it,
and we're the ones with the tech to be able to achieve it.
Why are they walking away?
What's in their mind? What's their thinking about turning their back and moving?
Well, they think that... But they didn't grant you an interview.
Yeah, they didn't grant you. So you don't really know what's on their mind.
Oh, you can read enough of their stuff to be able to infer what's on their mind. Yeah, they've given
other people interviews who were nicer to them.
I can be a nice guy. You were charming enough?
Yeah, apparently not. I mean, I just sent an email. A couple of them were almost willing to do it.
And then they changed their minds probably because, you know, they read the email again.
And they're like, oh, he's going to disagree with us.
Why should we talk to him?
Right.
Whatever.
Some of them are being very cynical, like the way that you were talking about, right?
And saying, oh, you know, I can just do this.
And this is, you know, I can claim that all of these things are coming in the future.
And this is a way of, you know, generating more profit and getting people to give me more money.
Some of them, I think, genuinely believe it.
The idea that the future has to be elsewhere, I think some of it is just from this sense that they have that things are bad here on Earth, and that trying to solve problems here on Earth would be complicated and messy and difficult, and that somehow going to space would give them a fresh start. Which is not true. You can't escape, you know, politics. You can't escape...
We're still human.
Human nature, yeah, exactly. You can't escape human nature.
Exactly.
So how does an AI overlord plug into these scenarios?
Well, the idea is that they build a sort of AI god and it does whatever they want.
And this would be an AGI.
Yeah.
Yeah.
Artificial general intelligence.
Yeah.
So when we normally think of AI, we think of a task-driven AI.
It can drive a car.
Right.
It can make a perfect cup of coffee.
Right.
It can fly an airplane.
Sure.
But AGI transcends all of that.
Yeah.
It can just learn about anything and might even achieve consciousness.
Yeah.
Like Skynet.
Yeah, absolutely.
because it's, we are AGI.
That's what we are.
We are just, you know, the equivalent of what they want as AGI.
I'll be AGI at 0.10.
Yeah, we're 0.10.
No, 0.1.
0.1.
Yeah, okay.
So.
Yeah, no, that's right.
I mean.
No, but the point is, whatever we are as AGI, so how long did it take you to go to school, to open up all your textbooks and learn from them and take an exam?
AGI will do what?
Yeah, it's supposed to be able to do all that much faster.
20 minutes.
Yeah, in 20 minutes.
We'll get your college degree in 20 minutes.
Well, and the other thing is, like, those AI systems, all the things that you mention, right, make a perfect cup of coffee, fly an airplane, drive a car.
The AI systems we have right now can't do any of those things without human supervision, right?
Even those self-driving cars that are all over the streets of San Francisco, there's actually a human remotely supervising and intervening pretty frequently.
So they tell you.
You're talking about the Waymos?
Yeah, yeah, yeah, yeah, exactly.
And so...
Waymo is Google, if I remember correctly.
Yes, yeah, yeah, yeah, that's right.
And so AGI is supposed to be able to do all of these things, like, independently, right?
And then get smarter and smarter and the human is not even in the equation.
Human's not in the equation, and you can just make it go, you can have it do all the things that a human does,
but you can, like, overclock it, make it go faster, think faster than a human,
and then the idea is it gets smarter and smarter and achieves these, like,
superhuman, super intelligent powers.
The idea then is that for the billionaires controlling it,
it's like a genie.
And for the rest of us, it's an overlord.
It's an overlord.
But this is where they're so stupid,
and this is where all really rich people.
They're so stupid, they have hundreds of billions of dollars, and you don't.
No, but that's what makes them so stupid.
I agree.
I'm serious.
Yeah.
It's the fact that they have all this money,
and they've convinced themselves that they,
can transcend anything.
It's evidence of their own genius.
Right.
So it's, their hubris is their downfall.
It's like the first dictator, the very first dictator,
was a guy, a little guy by the name of Julius Caesar.
But Julius Caesar was the very first dictator.
You know how he became dictator?
They said, all right, how about you be dictator,
but you do it for a year?
If you create a godlike being,
whether it's artificial, general intelligence, or whatever.
Yeah.
And you think that you're going to control it.
It is not a God at that point.
Yeah.
You are the God.
And that's really what they're saying.
Yeah.
They're saying we're gods.
Yeah.
And the thing is, like, that's true.
If they somehow did achieve it, who would be controlling it?
But also, it's an incoherent idea.
Like, it's not, the good news for the rest of us is that it's not, like, something that's actually coming.
Because it doesn't make any sense.
That's funny.
It's like, these guys are hanging their hat on this, and you're just like, yeah, man, it's just a dumb idea.
Yeah, it is.
No, no, no.
Incoherent sounds way more like a beatdown.
Your idea is incoherent.
That's dunking on somebody right there.
Is this a misconception of the science?
The misconception of science fiction?
Yeah, I mean, I think a lot of the ideas, right?
This goes back to, like, why are they, like, trying to go somewhere else?
I think they just get these ideas from science fiction,
and they just take it way too literally, they don't read it well, right?
Like, my favorite science fiction, the science fiction I grew up with was Star Trek, right?
Yay!
The thing about Star Trek is, yeah, okay, they're on the Starship Enterprise, they're out there, you know, exploring strange new worlds, new life, new civilizations, all that jazz, right?
Finish it.
To boldly go where no one has gone before.
Thank you.
Not to go boldly.
Yeah, not to go boldly.
No, we can split the infinitive.
Split that infinity.
Yeah, hell yeah.
But the thing is, though, Star Trek was never really about space.
It's about, like, us here and now, right?
And it was always an allegory and not even a particularly veiled one, right?
I seem to recall an episode where Kirk and Spock were literally punching Nazis with swastikas, right?
And then there was also, right, and there was also the episode with, like, the two dudes and one of them, like, the left half of his face was white and the right half was black, and the other one, it was switched.
Frank Gorshin.
Yeah, Frank Gorshin, yeah.
He played the Riddler in Batman.
Yeah, yeah, yeah, yeah.
Exactly.
So. It's obvious why we persecute them.
They're black on their right side.
Yeah.
We're black on their left side.
Yeah, that was kind of, kind of blunt.
Yeah, exactly.
But Star Trek is always blunt.
Right.
And that's kind of part of the fun.
Yeah, right?
It is.
But these guys watch Star Trek, and they're like, oh, yeah,
warp drive's cool.
Let's do that.
And, like, you're not.
And miss the whole point of Star Trek in the process.
Yeah.
Right.
Because Star Trek is utopian ideals in
a galaxy that's descending towards dystopia,
and they're fighting, fighting it every step of the way.
Could it be that to become a tech bro in the first place,
you had to be really focused on a level to the exclusion of your social life
and possibly even your personal hygiene?
As a result, you achieve these places,
and part of your life's training did not include the emotions and feelings of others
or how people think about the world
or what their desires are
and you think that what you accomplish
is for their best interest
even though you have no idea
who they are.
Is that fair?
I think that's right.
Right?
Like the way I like to talk about it is
you know, for someone who claims to care
about humanity so much
Elon Musk doesn't really seem to care
very much about humans.
Right.
Yeah.
And isn't he the guy who said
empathy is a bad thing?
Right.
Yeah.
But he's also said,
he said, I'm going to save humanity.
Yeah, he did.
But he's also said, I'm going to save humanity by taking us to Mars.
And like, buddy, first of all, no.
And second, like, I don't think that you actually care that much about other humans.
And I think that what you said is exactly right, except you also have to add in, they think
that the fact that they succeeded in business, which a lot of that's just luck, is proof of their...
Right, in government contracts.
Exactly.
There's such subsidies, yeah.
Yeah.
For the car business.
as well as the rocket business.
Absolutely.
But like Musk and others, right, like Altman and Andreessen and these other people, they all think
that this is proof that they are like the smartest people who've ever lived, because
they're the richest people who've ever lived.
And that's just not how anything works.
Just remind me, Altman is OpenAI?
Yeah, Sam Altman's CEO of Open AI.
Just let's go down that list.
Yeah, absolutely.
And Marc Andreessen is the head of Andreessen Horowitz, the biggest tech venture capital firm.
Oh, so we need that to clear up the confusions. So OpenAI is what gives us ChatGPT. Got it, got it, got it. And then of course, we all know Elon. Is Branson a player?
Branson is less of a player in Silicon Valley.
What role in the tech sector does Bezos play?
I mean, you know, he's got his own rockets. Yeah, he's got his own rockets. He's also like
owns most of the infrastructure of the World Wide Web. This I think is something, yeah, AWS.
Yeah, AWS.
Which stands for what?
Amazon Web Services.
Basically, most of the cloud, most of the actual computers that compose the cloud
belong to Jeff Bezos.
So Amazon.com is like window dressing on a whole other operation that matters to him.
The real operation.
That's why he doesn't have to make a buck selling you a, he's going to sell your book for 80% off or something.
Well, after this I'll be lucky if he sells my book at all.
No, no, no, we're doing good here, yeah?
Let me remind.
Let's get the title of the book back in here.
Just, give us the title again?
Yeah, it's More Everything Forever: AI overlords, space empires, and Silicon Valley's crusade to control the fate of humanity.
There it is.
That's what we're talking about.
Yeah.
Man, okay.
So when we think about the science fiction,
and I think Neil's point about the isolation of these people growing up,
if we think about the science fiction
and you think about certain points with Star Trek,
then maybe Matrix or Blade Runner,
and you go through the laundry list,
how have they co-opted these and kind of bolted this to that and that to this
You think they were influenced by sci-fi?
Oh, yeah, totally.
I mean, Elon Musk tweeted that science fiction shouldn't remain fiction forever.
Okay, I don't mind that.
Yeah, I sort of understand what he means, but which science fiction, right?
Like, Blade Runner's a dystopia.
Right?
And then he comes out and says that the Cybertruck, that ugly piece of crap,
looks like something that Blade Runner would drive,
which Blade Runner is not the name of any character in Blade Runner.
But we can put that aside.
It's a profession in Blade Runner.
Yes, exactly.
That's great, man.
But there's a number of dystopians.
Oh, yeah.
But they're all dystopic.
That's not new.
They're all dystopic.
Some of them are.
I mean, right?
There's aspirational science fiction like Star Trek, right?
Right.
But even that is, like I said, they're the bright spot.
Yeah, the Federation is the bright spot.
The Federation is definitely the bright spot.
Yeah, no, no, no, that's true.
But, like, there's a tweet that lives rent-free in my head about a thing called the Torment Nexus. This is actually in my book, at the very beginning.
The Torment Nexus.
I'm afraid to ask what this is.
We're going there. We're going there. We're going there. Let's do it.
So the tweet goes like this. Science fiction author: in my book, I created the Torment Nexus as a cautionary tale. Tech billionaire: at long last, we've created the Torment Nexus from classic sci-fi novel Don't Create the Torment Nexus.
This is what these guys are doing.
Right.
It's Skynet.
Yeah, like, right, there's Skynet.
But also, like, if you go back and look at, like, the classic cyberpunk novels by somebody
like, say, William Gibson, right, who I think is a great novelist, a lot of those novels,
like Neuromancer are about, like, the concentration of wealth and power and the way that
the wealthy can and will use technology to remove themselves from the rest of us and accumulate
wealth and power while insulating themselves from the consequences.
And that's exactly what we see happening.
And so when they say that they want to make science fiction, you know, into reality,
we need to ask, okay, which ones?
Because if you want to make neuromancer reality, man, that's bad news for everyone who's not you.
So how much of science fiction has always been a silent alarm call, a silent warning?
Yeah, I mean, science...
Go back to Fritz Lang and Metropolis back in 1927.
Yeah, absolutely.
Yeah, Metropolis is very much a movie about the need to keep emotional intelligence apace with technology, right?
I didn't get that at it, but I believe you,
I'm just saying, that's a deep read.
To me, it was just a weird alien bot.
I mean, it's a weird movie.
Yeah, yeah, it's weird.
But, like, at the end, they say the heart
and the hand must work together
or something like that, right?
Yeah, yeah, yeah, yeah.
And so that's how I read that, at least.
Like, I think that science fiction,
a lot of it has always been, like,
about looking at the world as it is now
and saying, okay, if we push
that a little bit. If we want to take a look at this situation in a different context and
understand it in a different way by removing it from all of the sort of social and cultural
connotations that a particular thing has here and now, we put it somewhere else and maybe we can
see it more clearly, right? It's what Star Trek does. It's what like, oh, my favorite science
fiction author, Ursula Le Guin, right? She did this over and over again, looking at, you know,
poverty, inequality, capitalism, gender, you name it, right?
So Rod Serling back in 1959, he's interviewed about this new show called Twilight Zone.
And he said, look, there are stories I'm telling that you could not tell in just a dramatic way.
It has to be set at a time and a place that is not you and now.
Otherwise, I couldn't get away with these stories.
Yeah, right.
And only then do people say, wait, might that be me?
Yep.
But if it's blatant and in your face, you reject it.
And he said, in the end, we're just trying to sell soap.
Yeah.
He understood the situation: tell an entertaining story, but set it in another place.
Yeah.
I just looked up when Skynet achieved consciousness.
Okay?
It was 2:14 a.m. Eastern Time, August 29th, 1997.
Oh, wow.
Oh, wow.
Because the movie was 1984, or?
Yeah, the first one.
I think it was '84.
Yeah, so that was only 13 years in the future.
Yeah.
I hate when they do that.
Go far enough where...
Oh, I have a whole list.
Well, that was the whole thing with Star Trek.
It was set something like 200 years in the future.
Yeah, that was well safely in the future.
I got a whole list.
I'm saying, go safely into the future.
Do you know Soylent Green was 2022?
Oh, God.
Well, that's why I'm eating people now.
It's people.
Everybody doesn't realize that's what the pandemic was about, guys.
Because it happened.
That's right.
Enjoy that burger.
So what else
is in your laundry list here?
Basically, I think that what they
want, this vision that they have, is this
idea of going to space and living
forever. Right? And so a lot
of it is really about space colonization.
Going out and expanding
to take over the universe, that's like, because
they don't want to just stop with Mars.
They want to put Dyson spheres around
every single star in the
observable universe and, like, collect all of that energy. And that's... this is not going to happen, man.
That would be, what, Kardashev scale five, I think?
Yeah. Control all the energy output of all stars in the known universe.
Right. But doesn't the Borg have some similar energy?
Yeah, exactly. Yeah, yeah. The Borg want to assimilate everything. And what was it, the phrase? They want to, uh, take your cultural and political distinctiveness and make it part of our own.
Like, yeah, yeah, yeah.
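For scale, Sagan's continuous version of the Kardashev scale, K = (log10 P - 6) / 10 with P in watts, does put "every star in the observable universe" around type 4 to 5. A minimal sketch; the star count and per-star luminosity are loose order-of-magnitude assumptions for illustration:

```python
import math

# Sagan's interpolation of the Kardashev scale: K = (log10(P) - 6) / 10,
# where P is the power harnessed, in watts.
def kardashev(power_watts: float) -> float:
    return (math.log10(power_watts) - 6) / 10

SUN_WATTS = 3.8e26                       # roughly one solar luminosity
print(f"Type I, one planet (~1e16 W): K = {kardashev(1e16):.1f}")
print(f"Type II, one star:            K = {kardashev(SUN_WATTS):.1f}")
print(f"Type III, ~1e11 stars:        K = {kardashev(1e11 * SUN_WATTS):.1f}")
# Dyson-sphere every star in the observable universe (~1e23 stars, assumed,
# each taken as roughly Sun-like):
print(f"Every star:                   K = {kardashev(1e23 * SUN_WATTS):.1f}")
# -> about 4.4 on this scale, i.e. in the "scale five-ish" ballpark.
```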
Speaking as a scientist, I kind of like what science brings society. And shouldn't that be enough? Why does everyone go to science fiction? Is there some morbid fascination with science gone bad? And isn't that a problem with us, not with the storytellers themselves?
I mean, I don't think that science fiction is in and of itself the problem, right? Like, I'm a huge...
That's what I'm getting at.
Yeah, exactly. Yeah, no, I'm a huge sci-fi fan. I'm also a scientist, by training at least. The reason people find science fiction more compelling than science has a lot to do with the fact that it's not really about the future,
that it's sort of these interesting what-if scenarios that reflect on where we are right now.
If you tried to make a very realistic, you know, TV show about what life could actually
be like 100 years from now and made it as realistic as possible, people probably wouldn't watch
it because it would involve so much slang that doesn't make any sense to us right now
and like shifts in language.
Oh, MG!
Right, yeah, right.
WTF.
Yeah, but this is like, like,
and so many little things like that, right?
It's not meant to be a realistic depiction of the future.
So, I mean, part of me wants to say,
you know, the problem isn't science fiction,
the problem isn't science,
the problem is like critical reading comprehension skills.
Yeah, like teaching.
And money.
Yeah, exactly.
Yeah, so the accumulation of wealth to a very few
is always going to be a very bad thing
for any society.
But right now, unfortunately,
there's a global society of billionaires
that has popped up.
Yeah.
And they're...
When did you become Marxist?
What's that?
When did you become Marxist?
Believe me.
I'm pretty cool with capitalism.
I'm just all about guardrails.
And I also believe that $2 billion is all you get to have, okay?
Now that was pretty Marxist.
Yeah, I think that was in Das Kapital.
We're going to have, you know, $2 billion.
Yeah, you're going to have $2 billion.
Yeah.
$1, $1,000, no more.
So, no.
So basically you're saying,
not enough pure science education,
but a hell of a lot of money
is a bad...
That's a bad combination.
Yeah.
And I agree, completely.
So give us the takeaway thesis of your book.
Oh, I mean, I actually do end the book
saying that we should, you know,
limit the amount of money
that people should be able to have.
Okay.
Did Karl Marx write the foreword?
I don't think you meant that.
I don't think you meant that.
What you mean is we should limit how much power
the people who have money have.
Yeah, absolutely.
The problem with that is,
the more money you have,
money is always, and they call it soft power,
it's not.
It is straight, hard power.
Because you are able to influence
every corridor of power
that there is when you have enough money.
No, this is good, man.
And you should follow me around and just say the stuff that I'm going to say, but better.
Right.
There's plenty of rich people.
But what you need to do is this is where progressive tax is a good thing.
And we found that out under FDR back in the day where basically you got to a certain amount of money and they were like, yeah, we're going to take 90% of that.
Okay.
And we're going to take it and we're going to do stuff, because you wouldn't have been able to get that much money without all the things that we want to now support with the money that we helped you make.
But you get to keep, up until that point, all your money. But when you get to this level, you got to give us that money. Give me that money.
But yeah, no, I think that we as a society, like, because it's not just the billionaires,
it's also that we as a society buy into this idea that the ultra wealthy know what they're
talking about when it comes to something other than the ins and outs of having money.
And they could be complete dumbasses.
Yeah, exactly.
No, you know what they're good at.
They're good at rigging the game for them to make more money.
Yep.
That's what they're good at.
And everybody thinks they're going to be rich one day.
Yep.
Okay.
And so I did this, the calculation for a billionaire, what it takes, just to make a billion dollars.
Yep.
And I think I use $500 an hour, which is a very good amount of money.
$500 an hour, 24 hours a day, seven days a week.
And I think it came out to, like, you'd have to work 2,300,000 hours.
Or 4,000 years, or 2,300 years.
It's ridiculous.
It's a ridiculous amount of money.
That's what I'm saying.
Well, you want to know what makes it even more ridiculous.
Turn it around.
You have a billion dollars.
You want to get rid of it.
You spend $500 an hour, 24-7.
And it takes you many times longer than a human life.
What do you need more than a billion dollars for?
That's my point.
So you get $2 billion, and that's it.
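The figures spoken on air are loose, so here is a quick arithmetic check of the thought experiment under its own stated assumptions ($500 an hour, around the clock); note the recomputed totals differ from the on-air numbers:

```python
# How long does it take to earn (or spend) $1 billion at $500/hour, 24/7?
target_dollars = 1_000_000_000
rate_per_hour = 500

hours = target_dollars / rate_per_hour   # 2,000,000 hours
years_nonstop = hours / (24 * 365)       # no sleep, no days off
print(f"{hours:,.0f} hours ~= {years_nonstop:,.0f} years of nonstop work")
# -> 2,000,000 hours, roughly 230 years: many human lifetimes, whichever
#    direction the money flows.
```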
I did that calculation with Elon Musk's wealth.
Oh, did you?
Yeah.
You turn all of his money into $100 bills and lay them end to end, and you ask, how far does it go around the Earth? We can go several times around the Earth with $100 bills. Okay, and then there's some leftover money. Tape them together into a ribbon, and you'll have enough left over to go to the moon and back.
Yeah, see, that's ridiculous.
That's freaking ridiculous.
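A hedged check of that image. The net worth here is an assumption for illustration only, since the real figure moves around constantly; the bill length is the actual dimension of a US note:

```python
# Lay a hypothetical ~$400B fortune end to end in $100 bills.
net_worth_dollars = 400e9        # assumed, for illustration only
bill_length_m = 0.156            # a US banknote is ~15.6 cm long
earth_circumference_km = 40_075
moon_distance_km = 384_400       # average Earth-Moon distance

ribbon_km = (net_worth_dollars / 100) * bill_length_m / 1000
print(f"ribbon length: {ribbon_km:,.0f} km")
print(f"loops around Earth: {ribbon_km / earth_circumference_km:.0f}")
print(f"Earth-Moon distances: {ribbon_km / moon_distance_km:.1f}")
# -> ~624,000 km: roughly 15 to 16 equator loops, or past the Moon and
#    most of the way back. The broad claim holds to order of magnitude.
```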
So now we feel like we have to protect these people. This is what I don't understand.
The issue is the outsized power they have over laws, legislation, politicians, and the rest of us.
I don't mind rich people, provided they're not trying to control my life.
Exactly.
Okay.
No, I agree.
I'm with you.
We've got to land this plane.
Okay.
I think I figured out what's going on here.
A lot of smart people, a lot of wealthy people, a lot of people with influence,
trying to figure out what kind of future we will have, what kind of future we should have.
And we all know that future will pivot on advances in science.
and technology, as civilization has always pivoted on science and technology.
And so, but we're at a point now, and maybe we've been at this point before.
So is this really any different?
I don't know.
But it seems like we have the future in the palm of our hands.
And in the end it comes down to not how advanced the science is,
not how clever anybody is, not how, it's not related to any of that.
It has to do with how wise we are in the face of our own creations.
And wisdom, I think, is an undervalued factor in all the brilliance people are exhibiting
in their creations, in their discoveries, in their forces operating what the future of
civilization will be.
So if I may appeal to what it is to not only think about how great your inventions and discoveries are, but think about how you might harness it, as you harness a horse. An unharnessed horse runs wild; you don't know what it's going to do next. A harnessed horse is still a horse, but it gets to do exactly what you need it to do and what you want it to do.
And that is a dose of wisdom, coupled with our ingenuity.
I'd like to think there's more of that in our future.
Maybe we'll avoid the disasters that the science fiction writers always portray.
And that is a cosmic perspective.
Dude, thank you for being on StarTalk.
Thanks for having me.
This has been a lot of fun.
Good luck with the book.
Thank you.
Give me the title of the book again.
More Everything Forever: AI overlords, space empires, and Silicon Valley's crusade to control the fate of humanity.
You got it right.
He did it right.
Well, he did it right this time.
All right, back to Berkeley, you go.
Yes.
And keep us thinking about the future.
I will.
There's not enough of that going on.
Thank you.
Yeah, I'd be happy to come back any time.
All right.
This has been StarTalk Special Edition.
You put together another one with your peeps.
Lane Unsworth as well.
Take a large slice of credit.
There you go.
All right, Chuck.
Always a pleasure.
We're all good here.
Neil deGrasse Tyson for StarTalk Special Edition, bidding you to keep looking up.
