TRASHFUTURE - Lock, Stock, and One Infinite Wheel of Suffering feat. Adam Becker
Episode Date: October 14, 2025
Astrophysicist Adam Becker joins us in this pre-recorded episode to talk about the ongoing billionaire space race and the material limits of the ideology of infinite growth that underpins it. Also, we... send a pitch out into the universe to Guy Ritchie for a great new character. Check out MORE EVERYTHING FOREVER here! Get more TF episodes each week by subscribing to our Patreon here! TF Merch is still available here! (We can't ship to the US right now but we're working on it!) *MILO ALERT* Check out Milo's tour dates here: https://www.miloedwards.co.uk/liveshows Trashfuture are: Riley (@raaleh), Milo (@Milo_Edwards), Hussein (@HKesvani), Nate (@inthesedeserts), and November (@postoctobrist)
Transcript
Listen, Sunshine, if I had escaped a cycle of samsara inside of five minutes,
then I'm going to be reincarnated, not as a fucking lotus flower, but as your fucking worst nightmare.
Yeah, I don't like it as a mocha.
Oh, listen, mate.
You, Sunshine, must fucking imagine
Sisyphus happy. All right.
Well, before we were recording,
we accidentally invented geezer Buddhism.
And then Riley suggested that we introduce
Guy Ritchie to the concept of Buddhism.
And I had to explain that Guy Ritchie was married to Madonna
and therefore, you know, Buddhism's greatest deep cover agent
and therefore I would say is intimately familiar
with the cycle of suffering.
We can do it.
We can get Guy Ritchie, I think, my opinion.
I think Guy Ritchie could be made to, instead of making movies, you know, it is, not even
instead of.
We could say, Guy Ritchie, go ahead.
Promote your 100,000 pound barbecue company like you did in the whole framing narrative
of the movie, The Gentlemen, promote your, like, pub that you own in, like, Notting Hill or
whatever.
But, but, but, we need a better Buddhist action hero than Steven Seagal.
I, I don't know.
I think Steven Seagal might be a bodhisattva.
And I'm not, that's going to sound like a fat joke. It isn't. It's because he remains
seated a lot, which is key to relaxation, and because I genuinely believe he has no worldly
attachment. Steven Seagal has no idea how he ended up in Russia. It is something of absolute
meaninglessness to him. He has no import because he has no attachment, right? And so that's why we
have to get Steven Seagal in Lock, Stock, and One Infinite Wheel of Suffering. Or, better yet, take a
syringeful of Steven Seagal's blood, shoot Guy Ritchie up with it, and then bam, he's ready to make the kind of movie we want to see.
Oh, wait, hello, hello, hello, everybody.
Oh, wait, we're recording a podcast.
Oh, yeah, with a guest who's not been on before.
No, hello everybody.
Welcome to a recorded slightly in advance episode of TF.
We are, today: it's Nova, it's Riley, it's Hussein.
We are talking with Adam Becker, the author of More Everything Forever, which is basically a book kind of about the TESCREAL ideology.
It's one we've discussed numerous times on the show, but never by this name.
Adam is also a PhD astrophysicist and freelance journalist.
And Adam, welcome to the cycle of samsara.
Oh, thanks.
It's inescapable.
Yeah.
Don't like it.
There's no door.
No door at all.
Don't like it.
No exit, if you will.
You have to, like, free yourself from even the concept of a door, ideally.
We're going to talk a little bit about some of the concepts, Adam, that you bring up in your book,
More Everything Forever, which is about the, I would say, impossible dreams of the people who are currently in charge of
conservatively 80% of the economic growth that's currently happening in the United States
and broader Western world.
Yeah, sounds about right.
Hell yes.
I really hope that that growth isn't in any way, like, phantasmic.
Oh, do you want to know something really funny?
Something really fucking funny is, I have noticed, I've noticed this more and more every day.
Just look for this, like, everyone doing this, everyone I'm talking to, everyone
who's listening: train your mind to play a little chime whenever you see the words "since 2009."
Yeah, because we're hitting a lot of landmarks for economic symptoms, not seen since 2009, aren't we?
Yeah, consumer confidence, subprime auto loan defaults, credit card debt, retail hiring at a halt for the
holiday season. Taylor Swift album releases. Since 2009.
which is pretty cool.
So, yeah, you can just look out for that,
especially now that the Fed is now saying,
yeah, if the AI bubble pops,
there's just nothing holding this together at all.
Do not press the wings fall off button.
I think The xx are also coming back this year.
So that also is something that hasn't been seen since 2009.
Yeah, no, I mean, that bubble seems like it's going to pop at some point.
It's just a question of when, you know?
The most important thing to remember is,
shore this up now to make sure you can keep doing it
through potentially, like, infinity years of economic collapse.
Keep your podcast subscriptions going.
I cannot stress enough how vitally important it is
that that cannot be an expense that you feel like
you have to cut when the economy collapses.
You make that central to your economic planning
for your household.
Look, you need a hearth to gather around
and that's what this is, digital.
I saw a stat
today that if you strip out building data centers,
the US GDP grew by 0.1% this year.
With the UK, it's even worse.
And 80% of the value growth
in the S&P 500 is down to AI-related stocks,
so without this, there's nothing.
The entire US economy, and to an extent the
British economy, not so much continental Europe,
is one big leveraged bet
on building digital God pretty soon. Yep. And it's definitely not happening. Oh, God. And it also requires,
like, so much money to kind of keep it going, right? Like, as Ed Zitron kind of keeps pointing out.
The other, I mean, the other source of, like, economic growth I can imagine, um, and this would be, like,
an interesting backup plan, are, like, continual Taylor Swift, uh, tours. Yeah, if 80 percent of the economy
is Taylor Swift. Yeah, no, if we replace, if we replace the AI with Taylor Swift, that's definitely a more
sound basis for this economy.
I mean, because the AI, it's just
a bullshit machine. Yeah, Taylor Swift
arguably exists.
If Taylor Swift did not exist,
it would be necessary to invent her.
Or my other
favorite essay by Baudrillard, The Eras
Tour Did Not Take Place.
My favorite part of the Eras Tour is when
you could, like, see the missile go down
the air vent and kill Charli XCX.
So this is
This is just something to look out for, which is that it seems like, like Deutsche Bank said,
the AI bubble is the only thing keeping the U.S. economy together.
Bloomberg said without data centers, GDP growth is 0.1% in the first half of 2025.
And like, without this, there's nothing.
And even all the guys in charge have said, like, oh, yeah, this is all,
none of this investment is worth anything if we don't build God.
And there was, again, I think a lot of people have seen this chart recently showing the
circular sort of vendor financing
between OpenAI, Microsoft,
AMD, NVIDIA and stuff.
And it's like this is trillions of dollars
of value being exchanged,
but none of it connected to any labor
and none of it connected to any kind of product
consumed by anyone.
There's no economic output.
It's just self-referential.
Right, but to them,
they see that as a good thing, right?
Yeah.
Like, the venture capital ecosystem
in Silicon Valley runs on hype, right?
it needs hype like oxygen to breathe.
And so the idea that you can create a sort of human-centipede-like or human-ouroboros-like
infinite money printing machine where it just, like, eats its own excrement.
And as long as, you know, nobody points out that nothing of value is being created,
then you can just keep the party going.
I think they see that as a feature, not a bug.
And the fact that it's also leading to, you know, the erosion of labor rights and lots of people
being laid off from jobs. Those are definitely things that they actually want to happen, right?
They want to use this to break the power of labor. And we're actually going to get into a little bit of that
with some choice morsels that I've clipped from the last couple of months of the tech gods that we've
spoken about a little less recently. It's been very Altman. It's been very Jensen on this show.
We haven't really talked Musk recently because he's been a bit outré, but we are going to be doing that.
But first, I have a little mini startup for us to talk about and then we're going to get into it.
This isn't so much a startup as it's an application, as an app.
Okay, it's not a question so much as a comment.
Yeah, it is the Descartes XR.
The Descartes XR.
You can transform your reality.
Sounds like a vape.
You mean the Descartes XR?
I think, therefore I vape.
I got the special edition. Cartesian dualism is where you have one vape in each hand.
I think if we had a brand of like, this is the real TF merch thing,
if we wanted to get really unethical with it,
is a, like, philosophical school affiliated vapes.
So obviously, like, we have the Marxist vape comes first,
but then, you know, we've got to talk about those sort of,
like, other Hegelian vapes more generally.
But you can only use the Marxist vape to smoke the opiate of the masses.
Yeah, of course.
Yeah, like, how about this?
Check this out.
You've got an, um, an Alexandre Kojève vape,
and it's just thrown at you.
Uh, or you have, um, you've got, uh, you've got a Hegelian vape,
and it's just someone coming up.
to you and saying, vape.
Why, yeah, why doesn't the Baudrillard vape emit any vape cloud?
Yeah, well, quite.
You have the platonic vape.
It's a shadow.
It only casts shadows.
The Neo-Platonic vape is just a, uh, an NFT.
Yeah.
You, of course, have the Judeo-Christian vape, which just has written on it,
something completely unprintable and unrepeatable.
It's the vape that makes you feel guilty for vaping it.
Yeah.
So we're gonna, number one, that's just going to be something that hopefully we make when
we become a vape shop, like every other business.
in Britain. Oh yeah, that's going to be, that's going to be so ethical for us to do. I'll
look into getting some custom vapes, mate. No, we're going to market them to students,
but university students, philosophy students. It's just, of all the lawsuits I expected to
take down Trashfuture, I wasn't expecting it to be, like, a popcorn lung class action.
Because we're using just, like, the worst industrial runoff ever. We could invest all the money
in the theme.
Also, as a marketing scheme, targeting
philosophy graduates.
Yeah, that's where the money is. That's what I would say.
That's it. Yeah, exactly.
Yes. So to the Descartes
XR, transform reality with
AI, live inside any style.
I'm sorry.
Are you reading something, like, are you reading
real copy that exists somewhere? Yep, I sure
am. It's off of the MetaQuest
App Store. Live inside any
style makes me feel like,
it's some kind of like AR glasses thing where
Like, if I want to redecorate my apartment, I just get AI to do it.
Nothing changes.
I live in the kind of, like, rising-damp apartment, but, like, I believe in my own head that it's, like, very kind of beautifully decorated.
So I will give you an example.
They say, turn your world into a living canvas with Mirage LSD for the MetaQuest 3.
Point your headset's camera at any scene and watch as advanced AI transforms it in real time into stunning visual styles:
cyberpunk cities, anime worlds, medieval castles or anything else you can imagine.
Don't, don't, don't call it Mirage LSD, you fucking try-hards.
Do real drugs.
Simply, speak your desired transformation such as make this look like a Studio Ghibli movie
or turn this into a neon-lit cyberpunk street.
Unfortunately, the fact that she looks like a Studio Ghibli character does not make me feel any better
about the fact that my girlfriend is leaving me.
Our cutting-edge video-to-video AI processes your live camera feed,
creating an immersive experience where you literally live inside an AI-generated piece of art.
They invented it.
The situations, the real-life situations that this is going to get used on are so grim.
I'm thinking about the cyberpunk custody hearing, which is not going in your favor because of the
like robocop judge who apparently believes in like feminism or something.
Or alternately, it can just be, you can instantly gentrify a neighborhood by being like,
please turn these homeless people into mailboxes.
Yeah, for real.
Why not?
Why is this mailbox yelling at me?
Yeah, I also figured, you know what this is?
This is the ultimate answer to the long tale of the British right wing project
because you can live in a drawing.
You can literally live in a drawing.
Yeah, that's true.
That is true.
Yeah, just ask it to show you a world where there's no such thing as society.
Yeah.
Hey, that, like, person who's not white, can you just, can you flip that guy into being a vicar
on a bike?
You know?
This might be better, actually.
Can we, can we, I've flipped my opinion completely on this.
Can we make this a kind of court-ordered thing where if you're right-wing enough,
you can just, you get put in the hug box.
We put you in the Matrix, right?
And you just get to play with your WoJack dolls.
You get to kind of, like, inflict all of that on your own consciousness.
And you're just kind of walking around bumping into lampposts and stuff
and not bothering real people who still have sentience.
Yeah, you're bumping into lamp posts that, like,
look like they're being lit by gas rather than woke electricity.
Oh, sorry, vicar.
It's just
10s of thousands of...
It's gone wrong.
It just turns everything into vicars.
Like, you know how
early, early AI image stuff
used to turn everything into
like weird bubbly eyes?
Just that, but vicars.
Basically, yeah, that's the idea.
What if you could just
further retreat from reality?
What if I told you
that Vicar world
wasn't real?
It's the world
that's been pulled over your eyes
to blind you from the truth.
We got to start using right wingers as like human batteries.
Like, it's the only way we're going to do any carbon capture.
I'm sorry.
Just looking at the tablet as you're kind of working in the carbon capture facility.
It's just like, this is fucking weird.
This one's just, it's all vicars in here.
I don't know what this goes up to.
You've got the goggles on.
You're living in the Britain of the 1960s or 50s or whatever era you're nostalgic for.
And you believe you're the vicar on the bike, but actually you're just pedaling a bike that's powering a carbon
capture facility.
You're like, well, another day of riding around, you know, fucking, like, the Cotswolds.
Okay.
All right.
All right.
So that's a little one because it's just a little app, but my God, is it fucking hilarious.
Nice little moose.
Let's reconfigure our entire society around this place.
I'm still hung up on Mirage LSD.
Can we just quickly distribute that to every Reform UK voter?
You know?
And then, hey, everyone's happy.
If you distribute that to every Reform UK voter, you can then tell them that the election is some time other than when it actually is.
That's true.
How about this?
There are no elections.
You got your wish.
Yeah.
There are not.
You're guys in charge.
They look at the TV and it's just, you know, Nigel Farage announces new type of gay it's illegal to be.
And they'll be like, oh, yes.
Wonderful.
Yeah, very good.
Handsexual today.
Weird.
I wasn't expecting that one.
You know, Nigel Farage goes on TV, according to you, and announces an official policy of bi erasure.
No longer.
The UK no longer recognizes
bi people.
And then all those Reform people are
like, very good.
Apparently doesn't out
And yeah
That's this is where we're going
Anyway, anyway
Look, look
I want to move on
I want to talk about the
We've bothered a very educated
Multi-published author
To be here with us today
And we're bugging him about nonsense
We did Buddhist Steven Seagal
Brain, the fucking goggles
Let's talk about TESCREAL
Sure
Because this is an ideology that we've talked about on the show quite a bit, but not really by that name.
So just for anyone listening, you might not know: what is TESCREAL and how is it the subject of your book, More Everything Forever?
So, TESCREAL is an acronym for a set of bizarre, formerly fringe ideologies that I'm not actually going to bother, like, going through what every single letter in the acronym stands for,
because while I think
TESCREAL is, like, a thing that it is good to, like, identify,
I don't actually love the name because, you know,
it's just this acronym that's, like, filled with jargon,
like transhumanism, extropianism, singularitarianism, all that stuff.
Basically, the idea here is these are people who want to go to space and live forever.
And think that that is a real viable thing that can actually be accomplished.
and would be desirable.
And they really believe that technology will save them.
And I don't just mean like save in the sense of like fix problems.
I mean in some cases, literally save like a computer file.
Like technology will enable them to like upload their minds and save themselves into the cloud
where they will live forever with an AI god and, you know, colonize the universe.
Hmm, the kind of, your kind of Nick Bostrom stuff of, like: death real bad, death is bad, don't want to do it, what is there as an app?
Yeah, exactly. Nick Bostrom is very, very big in this community. He's very influential. And the thing is like these ideas, the idea that like an AI god is coming and is going to allow us all to live forever in space used to be sort of a fringe philosophy. But now it is very, very, very influential in the tech industry.
and the leaders of the tech industry have become some of the most powerful and wealthy people in the world who are propping up the entire economy on delusions that an AI god is coming and that they are saving the world by, you know, building a machine god and then using it to go to space.
Both of those are very bad ideas and neither one is happening and it's all going to come crashing down at some point.
I assume you're just saying this as a cultural critic, right, and you don't have any expertise in, like, space travel or anything like that, right?
You don't, for example, you don't, in your book even, I believe, say this, you don't look at the kind of energy that will be required in order to do this and then map that against the Kardashev scale.
Right.
For example.
Yeah.
Let's just say.
No, I mean, look, I think that one of the problems, one of the reasons why this, you know, ideology of technological salvation has sort of hung out.
for as long as it has on the fringes and has been able to, you know, gain more followers
is that the usual response to it over the years from, you know, people who don't buy in
has been, you know, just sort of bemused dismissal, right? Because it just sounds stupid on the face
of it. I think it is, you know, pretty awful and it is kind of stupid on the face of it. But at this
point, with so many powerful and influential people buying into these ideas, we can't just dismiss them
out of hand. We have to say, no, no, no, here's why this doesn't work. So, yeah, you know, when it was
just, you know, a bunch of science fiction nerds, and I say that as a science fiction nerd,
but when it was just a bunch of science fiction nerds who were saying, like, oh, we need to go
colonize Mars, then, you know, maybe it's not that big of a problem that they want to do
something that you just can't do. Colonizing Mars is not going to happen. But when you have someone
like Elon Musk, sort of taking that cultural narrative of Mars as the future.
and using it to portray himself as the savior of humanity by colonizing Mars, that's a serious
problem. And it's not happening. We're not going to Mars. We're not going to have a million
people living there by 2050, the way that Musk says that we have to and are going to. It is not
happening because Mars is an absolutely terrible place. Now, I thought that all it would take is some
domes. I was reliably informed that domes were the answer, a sufficient number of domes, and then
oxygen and an artificially intelligent computer god would square those circles and kind of
all the stuff that seems impossible, it will figure it out somehow.
Yeah, no.
Oh, damn.
Yeah.
I was really banking on the Mars thing.
Shit, okay.
All right.
I need to make some costs.
Yeah.
No, look, one of the things that these guys believe in, and they are basically all guys,
they believe that the AI is going to solve every problem.
That, you know, the fact that Mars is an absolutely horrible
place, that it is worse than Earth on, you know, the worst day in Earth's history, right? You know,
you could have a giant asteroid hit Earth. You could set off every nuclear weapon. You could do both
of those at the same time, and Earth would still be a nicer place to live than Mars. But, you know,
that fact or, you know, other things like, you know, global warming, there is this persistent belief
that if you just build a big enough and smart enough AI, it will just solve these problems. And
And there's, you know, this is just not how the world works, right?
They think, oh, it'll discover new laws of physics and that will, you know, remove the
limitations that we currently have.
That is not how physics works.
That is not how the history of science works.
The history of science has often been about discovering new limitations that were previously
unsuspected.
There is, you know, no guarantee.
But I don't like that.
That sounds hard.
Yeah.
Like, I got into science.
We choose to go to Mars and to do the other things, not because
they are easy, but because we thought
they would be easy.
I think you're doing that in the wrong accent, right?
You know, what has we wanted to choose?
Yeah, I mean, have you considered like a theory,
I've been working on this theory actually while you've been talking.
I'm just going to read it out now.
By physics, right?
Men go to Mars to get more stars and girls go to Jupiter to get more stupider.
That's true.
Yeah, I'm going to do that.
I am going to do that.
That's affirming to me.
Yeah, exactly.
I think, I think, yeah, I have been.
The sort of like AI will solve the problems.
It reminds me a lot of this like New York Times article that came out like a few months ago.
And it was, like, one of those articles where, like, they interviewed people who had basically, like, lost their minds, kind of, or they became very convinced that ChatGPT was their friend, and, like, they got alienated from their families.
And like quite often these are just like very regular people.
There's a guy who's just like, you know, he's like a single father of two and he wanted to help his kid out with like their math homework.
And, like, ChatGPT basically convinced him that, like, he had invented a new
kind of maths, and he was, like, so insistent that he had invented this new way of doing, like, a square
root that he fell out with his kids, he fell out with his ex-wife, he fell out with the school
boards, um, he fell out with, like, everyone. Like, he fell out with, like, colleagues of his who were just, like,
blue collar workers, who for them were just, like, can you please just, like, stop talking about the math
you've invented and just, like, do your job? Yeah. Okay, but then he made The Best Man, he made Hustle &
Flow, Iron Man, other Terrence Howard movies.
Hey, hey, hey, look: the Terrence Howard Defense League has arrived.
Terrence Howard invented his version of math without the AI.
And he's, like, very, like, he has actually sort of, like, said that very vociferously, like,
recently, where he's been, like, I invented this new kind of math, and I didn't need the
AI to help me.
And he's been trying to, like, get the AI to prove Terryology, but it's very different to
what happened to the guy who was quoted in the New York Times
article. The point being, number one, this guy clearly is going to sort of like solve the
Mars problem. But I think on a more serious note, like it is very kind of, I don't know. I mean,
like we sort of go through this every time we do an episode these days, but it is just like,
oh, okay, you know, you do, like the sort of AI will do it has kind of made it very easy to just
like not have to do any of the work, right? And still get paid for it. And so what they're doing
is they're basically taking the billionaire experience and exporting it to the masses, right?
because, you know, who else says that they've created a completely new way of understanding the world
and it makes no goddamn sense and they have no expertise, but, you know, everyone around them just
constantly says yes. I mean, that's ultra wealthy people, right? And so the AI will just give you
sort of the sycophant experience of being an ultra wealthy person without the wealth or power.
And so, you know, without the wealth and power, if you just actually listen to the sycophant
constantly whispering in your ear, you'll go crazy or lose everything.
that matters to you in your life because you don't have billions of dollars that allows you to
like, you know, warp the people around you into indulging your bullshit.
I knew I was going wrong somewhere and for a long time I'd suspected that it was not having
billions of dollars, but to have it confirmed for me, to have it kind of like professionally
diagnosed as very reassuring.
Well, so basically, Tesla, some weeks ago, released its fourth master plan, sandwiched between
immigration panic, transphobia, and the usual stuff.
And it is one of the most written by AI things ever.
Elon Musk has, like, cooked his, he has, he has pan-fried his brain and bladder with ketamine.
Yeah.
But I think it's like, it's a good, it's another good lens into looking at this.
And the first thing I'm going to talk, it's not from the plan, but it's related.
He was speaking recently about SpaceX.
Like, stage one, do a bunch more ketamine.
Stage two, question mark.
Stage three, Mars.
Yeah.
No, it's stage one, do a bunch more ketamine.
And stage two, question mark.
Stage three, quick floor nap.
Quick floor nap, really loosen up all those bladder muscles, you know.
I like to think of it as the billionaire's indulgence.
Look, you can piss in the spacesuit, okay?
You can have a loose bladder.
It's fine.
My God, that's why he wants to go to space is he can just be pissing himself all the time there.
Yeah, yeah, no, I don't have to wear this catheter in space.
If I'm in the space suit, I can do as much ketamine as I want.
Okay, so the thing that destroys the podcast is,
now 50-50 between like
vape-cut cadmium popcorn
lung lawsuit and Elon Musk
ketamine piss boy lawsuit.
He says, he says, regarding
going to Mars, I think it can be done in
30 years provided there's an
exponential, oh sorry, sorry, excuse me, Adam.
It's so good when you have
a specific claim and then you
have an expert
who is able to react in the moment
with barked laughter.
So I think it can be done in 30 years,
So here's the thing, Adam, you barked that laughter before he gave his proviso.
Oh, yeah, no, no, let's do it. Hit me. Hit me. I think I know what he's going to say. I just, I'm so excited.
Provided, there's an exponential increase in the tonnage to Mars with each successive Mars transfer window.
Yeah. And he said this.
Send more stuff every time. What are you stupid?
Yeah, the problem is the stuff, idiot.
Have you heard of this thing called the benevolence of the rocket equation, that says
that it's actually easy to send more stuff into space?
Yeah, the more stuff you send,
the easier it is because there's less stuff on Earth
and Earth's gravity is less strong
because there's less stuff on it, you fucking moron.
God ask that.
Jesus fucking Christ.
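[Editor's note: the sarcasm above inverts what is usually called the tyranny of the rocket equation. Delta-v grows only with the logarithm of the mass ratio, so pushing more tonnage to Mars gets exponentially more expensive in propellant, not easier. A minimal sketch, using rough illustrative figures for delta-v and exhaust velocity, not any actual mission's numbers:]

```python
import math

def mass_ratio(delta_v: float, exhaust_velocity: float) -> float:
    """Tsiolkovsky rocket equation, rearranged: the initial/final mass
    ratio required to achieve a given delta-v (both in m/s)."""
    return math.exp(delta_v / exhaust_velocity)

# Rough illustrative assumptions: ~9,400 m/s of delta-v to reach low
# Earth orbit, ~3,600 m/s effective exhaust velocity (methalox-class).
r = mass_ratio(9_400, 3_600)
print(f"liftoff mass per unit delivered to orbit: about {r:.1f}x")
```

[Every tonne delivered costs roughly an order of magnitude more mass on the pad, so an "exponential increase in tonnage" per window means exponentially more launches and propellant too.]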
The punchline is where he made these remarks.
He made these remarks at an event called the All In Summit
organized by the All In podcast.
Of course.
A fucking course.
He made this at, basically, a Chamath SPAC trade show
is like, yeah, we can get to Mars.
We just need to invent a way
to send everything there immediately.
Step one.
Don't siege Mars.
Take it immediately.
And to be truly self-sufficient, Musk said,
it will need to have all the ingredients of civilization,
such as those that will enable settlers to grow food,
generate fuel from the atmosphere,
build microchips, computers, rockets,
and many other things,
which again, if you can just move infinite stuff,
sounds pretty easy, right?
I just, he'd be lucky to get a person
on Mars and back safely by 2050. One person.
One person, I assume, and the ingredients of an entire civilization.
Right.
Yeah, I mean, but...
Some assembly required.
Okay, maybe, maybe let's say two or three people, right?
A very small mission.
Get it there, land, come back by 2050.
I will be a bit surprised if it can be done in that time window.
The technical challenges that remain just to do that are massive.
There is zero chance.
It is not happening that you get a million people on Mars by 2050.
It is just not in the realm of possibility.
Mars is too awful, given the amount of technical challenges that you would have to solve just
to send anyone there and back.
Just to have someone in space outside
of Earth's orbit for that long and then bring them back safely, that far away from any possibility
of help, any possibility of real-time communication with Earth, just getting them there and back
without even landing, like doing a weird Mars flyby. That's still something like, you know, the
shortest possible Mars mission is something like a year because of the launch window issues
because, you know, the Earth and Mars don't orbit each other. They both
orbit the sun and they're in separate orbits.
And so there's only certain times that you can launch to go to Mars, right?
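[Editor's note: the launch-window point falls straight out of the two orbital periods. The time between successive Earth-Mars alignments, the synodic period, is the reciprocal of the difference of the reciprocals of the two orbital periods. A quick back-of-the-envelope check using standard textbook values:]

```python
T_EARTH = 365.25  # Earth's orbital period, in days
T_MARS = 686.98   # Mars's orbital period, in days

# Synodic period: how often Earth "laps" Mars in its orbit, i.e. how
# often a favorable Mars launch window comes around.
synodic = 1 / (1 / T_EARTH - 1 / T_MARS)
print(f"{synodic:.0f} days, roughly {synodic / 30.44:.0f} months")
```

[A window opens roughly every 26 months, which is why even the shortest round trips stretch toward a year and beyond.]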
Okay, just go through the sun.
Yeah.
It's made of gas.
I mean, okay, so you're joking when you say that,
but there is a possible Mars return path that gets, you know,
goes through like the inner solar system,
goes closer to the sun than the earth is.
Like, I think it goes within the orbit of Venus or something like that.
It's really terrifying,
just because of the radiation exposure.
And that's good?
Yeah.
I assume that's good.
That's good for you.
Yeah.
This is just about radiation exposure and like life support systems in like, you know,
a mission to Mars through space, like much less living on Mars, landing on Mars,
getting back off the surface, actually living there.
No, no, there's no way.
And also a million people, even if you could do it, is not enough to be self-sufficient.
I mean, listen, you make all of these points, right?
However, I have, and I've just checked,
there's 1,053.2 hours in Kerbal space program.
Let me just say that I've seen season one for all mankind.
Let me just say this.
Let me try it.
Give me the controller.
Let me try it.
Look, I think that it should be illegal to be a billionaire,
but if Musk wants to use his money to send himself to Mars,
and I realize this is not a particularly
original joke, but, like, I am fine with him going to Mars. I think that that would go very well.
He even said once, you know, back when he deigned to allow people to, like, write lines for him,
as opposed to thinking that he, you know, had the heart of a poster and was a comedic genius,
he once said, you know, it would be cool to die on Mars, just not an impact. And my feeling about
it is, you know, why be so picky, man? Yeah, beggars can't be choosers. You want to be specific and make it a
runway, huh? I mean, yeah, I think thinking about Elon Musk going to Mars, thinking about the things
he would have them load in for that journey, just being like, why is the amount of ketamine that we're
loading in measured in units this large? But also, I think it'll be a fascinating scientific
experiment, right? Because I don't just want to keep coming back to the ketamine joke, because his
body's fucked up in other ways, right? And we know that, like, in space, astronauts, very, very healthy
people already, like, by design, have to be, you know, kind of like exercising constantly, and even
then it takes a huge toll on their bodies. I don't know what space does to the kind of, like, flabbiest,
weirdest torso-shaped wind tunnel-looking motherfucker I've ever seen in my life. I don't know.
And I kind of want to find out. I guarantee five minutes outside the Kármán
line, that motherfucker is looking like Mr.
House. What if it
makes him hot? You never thought about that?
Interesting question. What if that's kind of like
a Zimarsian situation going on?
I think if you tell him
it would make him hot, he would go,
right? You know, I mean,
like, not to give credit to any billionaire
for anything ever, but
you know, Jeff Bezos at least went up
on one of his rockets. And he got hot.
Yeah, surprisingly
fuckable now.
But Elon Musk doesn't have the guts.
He doesn't have the balls to do it.
Yeah.
And, you know, maybe that's a smart call because his rockets keep exploding.
But, you know.
This goes back to, I think, a thesis that Nova developed a long time ago,
which is one that I always come back to, which is of all of the mega billionaires,
Jeff Bezos is the only one that seems to be even vaguely enjoying it.
Yes.
I mean, he's enjoying it in the kind of way of like, like the, here's the mind of a child, right?
And that's terrifying in the same way that being able to wish someone into the cornfield
is terrifying, but there is enough
kind of like spark of kind of
human vitality in that he's like
yeah, I want a wife with like big tits
and a lot of plastic surgery.
Yeah, sure. You know what? He wants
something. Elon Musk doesn't want anything.
I mean, he thinks that he
wants the Fourth Reich, but that wouldn't satisfy
him either. Hold on a second. There is something
that Elon Musk wants desperately.
Elon Musk wants
our approval. He wants us to like him.
The problem is that he picked something
that you can't buy with money.
Jeff Bezos is enjoying himself
because the things that he wants
are things that you can buy with money
that can be exchanged for goods and services.
Having people like you is not a good or a service.
Just bringing out my big kind of like solar system diagram
and showing you the relative distance
and possibility of intercept windows
for woman with big tits and Mars
and showing you the kind of like order of magnitude
difference between those things.
If he went to Mars, he wouldn't be happy.
Firing myself
into a transfer window to
the nearest woman and just
hoping for the best.
As opposed to just buy Venice
for a day, get married
in it, wear a cowboy hat to space,
start a movie studio as a
loss leader for your sex life.
He's at least enjoying it.
Oh no, I'm not making that up.
That was a quoted Amazon executive.
Who said that's what Amazon Studios was.
Oh, God.
Yeah.
Anyway, yeah, so this is a tangent.
Yeah, but I will say Jeff Bezos, going back to your question about, you know, energy,
Jeff Bezos has, like, insane ideas about human energy usage that are just,
he thinks that it's important that we go out into space because otherwise we're going to have to limit our energy usage here on Earth.
Because he says, you know, the amount of energy that we're using has been going up.
exponentially for like a couple hundred years and he says if that trend continues then in another like
300 years we'll be using all of the energy available here on earth and we'll have to go out into space
uh adam yeah quick question yeah doesn't necessarily follow that that trend would continue
absent the need to build a digital god yeah that's the thing i mean you know there's no good argument
that that trend would should or could continue it's almost as though industrialization was a
historically unique process.
Yeah. Maybe. Maybe. Maybe. Sure.
You know, but that sounds crazy, right? And also, you're probably a communist if you're talking
like that. And you need to know history to know that. You can't just be thrown into the world,
look around and be like, okay, I guess everything, I guess it's an eternal present.
I'm just going to take whatever the two points at the beginning of the day and the end of the day
are and extrapolate them into infinite vastness of time in both directions.
Oh, but history is something that you study over in, you know, like arts and sciences and
humanities, right? It's not, it's not engineering, and that means it's basically underwater basket
weaving. These people study engineering until they drop out and start their companies, right?
Like, they don't want to think about history. They don't think that history matters, right? Anthony
Levandowski, the guy who started Waymo, said that, you know, history doesn't matter in technology.
All that matters is tomorrow. And as for Bezos, either he is ignorant of history or he, you know,
doesn't care because, I mean, this is a guy who repeatedly quotes and idolizes
Wernher von Braun, a Nazi.
Yeah, I learned that from watching season one of For All Mankind.
I mean, look, season one of For All Mankind is actually pretty good.
Yeah, it is.
Riley, you'll enjoy this, the guy playing him, that is the Bon Cop, Bad Cop bon cop.
No.
Colm Feore?
Canada's favorite son?
They made him Canadian.
I never, I never knew.
this, but it turns out Wernher von Braun was Canadian
the whole time. Yeah, well
hey, you know what? Like I say, Canadians
we get everywhere, right? All your favorite
Hollywood actors. Much like former
members of the Waffen-SS, yeah.
Told you, we get everywhere.
Look, look. So I want to
actually talk about the master plan. Okay.
The new master plan. Yeah.
So it's, um,
so this master plan also contains one of my
favorite lines of writing in anything ever
or AI-generated slop. Um,
which is, quote, the hallmark of meritocracy
is creating opportunities
that enable each person to use their skills
to accomplish whatever they imagine.
That's the hallmark of meritocracy right there.
Okay, so first of all, this is from the Tesla master plan.
Yeah, yeah.
Because the first one made kind of,
you might disagree with it, but it made sense,
which is create a high-end car,
use that to popularize electric cars,
then create a consumer model.
Yeah, it was a business plan in that sense,
whereas this is not exactly.
Also, they didn't
follow that plan. They created high-end cars and never created the, but anyway. Any moment now,
they're going to create that $20,000 electric car. Oh, yeah. Sure. Sure they are. I suppose, yeah,
they're the only ones that could do it. And I'm just like, BYD is just like out of my field
of vision. Oh, God. The Cybertruck is just such an ugly piece of shit. Anyway,
eventually they were going to have to do the Model T. But Elon got distracted by making the model names spell S3XY.
There's a three in there.
Yes.
And if they did it now, it would be a sexy T, which he's opposed to those, I guess.
Nice.
But here's the thing.
Like, again, that sexy thing, that's Musk desperately wanting people to like him.
That's what he thinks is funny.
Or rather what he thinks everyone else will think is funny.
He desperately wants to be seen as witty and funny and clever, right?
This is why he carried a fucking
sink into Twitter headquarters so he could say let that sink in.
Uh-huh.
Right?
Which is not a good joke.
It seems pretty structurally sound to me.
Yeah.
And structural soundness is what we look for in our humor.
But speaking of structurally unsound things, that sentence about meritocracy.
Are you kidding me?
Yeah.
It's an example of something that is the opposite of structurally sound.
Yeah.
It makes me feel like I'm having
a stroke. So I want to read
a little bit of this. Since Tesla's
founding, each iteration of our master plan
has focused on our North Star: to deliver
unconstrained sustainability without
compromise. Humans are tool makers.
At Tesla, we make physical products
at scale and at a low cost with the goal of
making life better for everyone.
Has there been something amusing? Has there been another well-structured
joke that I've missed? I'm sorry.
Is there a joke in here that we're not
familiar with? Yeah.
Unconstrained sustainability, was
that the phrase?
You know how sustainability has to not have constraints.
Because it's so often constrained, you know?
Yeah.
Sustainability is all about being unconstrained.
Yeah, what the fuck does that mean?
Is this word sound?
This is what I meant when I said it makes me feel like I'm having a stroke, right?
Like, it feels like suddenly I have aphasia.
And I cannot understand what words mean anymore.
As the influence and impact of AI technology increases, no, it's not. Every single use of
AI technology that is not either for writing a master plan or for having fun as a toy is just, 95% of
AI pilots in workplaces fail. Right. So I don't understand where that's coming from.
Well, and also the people, like, there was that fascinating study that showed that, who was it,
they gave a bunch of AI tools to open source software developers and asked them to estimate how much
faster it made them. And they all said something like, you know, between 20% and, you know,
200% faster and then it turned out
it made them all 20% slower.
But yeah, that's the impact.
That's unconstrained sustainability.
We're building less.
That's degrowth.
Thank you, Elon. Very cool.
Yeah, so it says,
The next chapter in Tesla's story
will help create a world we've only just begun to imagine
and will do so at a scale
that we have yet to see AI generated.
We are building the products and services
that bring AI into the physical world.
We have been working tirelessly for nearly two decades
to create the foundation for this technological renaissance
through the development of electric vehicles, energy products, and humanoid robots.
Now we are combining our manufacturing capabilities with our autonomous prowess
to deliver new products and services that will accelerate global prosperity.
Sorry, the autonomous prowess of full self-driving that's been 18 months away for the last 10 years?
Yes, yes, that's the one.
Yeah, the one that, you know, kills children at crosswalks.
Yeah, or the one where you have a teleoperated, you know, Optimus robot scooping popcorn
at the Tesla diner.
It's playing Sha Na Na from the 80s.
Yeah, they got Bowser from Sha Na Na
controlling an autonomous robot.
Their guiding principles are the following.
The first one, growth is infinite.
Assume perfectly infinite growth.
Oh, God.
Yeah.
I got to say, Nova, you mentioned this earlier,
but I'm reiterating it.
Reading this stuff with an astrophysicist on the call
is very satisfying.
I mean, there's a reason I titled the book
More Everything Forever, right?
Like, it's meant to be sarcastic or ironic.
Your book, The Torment Nexus.
Yeah, exactly.
I mean, the sort of, like, secret title of my book, like, the title of it in my heart is
these fucking people.
And what these fucking people want is more everything forever.
They believe in infinite growth.
And there is no world in which that happens.
There are limited resources.
There are limits to everything.
And the fact that they can't recognize that.
It just, you know, you were talking about how Bezos is childlike. I think that, in a way, you know, maybe Bezos really is more mature than the rest of the billionaires in that Bezos's desires are maybe those of like a seven-year-old or maybe like a 12-year-old.
Whereas I feel like with Musk, you're looking at something like a two-year-old, right?
Someone who wants things but is never satisfied with anything and can't articulate what it is that he actually wants.
Well, it's what he actually, I think a lot of it really is a fear of limits.
A lot of it is a fear of limits to things like growth or money or life.
It's why they're obsessed with brain uploading.
Oh, yeah, yeah, yeah.
They can't possibly consider a world in which they can't have everything that they want.
The idea of a limit means that they may not actually have the kind of control over the world that they
fantasize having because they're afraid, right?
You know, like where does the kind
of limitless need for money, power, and adulation come from?
It has to come from fear for these guys, right?
They're afraid of something.
And I think in most cases, it's just death.
Well, so you're actually right.
The false promise of endless growth as a singular utopia
shines undimmed by considerations of the fact
that if we continue growing our energy use
at the same rate as we have since the Industrial Revolution,
we'll be using all the energy that the Milky Way galaxy generates
in about 3,700 years,
which considering the size of the galaxy,
this is me editorializing,
3,700 years is not that long.
Yeah, well, actually, it's, I think it was more like all of the energy in the observable universe in 3,700 years.
So it's even worse.
And also, that's if you spot them a warp drive, right?
Because you can't actually get to the edge of the Milky Way, much less the edge of the observable universe, in 3,700 years going at the speed of light.
So, you know, even if you let them break one of the most fundamental laws of physics, you still can't have unconstrained growth.
And they don't care.
You know, they think that if you just, you know, wish hard enough, you'll get what you want and find a way around things like, you know, the speed of light limit and the laws of thermodynamics.
And that's not how anything works.
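The extrapolation Adam describes here is simple compound-growth arithmetic, and it's easy to reproduce. A rough sketch, with the caveat that the figures below are assumptions of mine rather than numbers from the episode: roughly 18 TW of present world power use, the ~2.3% annual growth rate Tom Murphy used in his well-known "Galactic-Scale Energy" estimate, and order-of-magnitude luminosities for each milestone.

```python
import math

# Back-of-the-envelope version of the "infinite energy growth" argument.
# All figures are rough assumptions, not quotes from the conversation.
CURRENT_POWER_W = 1.8e13   # ~18 TW, approximate present-day world power use
GROWTH_RATE = 0.023        # ~2.3%/year, roughly 10x per century

# Order-of-magnitude power outputs for a few milestones (watts).
MILESTONES = {
    "total sunlight hitting Earth": 1.7e17,
    "entire output of the Sun":     3.8e26,
    "entire Milky Way":             1.0e37,
    "stars in observable universe": 4.0e48,
}

def years_to_reach(target_w, start_w=CURRENT_POWER_W, rate=GROWTH_RATE):
    """Years of compound growth until start_w catches up to target_w."""
    return math.log(target_w / start_w) / math.log(1.0 + rate)

for label, power_w in MILESTONES.items():
    print(f"{label}: ~{years_to_reach(power_w):,.0f} years")
```

With these assumptions you land in the same ballpark as the numbers quoted above: a few hundred years to exceed all sunlight hitting Earth, and only a few thousand to exceed the stellar output of the entire observable universe, which is the point, since the growth curve outruns any physical resource base almost immediately on cosmic timescales.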
I'm very excited for a version of XAI to be released.
That's a play on the name of Zefram Cochrane that just like blankets all of Memphis, Tennessee in a cloud of toxic smoke.
Fantastic.
That is a deep Star Trek cut, right?
there. But just as with any group that has glimpsed paradise, proponents of this type of future
are primarily concerned with imagined fears that could prevent their implausible visions from coming
to pass. Yeah. And the imagined fear, because it doesn't make any sense to be afraid of
death because it's inevitable. So if you are allowing a fear of death to structure everything
that you, and not immediate death, but like death as a concept, to structure everything you do,
and that is essentially an imaginary fear because it's premised on the idea that you can avoid it
in general. You know what I mean? I would say,
they, it's pretty reasonable to have, you know, some fear of death, right?
The, the place where I sort of, you know, depart the train is when they sort of use fear of death as like an organizing principle in their lives, right?
And of course, like, Buddhist Guy Ritchie would never do this, by the way.
Just imagining like Steven Seagal, like sitting cross-legged, like, levitating.
Anyway, so they are convinced
that if they just want something enough, they can make it happen and that there must be a way around
the inevitability of death. And that's not going to happen. These people are going to die.
God, I hope sooner rather than later. I mean, look, there is something, you know, I don't, what's the
old line? Like, you know, I don't wish death on anybody, but, you know, there have been some obituaries
I've read with great gusto. Um, right? You know, it, it is, there is something
darkly comic about the fact that, like, say, to pick another billionaire we haven't talked about
yet, the only person who isn't sure that Peter Thiel is going to die someday is Peter Thiel,
right? That of all the things that could be surprising to someone, that death would be
surprising, the single question that we already know the answer to for sure about everybody,
that that's the thing that is surprising to these guys, you know, and then we take them seriously
when they talk about anything else. It's really, you know, they're a clown show. The problem is
that our society is structured in such a way that we've given this sociopathic clown show
enormous amounts of power and money and influence. Yeah, so some of the other things that the
plan says, and these are also the axioms, I guess, because it's not a plan, it's just a series of
axioms. Growth in one area does not require a decline in another.
And shortages in resources can be remedied by improved technology and new ideas.
Oh, wait, wait, wait, wait, wait, wait, wait.
Uh-huh, yes?
That last sentence again?
Oh, shortages in resources can be remedied by improved technology, more innovation, and new ideas.
Innovation and new ideas, that's the same thing, class.
Yeah, but also, but also a shortage in a resource can always be remedied by a new idea or a new invention.
Like, God will come up with it.
Yeah, exactly.
No, this is the thing.
They think that the AI is a fucking genie.
They think it's just going to be able to do anything regardless of what, you know, the actual physical limitations on the world are.
And regardless of the fact that nobody knows how to build that kind of AI anyway and that the idea of it is just like fundamentally incoherent and ill-defined.
God, these guys are such fucking Protestants.
I swear to God.
No, I mean, they are, they are, right?
They are just trying to build a God.
But they're trying to build God in the sense of like a claw machine where you just like put in wishes.
They're trying to, because only Protestants think of God as basically a genie.
And this is how they're imagining AI God: as Protestant God.
Well, like, but this is the thing when I say they want to go to space and live forever.
This is an idea that fundamentally goes back to, you know, like a kind of Christian millenarian vision.
of going to heaven and living with God forever who will fulfill your every wish
once the world to come arrives.
Yeah, and the world to come just happens to be covered in data centers.
Yeah, well, but that's how you make the God happen, right?
So he says, for centuries, humanity's primary mode of transportation was the horse.
Then over 50 plus years, cars, the internal combustion engines powered by fossil fuels
became the standard.
The idea that batteries could be produced affordably at a large scale to pivot transportation
away from fossil fuels seemed a fool's errand until Tesla.
led the way forward. Through continued innovation, we have overcome the technological constraints
of battery development and built an industry powered on renewable resources. You know what this reminds
me of? This reminds me of, like, I get a lot of crackpot email, right? Because, you know,
what? You? Yeah, come on. Yeah. So I get a lot of crackpot email and one of the things that
shows up in those crackpot emails and also just in conversation with people, like a fair amount
is like, oh, but you know, how serious is the limit, you know, the speed of light limit anyway, right?
You know, we broke the sound barrier.
Why can't we break the light barrier?
Like, well, nobody ever really believed that the sound barrier was fundamentally impossible
to break.
There was no law of physics saying that you couldn't go faster than the speed of sound
or that nothing artificial could go faster than the speed of sound.
We have had technology as humans that allowed us to propel something faster than the speed
of sound artificially for thousands of years because that's what a whip crack is.
It's the tip of the whip
going a little bit faster than the speed of sound and having like a sonic boom, right?
So this is not something that is, in principle, impossible or ruled out by the laws of physics.
Similarly, the battery technology stuff, nobody said, oh, it's impossible to build that kind of battery
according to the fundamental laws of physics.
And so what they're doing is taking their success in innovation in one area and saying,
because we were able to do this thing,
we're also able to do
this literally impossible, completely
unrelated thing.
Or just because, you know what it is, they're taking the theme,
which is speed. Yes.
And then being like, okay, that's the same. That's the same thing.
Because it's speeds
we experience sensorily.
Yeah. So those have to be conceptually related.
Would hate to get whipped by a whip
whose tip was traveling at the speed of light.
If you did,
you'd be perfectly bisected.
It's me.
No, no, it wouldn't.
You want to know what would happen?
Please.
It would be roughly like a nuclear weapon going off at your location.
Oh, my God.
These kink parties have gotten really out of control.
So the tools we make at Tesla help us build the products that advance human prosperity.
How we develop and use autonomy and the new capabilities it makes available to us should be informed by its ability
to enhance the human condition. Making daily life better and safer for all people through
autonomous technology has always been and continues to be our focus.
Wait, wait, Tesla said that autonomous technology making us safer has always been and
continues to be their focus.
Tesla said this.
Well, of course it sounds unlikely when you say it in a tone of obvious mockery.
I think it's technically true that it's their focus in the same way that like lung cancer
is the focus of the tobacco industry.
Also, there's a quick interlude here, which is like, OpenAI is hiring
aggressively in humanoid robotics as well, as are these people.
And Morgan Stanley, who, again, try to know this stuff, conclude that, again, in order to
acquire any kind of useful autonomy in an unstructured environment, these robots will require
algorithms that go fundamentally beyond a large language model's understanding of the physical world.
Yeah, that's because large language models have no understanding of a physical world.
Well, quite.
Yeah.
Right.
So autonomy, in the way that they're talking about it,
again, does not exist, and if it did exist, it would require such a step change in artificial
intelligence as to basically be that digital wish-granting God that we've already
talked about, which is fundamentally impossible.
Yeah, they look, I think that there's two things sort of like sitting underneath all of this,
right? One of them is just like this, this propensity to like believe that they can do whatever
the fuck they want and that, you know, that they can just like have their wishes granted by this sort
of AI God. But I also think that there's this idea that the future of technology is already
written, and it's just a question of getting there faster or slower, and that, you know,
they feel that they know what the future of AI must be and what the future of space travel
must be and all of these things. And the reason they think they know this is because they
read it in a science fiction novel or a TV show or something. The same stuff that I was raised on
and still consume very happily today.
But the difference is, I know that it's fiction.
It's not a prediction of the future, right?
I mean, this is why we were talking about the torment nexus tweet earlier, right?
Like, they've confused cautionary tales and allegories for our present day with a roadmap for the future.
And that's just not true, right?
Like, there's no reason to think that just because a, you know, an AI god showed up in a science fiction story,
that means you can definitely build one.
That's not how the world works. Yeah, but have you considered that they want it? Yeah, exactly. Yeah,
but they want it. They want it to be that way and they like it. Yeah. Anyway, I actually think
that's as good a place as any to end it for today. But Adam, what a delight it's been to have you on
the show. Thank you so much. Oh, thank you for having me. This has been great. That's been an absolute
pleasure. Where can people find your book, More Everything Forever? Oh, you can find More Everything
Forever wherever fine books are sold. But, you know, indie bookstores or bookshop.org or I suppose if you
must, you can get it from Amazon. But yeah. Support the one fun one. Yeah. No, don't do that. Just,
hey, just kidding. All right. Anyway, thank you very much, Adam. Thank you very much for listening.
Don't forget, there is a Patreon. You can subscribe to it. It's $5 a month. And also, you can't get mad at us
if like something big happened
that you expected us to talk about
in the last couple of days
because we recorded this
towards the end of the previous week.
So no getting mad at us, not allowed.
All right, all right.
Bye, everybody.
Bye.
