Hard Fork - Interesting Times: A Mind-Bending Conversation with Peter Thiel

Episode Date: July 11, 2025

This week, we’re bringing you a recent interview from “Interesting Times with Ross Douthat,” one of The New York Times’s newest podcasts. In this episode, Ross sits down with Peter Thiel, the co-founder of PayPal and Palantir and one of the most contrarian thinkers in tech. Together, they unpack Thiel’s theory that we’re living through an era of technological stagnation, and debate whether President Trump’s populism and the development of artificial intelligence will help us unlock new progress.

Guest: Peter Thiel, co-founder of PayPal and Palantir.

Additional Reading: “Peter Thiel and the Antichrist”

We want to hear from you. Email us at hardfork@nytimes.com. Find “Hard Fork” on YouTube and TikTok. Unlock full access to New York Times podcasts and explore everything from politics to pop culture. Subscribe today at nytimes.com/podcasts or on Apple Podcasts and Spotify.

Transcript
Starting point is 00:00:00 Well, Kevin, it's July 11th, and what are you doing right now? Well, I'm time traveling because we are recording this in June. But on July 11th, if all goes according to plan, I will be on a beach chair, Modelo in hand, sunblock applied liberally all over my body, watching my child eat sand and enjoying our one and only summer vacation. Casey, what are you doing right now? Well, if all goes according to plan for me, I will be in Japan for the first time. And so if you're in Japan and you see somebody who looks like me.
Starting point is 00:00:36 It might actually be me. Yes. And let's just say, you're going to stick out in Japan. You're very tall. I'm looking forward to it. So we are away this week, but in our place, we are bringing you something really interesting, specifically an episode of Interesting Times, which is a
Starting point is 00:00:52 New Times opinion podcast, where columnist Ross Douthat explores the ideas of today's most influential leaders and thinkers of the modern American right. And a few weeks ago, he spoke with Peter Thiel, and we thought, that's a conversation our listeners should hear. Yeah, Peter Thiel is, of course, the well-known venture capitalist. He was one of the co-founders of PayPal.
Starting point is 00:01:13 He now runs Founders Fund and has a bunch of other important positions within tech. He was on the board of Metta, and he is also one of the most prominent conservatives in Silicon Valley. He was on the Trump train before it became trendy for tech executives, and he's just a guy who's been thinking about tech and the future
Starting point is 00:01:33 for a long time. So here's Ross's interview with Peter Thiel. From New York Times opinion, I'm Ross Douthat, and this is Interesting Times. Is Silicon Valley recklessly ambitious, or is it not ambitious enough? What should we fear more, Armageddon or stagnation? Why is one of the world's most successful investors worrying about the Antichrist? My guest today is a co-founder of PayPal and Palantir, and an early investor in the political careers of Donald Trump and J.D. Vance. Peter Thiel is the original tech-right power player, well known for funding a range of conservative and simply contrarian ideas.
Starting point is 00:02:58 But we're going to talk about his own ideas, because despite the slight handicap of being a billionaire, there's a good case that he's the most influential right-wing intellectual of the last 20 years. Peter Thiel, welcome to Interesting Times. Thanks for having me. So I want to start by taking you back in time about 13 or 14 years. You wrote an essay for National Review, the conservative magazine, called The End of the Future. And basically the argument in that essay was that the dynamic, fast-paced, ever-changing modern world was just not nearly as dynamic as people thought. And that actually we entered a period of technological stagnation, that sort of digital life was
Starting point is 00:03:50 a breakthrough but not as big a breakthrough as people had hoped. And that sort of the world was kind of stuck basically. And you weren't the only person to make arguments like this, but it had a special potency coming from you because you were a Silicon Valley insider who had gotten rich in the digital revolution. So I'm curious, in 2025, do you think that diagnosis still holds? Yes, I still broadly believe in the stagnation thesis. It was never an absolute thesis. So the claim was not that we were absolutely completely stuck. It was in some ways a claim about the velocity had slowed.
Starting point is 00:04:32 It wasn't zero, but from 1750 to 1970, 200 plus years were periods of accelerating change where we're, you know, relentlessly we're moving faster. The ships were faster. The railroads were faster, the cars were faster, the planes were faster, it culminates in the Concorde and the Apollo missions. And then that in all sorts of dimensions, things had slowed. I always made an exception for the world of bits. So we had computers and software and internet and mobile internet, and then the last 10-15 years you had crypto and the AI revolution, which I think is in some sense pretty big,
Starting point is 00:05:12 but the question is, is it enough to really get out of this generalized sense of stagnation? There's an epistemological question you can start with on the back to the future essays. How do we even know whether we're in stagnation or acceleration? Because one of the features of late modernity is that people are hyper specialized. And so, can you say that we're not making progress in physics unless you've devoted half your life to studying string theory? Or what about quantum computers? Or what about cancer research and biotech and the sort of all these, these verticals.
Starting point is 00:05:50 And then how much does progress in cancer count versus string theory? And so you have to give weightings to all these things. So it's in theory, it's an extremely difficult question to get a handle of. Um, and the fact that it's so hard to answer that we have ever narrower groups of guardians guarding themselves is a self-cause for skepticism. And so yes, I think broadly we're in this world that's still pretty stuck.
Starting point is 00:06:15 It's not absolutely stuck. You mentioned Back to the Future and we just showed our kids the original Back to the Future, the first one with Michael J. Fox. And of course- It was like 1955 to 1985, 30 years back. And then the Back to Future 2 was, I think, 1985 to 2015, which is now a decade in the past. And that's where you had flying cars.
Starting point is 00:06:36 And the 2015 future is wildly divergent. The 2015 future did have Biff Tannen as a Donald Trump-like figure in some kind of power. So it had some kind of prescience. But yeah, the big noticeable thing is just how different the built environment looks. And so one of the strongest cases for stagnation that I've heard is that, yeah, if you put someone in a time machine from various points, they would recognize themselves to be in a completely different world if they left 1860 and landed. Or 1890 to 1970, those are the 80 years of your lifetime, something like that.
Starting point is 00:07:14 But the world, just to my kids, even as children of 2025, looking at 1985, it's like the cars are a little different and no one has phones, but the world seems fairly similar. So that's a kind of non-statistical, but I think- That's a common sense. That's a common sense understanding. But are there, like, what would convince you that we were living through a period of takeoff? Is it just economic growth? Is it productivity growth?
Starting point is 00:07:39 Are there numbers for stagnation versus dynamism that you look at? Sure. It would be, well, the economic number would just be, what are your living standards compared to your parents? You know, if you're a 30-year-old millennial or, you know, how are you doing versus when your boomer parents were 30 years old, how did they do at the time? There are intellectual questions, how many breakthroughs are we having, how do we quantify these things, like what are the returns of going into research. There certainly are diminishing returns to going into science or going into academia generally. And then maybe this is why so much of it feels like this sociopathic Malthusian kind of an institution because you have to throw more
Starting point is 00:08:22 and more and more at something to get the same returns, and at some point, people give up and the thing collapses. So let's pick up on that. Why should we want growth and dynamism? Because as you've pointed out in some of your arguments on the subject, right, there is a kind of cultural change that happens in the Western world in the 1970s, around the time you think things slow down, start to stagnate, where people become very anxious about the costs of growth, the environmental costs above all. And the idea being, you end up with a widely shared perspective that we're sort of rich enough,
Starting point is 00:09:00 and if we try too hard to get that much richer, the planet won't be able to support us. We'll have degradation of various kinds, and we should be content with where we are. So what's wrong with that argument? Well, I think there are deep reasons the stagnation happened. So there are always three questions
Starting point is 00:09:20 you can ask about history. What actually happened? And then you have a question, what should be done about it? But there's also this intermediate question, why did it happen? People ran out of ideas. I think to some extent, the institutions degraded and became risk averse and sort of these cultural transformations we can describe.
Starting point is 00:09:39 But then I think to some extent also people had some very legitimate worries about the future where if we continue to have accelerating progress, were you accelerating towards environmental apocalypse or nuclear apocalypse or things like that. But I think if we don't find a way back to the future, I do think the society, I don't know, it unravels, it doesn't work. The middle class, I would define the middle class as the people who expect their kids to do better than themselves.
Starting point is 00:10:10 And when that expectation collapses, we no longer have a middle class society. And maybe there's some way you can have a feudal society in which things are always static and stuck, or maybe there's some way you can shift some radically different society, but it's not the way the Western world, it's not the way the United States has functioned for the first 200 years of its existence. So you think that ordinary people won't accept stagnation in the end? It's... That they will rebel and sort of pull things down around them in the course of that rebellion? They may rebel or our institutions don't work. You know, all of our institutions are predicated on growth.
Starting point is 00:10:50 Our budgets are certainly predicated on growth. Yeah, if you say, I don't know, Reagan and Obama. You know, Reagan was sort of consumer capitalism, which is oxymoronic. You don't save money as a capitalist, you borrow money. And Obama was low tax socialism, just as oxymoronic as the consumerist capitalism of Reagan. And I like low tax socialism way better than high tax socialism. But I worry that it's not sustainable. At some point, the taxes go up or the socialism ends. So it's deeply, deeply unstable. And that's why people
Starting point is 00:11:28 are, they're not optimistic. They don't think we've hit some stable. The Greta future, maybe it can work, but we're not yet there. This is the Greta Thunberg, since her name will probably come up again in this conversation. That's a reference to Greta Thunberg, the activist best known for That's a reference to Greta Thunberg, the activist best known for anti-climate change protests who, to you, I would say, represents a kind of symbol of a anti-growth, effectively authoritarian, environmentalist dominated future. Sure, but we're not there yet. We're not there yet. We like a very, very different society.
Starting point is 00:12:03 If you actually lived in a kind of degrowth, you know, small Scandinavian villages. I'm not sure it would be North Korea, but it would be super oppressive. One thing that's always struck me is that when you have this sense of stagnation, a sense of decadence, to use a word that I like to use for it, in a society. You then also have people who end up being kind of eager for a crisis, right? Eager for a moment to come along where, you know, they can radically redirect society from the path it's on.
Starting point is 00:12:40 Because I tend to think that in rich societies, you hit a certain level of wealth, people become very comfortable, they become risk-averse, and it's hard to get out of decadence into something new without a crisis. So the original example for me was after September 11th. There was this whole mentality among foreign policy conservatives that we had been decadent and stagnant and now was our time to, you know, wake up and launch a new crusade and remake the world. And obviously that ended very badly, but something similar. It was, it was Bush 43 just told people to go shopping right away.
Starting point is 00:13:13 So it wasn't anti-decon and enough for the most part. So there was, there was some, um, neocon foreign policy enclave in which people were larping as a way to get out of decadence. But the dominant thing was Bush 43 people telling people just to go shopping. So what risks should you be willing to take to escape decadence? It does seem like there's a danger here where the people who want to be anti-decadent have to take on a lot of risk. They have to say, look, you've got this nice, stable, comfortable society, but guess what?
Starting point is 00:13:51 We'd like to have a war or a crisis or a total reorganization of government and so on. They have to lean into danger. Right? I don't know if I'd give you a precise precise answer, but my directional answers a lot more. Yeah. We should take a lot more risk. We should be doing a lot more. And I don't know,
Starting point is 00:14:11 I can go through all these different verticals. You know, if you look at biotech, something like dementia, Alzheimer's, we've made zero progress in 40 to 50 years. People are completely stuck on beta amyloids. It's obviously not working. It's just some kind of a stupid racket where the people are just reinforcing themselves. So yes, we need to take way more risk in that department.
Starting point is 00:14:37 I want to ask to keep us in the concrete. I want to stay with that example for a minute and ask, okay, what does that mean, saying we need to take more risks in anti-aging research? Does it mean that the FDA has to step back and say, anyone who has a new treatment for Alzheimer's can go ahead and sell it on the open market? What is risk in the medical space look like? Yeah, you would take a lot more risk. You know, if you have some fatal disease, there probably are a lot more risks you can take. There are a lot more risks the researchers can take. Culturally, what I imagine it looks like is early modernity
Starting point is 00:15:17 where people, yeah, they had, they thought we would cure diseases, they thought we would have radical life extension. Immortality was that was part of the project of early modernity, it was Francis Bacon, Condorcet, you know, and maybe it was anti-Christian, maybe it was downstream of Christianity, it was competitive. If Christianity promised you a physical resurrection, you know, science was not going to succeed unless it promised you the exact same thing. But I remember 1999, 2000 when we were running PayPal,
Starting point is 00:15:50 one of my co-founders, Luke, noticed he was into Alcor and cryonics and people should freeze themselves. And we had one day where we took the whole company to a freezing party, Tupperware party, people sell Tupperware policies at a freezing party. They sell, was it just your heads? What was going to be frozen? You could get a full body or just a head. The just a head was cheaper.
Starting point is 00:16:11 It was disturbing with the dot matrix printer didn't quite work. And so the freezing policies couldn't be printed out. But in retrospect, this was still. Technological stagnation once again, right? But it was, but it's also a symptom of the decline because in 1999, this was not a mainstream view but there was still a fringe boomer view where they still believed they could live forever.
Starting point is 00:16:35 And that was the last generation. And so I'm always anti-boomer, but maybe there's something we've lost even in this fringe boomer narcissism where there were at least a few boomers who still Believe science would cure all their diseases. No one who's a millennial believes that anymore I think there are some people who believe in a different kind of immortality though right now I
Starting point is 00:17:00 Think part of the fascination with AI is Connected to a specific vision of transcending limits. And I'm going to ask you about that after I ask you about politics. Because one of the striking things I thought about your original argument on stagnation, which was mostly about technology and the economy, was that it could be applied to a pretty wide range of things. And at the time you were writing that essay, you were interested in seasteading this idea of, ideas of essentially building new polities independent of the sclerotic Western world. But then you made a pivot in the 2010s. So you were one of the few prominent, maybe the only prominent Silicon Valley supporter
Starting point is 00:17:44 of Donald Trump in 2016. You supported a few sort of carefully selected Republican Senate candidates. One of them is now the vice president of the United States. And my view as an observer of what you were doing, having read your arguments about decadence, was that you were basically being a kind of venture capitalist for politics, right? You were saying, here are some disruptive agents who might change the political status quo and it's worth a certain kind of risk here.
Starting point is 00:18:18 Is that how you thought about that? Sure. There were all sorts of levels. I mean, one level was these hopes that we could redirect the Titanic from the iceberg was heading to, or whatever the metaphor is, you know, really change courses in society. Through political change. Maybe a narrower, a much narrower aspiration was that we could maybe at least have a conversation about this.
Starting point is 00:18:39 And so when someone like Trump said, make America great again, okay, is that a positive, optimistic, ambitious agenda or is it merely a very pessimistic assessment of where we are, that we are no longer a great country? And I didn't have great expectations about what Trump would do in a positive way, but I thought at least for the first time in a hundred years, we had a Republican who was not giving us this syrupy Bush nonsense. And that was, it was not the same as progress, but we could at least have a conversation. In retrospect, this was a preposterous fantasy.
Starting point is 00:19:21 You know, I had these two thoughts in 2016 and you often have these ideas that are just below the level of your sort of consciousness, but the two thoughts I had that I wasn't able to combine was, number one, nobody would be mad at me for supporting Trump if he lost. And number two, I thought he had a 50-50 chance of winning. And then I had this implicit- Why would nobody be mad at you if he lost? It would just be such a weird thing and it wouldn't really matter. But then I had this implicit- Why would nobody be mad at you if he lost? It would just be such a weird thing and it wouldn't really matter. And then, but then I thought he had a 50-50 chance
Starting point is 00:19:50 because the problems were deep and the stagnation was frustrating. And the reality was people weren't ready for it. And then maybe we've progressed to the point where we can have this conversation in 2025, a decade after Trump. And of course, you're not sort of a zombie left-wing person, Ross, but it's, this is-
Starting point is 00:20:11 I've been called many things, many things, Peter. I'll take whatever progress I can get. Well, so from your perspective of, so let's say there's two layers, right? There's sort of a basic sense of, you know, this society needs disruption, it needs risk. Trump is disruption, Trump is risk. And the second level is Trump is actually willing to say things that are true about
Starting point is 00:20:32 American decline. So do you feel like you as an investor, as a venture capitalist, got anything out of the first Trump term? Like what did Trump do in his first term that you felt was anti-decadent or anti-stagnation? If anything, maybe the answer is nothing. I think it took longer and it was slower than I would have liked, but we have gotten to the place where a lot of people think something's gone wrong. And that was not the conversation I was having
Starting point is 00:21:06 in 2012, 2013, 2014. I had, I don't know, I had a debate with Eric Schmidt in 2012 and Mark Andresen in 2013 and Bezos in 2014. I was on the, there's a stagnation problem and all three of them were versions of, you know, everything's going great. And I think at least those three people have, to varying degrees, updated and adjusted. Silicon Valley's adjusted.
Starting point is 00:21:31 And Silicon Valley, though, has more than adjusted. A big part of Silicon Valley... Well, on the stagnation question. Right, on the stagnation question. But then a big part of Silicon Valley ended up going in for Trump in 2024, including, obviously, most famously, Elon Musk. Yeah, this is deeply linked to the stagnation issue, in my telling. I mean, these things are always super complicated, but my telling is, you know, I don't, and again, I'm so hesitant to speak for all these people, but someone like Mark Zuckerberg or
Starting point is 00:22:01 Facebook meta, and in some ways, I don't think he was very ideological. He didn't think this stuff. So I threw that much. It was the default was to be liberal and it was always, you know, if the liberalism isn't working, what do you do? And for year after year after year, it was, you do more, you know, if something doesn't work, you just need to do more of it and you up the dose and you up the dose and you spend hundreds of millions of dollars and you go
Starting point is 00:22:24 completely woke and everybody hates you and at some point it's like okay maybe this isn't working. So they they pivot. Yeah it's not a pro-Trump thing. It's not a pro-Trump thing but it is just both in public and private conversations it is a kind of sense that Trumpism and populism in 2024, maybe not in 2016 when Peter was out there as the, you know, the lone supporter, but now in 2024, they can be a vehicle for technological innovation, economic dynamism. Yeah, you're framing it really, really optimistically here. So I think- Well, the people, but I think,
Starting point is 00:23:06 I know you're pessimistic, but people- When you frame it this optimistically, you're just saying these people are gonna be disappointed and they're just set up for failure and things like this. I mean, people expressed a lot of optimism. That's all I'm saying. Elon Musk expressed a lot of, I mean, he expressed some apocalyptic anxieties
Starting point is 00:23:23 about how budget deficits were going to kill us all. But he came into government and people around him came into government basically saying, we have a partnership with the Trump administration and we're pursuing technological greatness. I think they were optimistic. And so you're coming from a place of greater pessimism or realism. So I'm just, what I'm asking for is your assessment of where we are. Not their assessment, but like, does populism in Trump 2.0 look like a vehicle for technological dynamism to you?
Starting point is 00:23:58 It's still by far the best option we have. I don't think, I don't know, is Harvard going to cure dementia by just puttering along doing the same thing that hasn't worked for 50 years? So I... That's just a case for, it can't get worse, let's do disruption, right? But the critique of populism right now would be Silicon Valley made an alliance with the populists, but in the end the populists, but in the end, the populists don't care about science.
Starting point is 00:24:27 They don't want to spend money on science. They want to kill funding to Harvard just because they don't like Harvard. And in the end, you're not going to get the kind of investments in the future that Silicon Valley wanted. Is that wrong? Yeah, but it's, we have to go back to this question of how well is the science working in the background? This is where the new dealers, whatever was wrong with them, they pushed science hard
Starting point is 00:24:54 and you funded it, you gave money to people and you scaled it. Whereas today, if there was an equivalent of Einstein and he wrote a letter to the White House, it would get lost in the mail room and the Manhattan Project is unthinkable. If we call something a moonshot, this is the way Biden talked about, let's say cancer research. A moonshot in the 60s still meant that you went to the moon. A moonshot now means something completely fictional that's never going to happen. Oh, you need a moonshot for that.
Starting point is 00:25:23 It's not like we need an Apollo program. It means it's never ever going to happen. And so that's- But it seems like then you're still in the mode of, for you, as opposed to maybe for some other people in Silicon Valley, the value of populism is in tearing away the veils and illusions. And we're not necessarily in the stage where you're looking
Starting point is 00:25:44 to the Trump administration to do the Manhattan Project, to do the moonshot. It's more like populism helps us see that it was all fake. You need to try to do both. And they're very entangled with each other. And I don't know, there's a deregulation of nuclear power and at some point we'll get back to building new nuclear power plants or better designed ones or maybe even fusion reactors. And so yes, there's a deregulatory deconstructive part and then at some point you actually get to construction and it's all things like that. So yeah, in some ways you're clearing the field and then but you've maybe but you've personally
Starting point is 00:26:27 stopped funding politicians I'm schizophrenic on this stuff. I think it is it's incredibly important and it's incredibly toxic and so I I go back and forth on incredibly toxic for you personally? For everybody, everybody who gets involved. It's zero sum, it's crazy, and then in some ways. Because everyone hates you and associates you with Trump? Like, how is it toxic for you personally? It's toxic because it's in a zero sum world.
Starting point is 00:27:00 The stakes in it feel really, really high. And you end up having enemies you didn't have before? Yeah. It's toxic for all the people who get involved in different ways. This is, you know, there is a political dimension of getting back to the future. You can't, I don't know, this is a conversation
Starting point is 00:27:17 I had with Elon back in 2024. And we had all these, you know, conversation. I had the C-steading version with Elon where I said, you know, um, Trump doesn't win, I want to just leave the country. And then Elon said, there's nowhere to go. There's nowhere to go. This is the only place. And then, you know, you always think of the right arguments to make later.
Starting point is 00:27:37 And it was about two hours after we had dinner and I was home that I thought of, wow, Elon, you don't believe in going to Mars anymore. 2024, 2024 is the year where Elon stopped believing in Mars, not as a silly science tech project, but as a political project. Mars was supposed to be a political project, was building an alternative. And in 2024, Elon came to believe that if you went to Mars, you know, the socialist US government, the woke AI, it would follow you to Mars. It was, you know, it was the Demis meeting with Elon that we sort of brokered.
Starting point is 00:28:13 He was- This is an AI company. Yeah, this is the rough conversation was, you know, Demis tells Elon, I'm working on the most important project in the world. I'm building a superhuman AI. And Elon responds to Demis, well, I'm working on the most important project in the world, I'm building a superhuman AI. And Elon responds to Demis, well, I'm working on the most important project
Starting point is 00:28:26 in the world. I am turning this into interplanetary species. And then Demis said, well, you know, my AI will be able to follow you to Mars. And then Elon sort of went quiet. But in my telling of the history, it took years for that to really hit Elon. It took him till 2024 to process it.
Starting point is 00:28:45 That doesn't mean he doesn't believe in Mars. It just means that he decided he had to win some kind of battle over budget deficits or wokeness to get to Mars. What does Mars mean? And again, it's- What does Mars mean? It's, is it just a scientific project
Starting point is 00:29:02 or is it, I don't know, is it like a, I don't know, Heinlein. A vision of a new society. Yeah, Heinlein, you know. Populated by many, many people. The moon as a libertarian paradise or something like this. Descended from Elon Musk. Well, I don't know if it was concretized, that specifically, but if you concretize things,
Starting point is 00:29:19 then maybe you realize that Mars is supposed to be more than a science project, It's supposed to be a political project. And then when you concretize it, you have to start thinking through, well, the AI, the woke AI will follow you, the socialist government will follow you. And then maybe you have to do something other than just going to Mars. Okay. Okay, so the woke AI. Artificial intelligence seems like one, if we're still stagnant, it's the biggest exception to stagnation. It's the place where there's been remarkable progress, surprising to many people progress.
Starting point is 00:30:25 It's also the place we were just talking about politics. It's the place where the Trump administration is, I think to a large degree, giving AI investors a lot of what they wanted in terms of both stepping back and doing public-private partnerships. So it's a zone of progress and governmental engagement. And you are an investor in AI. What do you think you're investing in? Well, I don't know. There's sort of a lot of layers to this.
Starting point is 00:31:04 So I do think,'t know, there's sort of a lot of layers to this. So I do think, you know, one question we can frame is just how big a thing do I think AI is? And it's, I don't know, my stupid answer is it's somewhere, it's more than a nothing burger and it's less than the total transformation of our society. So my placeholder is that it's roughly on the scale of the internet in the late 90s, which is, you know, I'm not sure it's enough to really end the stagnation. It might be enough to create some great companies and the internet, you know, it added maybe a few points, percentage points to the GDP, maybe 1% to GDP growth every year for 10, 15 years.
Starting point is 00:31:47 It adds some to productivity. And so that's sort of roughly my placeholder for AI. It's the only thing we have. It's a little bit unhealthy that it's so unbalanced. This is the only thing we have. I'd like to have more multidimensional progress. I'd like us to be going to Mars. I'd like us to be having cures for dementia.
Starting point is 00:32:03 If all we have is AI, I will take it. There are risks with it. There are obviously there are dangers with this technology. But they're also- So, but then you are a skeptic of the, what you might call the sort of super intelligence cascade theory, which basically says that if AI succeeds, it gets so smart that it gives us the progress in the world of atoms. That it's like, all right, we can't cure dementia.
Starting point is 00:32:34 We can't figure out how to build the perfect factory that builds the rockets that go to Mars, but the AI can. And at a certain point, it just, you pass a certain threshold, and it gives us not just more digital progress, but 64 other forms of progress. It sounds like you don't believe that, or you think that's less likely. Yeah, I somehow don't know if that's been really
Starting point is 00:33:03 the gating factor. What does that mean, the gating factor? It's probably a Silicon Valley ideology, and maybe in a weird way it's more liberal than a conservative thing, but people are really fixated on IQ in Silicon Valley and that it's all about smart people. And if you have more smart people, they will do great things. And then the economics anti-IQ argument is that people actually do worse. The smarter they are, the worse they do. You know, it's just they don't know how to apply it or our society doesn't know what
Starting point is 00:33:35 to do with them and they don't fit in. And so that suggests that the gating factor isn't IQ, but something, you know, that's deeply wrong with our society. So I don't think- Is that a limit on intelligence or a problem of the sort of personality types that human super intelligence creates? I mean, I'm very sympathetic to the idea, and I made this case when I did an episode of this podcast with a sort of AI accelerationist, that certain problems can just be solved if you ramp up intelligence.
Starting point is 00:34:05 It's like we ramp up intelligence and boom, Alzheimer's is solved. We ramp up intelligence and the AI can figure out the automation process that builds you a billion robots overnight. I'm an intelligent skeptic in the sense I don't think, I think you probably have limits. It's hard to prove one way. It's always hard to prove these things, but I- Until we have the super intelligence. I share your intuition because I think we've had a lot
Starting point is 00:34:31 of smart people and things have been stuck for other reasons. And so maybe the problems are unsolvable, which is the pessimistic view. Maybe there is no cure for dementia at all, and it's a deeply unsolvable problem. There's no cure for mortality. Maybe's a deeply unsolvable problem, there's no cure for mortality. Maybe it's an unsolvable problem. Or maybe it's these cultural things.
Starting point is 00:34:51 So it's not the individually smart person, but it's how this fits into our society. Do we tolerate heterodox smart people? Maybe you need heterodox smart people to do crazy experiments. And if the AI is just conventionally smart, if we define wokeness again, wokeness is too ideological, but if you just define it as conformist, maybe that's not the kind of smartness that's going to make a difference. So do you fear then a plausible future where AI in a way becomes itself stagnationist? That it's like highly intelligent, creative in a conformist way.
Starting point is 00:35:34 It's like the Netflix algorithm. It makes infinite okay movies that people watch. It generates infinite okay ideas. It puts a bunch of people out of work and makes them obsolete, but it doesn't... It like deepens stagnation in some way. Is that a fear? It... It's quite possible.
Starting point is 00:35:56 That's certainly a risk, but I guess where I end up is, I still think we should be trying AI and that the alternative is just total stagnation. So yeah, there's sort of all sorts of interesting things can happen with you can maybe drones and military contexts are combined with AI and okay, this is kind of scary or dangerous or dystopian or it's going to change things, but if you don't have AI, wow, there's just nothing going on. And I don't know, this is, this, there's like a version of this discussion on
Starting point is 00:36:31 the internet where, um, you know, did the internet lead to more conformity and, and more wokeness and yeah, there are all sorts of ways where it didn't lead to quite the cornucopian, you know, diverse explosion of ideas that libertarians fantasized about in 1999. But counterfactually, I would argue that it was still better than the alternative, that if we hadn't had the internet, maybe it would have been worse. You know, AI is better, it's better than the alternative and the alternative is nothing at all. Because here's one place where the stagnationist arguments are still reinforced.
Starting point is 00:37:10 The fact that we're only talking about AI. I feel it is always an implicit acknowledgement that but for AI, we are like in almost total stagnation. But the world of AI is clearly filled with people who, at the very least, seem to have a more utopian, transformative, whatever word you want to call it, view of the technology than you're expressing here, right? And you mentioned earlier the idea that the modern world used to promise radical life extension and doesn't anymore.
Starting point is 00:37:46 It seems very clear to me that a number of people deeply involved in artificial intelligence see it as a kind of mechanism for transhumanism, for transcendence of our mortal flesh and either some kind of creation of a successor species or some kind of merger of mind and machine. And do you think that's just all kind of irrelevant fantasy? Or do you think it's just hype? Do you think people are trying to raise money by pretending that we're going to build a machine god, right? Is it hype? Is it delusion? Is it something you worry about? I think you would prefer the human race to endure, right? Is it hype? Is it delusion? Is it something you worry about? I think you would prefer the human race to endure, right?
Starting point is 00:38:31 You're hesitating. Yes? I don't know. I would... This is a long hesitation. This is a long hesitation. There's so many questions implicated in this. Should the human race survive?
Starting point is 00:38:47 Yes. Okay. But I also would like us to radically solve these problems. Right. And so, you know, it's always, I don't know, yeah, transhumanism is this, you know, the ideal was this radical transformation where your human natural body gets transformed into an immortal body. And, um, there's a critique of, let's say, um, the trans people in a sexual
Starting point is 00:39:21 context, or I don't know, transvestite is someone who changes their clothes and cross dresses and a transsexual is someone where you change your penis into a vagina and we can then debate how well those surgeries work. But we want more transformation than that. The critique is not that it's weird and unnatural. It's man, it's so pathetically little. We want more than cross dressing or changing your sex organs. We want you to be able to change your heart and change your mind and change your whole body. And then Orthodox Christianity, by the way, the critique Orthodox
Starting point is 00:39:58 Christianity has of this is these things don't go far enough. Like the transhumanism is just changing your body, but you also need to transform your soul, and you need to transform your whole self. And so, this is where- Right, but the other, wait, wait, wait. I generally agree with what I think is your belief that religion should be a friend to science and ideas of scientific progress.
Starting point is 00:40:24 I think any idea of divine providence has to encompass the fact that we have progressed and achieved and done things that would have been unimaginable to our ancestors. But it still also seems like the promise of Christianity in the end is you get the perfected body and the perfected soul through God's grace and the person who tries to do it on their own with a bunch of machines is likely to end up as a dystopian character. Well, it's, let's articulate this. And you can have a heretical form of Christianity that says something else. I don't know. I think the word nature does not occur once in the Old Testament. And so, you know, there is a word in which, a sense in which, the way I understand Judeo-Christian inspiration is,
Starting point is 00:41:19 it is about transcending nature, it is about overcoming things. And the closest thing you can say to nature is that people are fallen and that that's the natural thing in a Christian sense is that you're messed up. And that's true, but there's some ways that, you know, with God's help, you are supposed to transcend that and overcome that. Right. But the people, present company accepted, most of the people working to build the hypothetical machine god don't think that they're cooperating with Yahweh, Jehovah, the Lord of hosts.
Starting point is 00:41:59 They think that they're building immortality on their own, right? We're jumping around a lot of things. So again, the critique I was saying is they're not ambitious enough. From a Christian point of view, these people are not ambitious enough. Now, then we get into this question, well, are they still-
Starting point is 00:42:14 But they're not morally and spiritually ambitious enough. And then are they still physically ambitious enough? And are they even still really transhumanists? And this is where, okay, you know, man, the cryonics thing, that seems like a retro thing from 1999. There isn't that much of that going on. So they're not transhumanists on a physical body. And then, okay, well, maybe it's not about cryonics, maybe it's about uploading, which
Starting point is 00:42:38 okay, well, that's not quite, I'd rather have my body. I don't want just a computer program that simulates me. So that uploading seemed like a step down from cryonics. But then even that's, you know, it's part of the conversation and this is where it gets very hard to score. And I don't want to say they're all making it up and it's all fake, but I don't think-
Starting point is 00:42:59 You think some of it's fake. I don't think it's, fake implies people are lying, but I wanna say it's not the center of gravity. And so there is, yeah, there is a cornucopian language, there's an optimistic language. A conversation I had with Elon a few weeks ago about this was he said, we're gonna have a billion humanoid robots in the US in 10 years.
Starting point is 00:43:24 And I said, well, if that's true, you don't need to worry about the budget deficits because we're going to have so much growth. The growth will take care of this. And then, well, he's still worried about the budget deficits. And then this doesn't prove that he doesn't believe in the billion robots, but it suggests that maybe he hasn't
Starting point is 00:43:41 thought it through or that he doesn't think it's going to be as transformative economically or that there are big error bars around it. But yeah, there's some way in which these things are not quite thought through. If I had to give a critique of Silicon Valley, it's always bad at what the meaning of tech is. And the conversations, they tend to go into this microscopic thing where it's like, what are the IQ ELO scores of the AI and exactly how do you define AGI? We get into all these endless technical debates and there are a lot of questions that are at an intermediate level of meaning that seem to me to be very important, which is like, what does it mean for the budget deficit?
Starting point is 00:44:24 What does it mean for the economy? What does it mean for the economy? What does it mean for geopolitics? One of the conversations recently was, you know, I had was, does it change the calculus for China invading Taiwan, where we have an accelerating AI revolution, the military, is China falling behind? And will this, and maybe on the optimistic side, it deters China because they've effectively lost. And on the pessimistic side, it accelerates them because they know it's now or never. If they don't grab Taiwan now, they will fall behind. And
Starting point is 00:44:58 either way, this is a pretty important thing. It's not thought through. We don't think about what AI means for geopolitics. We don't think about what AI means for geopolitics. We don't think about what it means for the macroeconomy. And those are the kinds of questions I'd want us to push more. There's also a very macroscopic question that you're interested in that, you know, will pull on the religion thread a little bit here. You have been giving talks recently about the concept of the Antichrist, which is a Christian concept, an apocalyptic concept. What does that mean to you?
Starting point is 00:46:03 What is the Antichrist? How much time do we have? We've got as much time as you have to talk about the Antichrist. All right. Well, I have a, I could talk about it for a long time. But I think, I think there's always a question. Um, how do we articulate, you know, some of these existential risks, some of the challenges we have, and they're all framed in this sort of runaway dystopian science text.
Starting point is 00:46:31 There's a risk of nuclear war, there's a risk of environmental disaster, maybe something specific like climate change, although there are lots of other ones we could come up with. There's a risk of bio weapons. You have all the different sci-fi scenarios. Obviously there are certain types of risks with AI. But I always think that if we're going to have this frame of talking about existential risks, perhaps we should also talk about the risk of another type of a bad singularity, which I would describe as the one world totalitarian state. Because I would say the default political solution people have for all these existential
Starting point is 00:47:12 risks is one world governance. What do you do about nuclear weapons? We have a United Nations with real teeth that controls them, and they're controlled by an international political order. And then something like this is also, what do we do about AI and we need global compute governance? We need a one world government to control all the computers, log every single keystroke to make sure people don't program a dangerous AI.
Starting point is 00:47:43 And I've been wondering whether that's sort of, you know, going from the frying pan into the fire. And so the atheist philosophical framing is one world or none. That was a short film that was put out by the Federation of American Scientists in the late forties. It starts with a nuclear bomb blowing up the world. And obviously you need a one world government to stop it, one world or none. The Christian framing, which in some ways is the same question, is Antichrist or Armageddon.
Starting point is 00:48:15 You have the one world state of the Antichrist or we're sleepwalking towards Armageddon. One world or none, Antichrist or Armageddon, on one level are the same question. Now, I have a lot of thoughts on this topic, but one question is, and this was a plot hole in all these Antichrist books, people wrote, how does the Antichrist take over the world? He gives these demonic, hypnotic speeches and people just fall for it. So it's this plot hole. It's this demonium ex machina. Totally. It's implausible. It's a very implausible plot hole. But I think we have an answer to this plot hole. The way the Antichrist would take over the world is you talk about Armageddon nonstop. You talk about existential risk nonstop, and this is what you need to regulate. It's the opposite of the picture of Baconian science from the 17th, 18th century, where
Starting point is 00:49:09 the Antichrist is like some evil tech genius, evil scientist who invents this machine to take over the world. People are way too scared for that. In our world, the thing that has political resonance is the opposite. The thing that has political resonance is we need to stop science, we need to just say stop to this, and this is where, yeah, I don't know, in the 17th century, I can imagine a Dr. Strangelove, Edward Teller type person taking over the world. In our world, it's far more likely to be Greta Thunberg.
Starting point is 00:49:48 Okay, I want to suggest a middle ground between those two options, right? It used to be that the reasonable fear of the Antichrist was a kind of wizard of technology. And now the reasonable fear is someone who promises to control technology, make it safe, and sort of usher in what from your point of view would be a kind of universal stagnation, right? Well, that's more my description of how it would happen. So I think people still have a fear of a 17th century Antichrist. We're still scared of Dr. Strangelove. Right, but you're saying the real Antichrist would play on that fear and say, you must
Starting point is 00:50:22 come with me to avoid Skynet, to avoid the Terminator, to avoid nuclear Armageddon. And I guess my view would be looking at the world right now, that you would need a certain kind of novel technological progress to make that fear concrete, right? So I can buy that the world could turn to someone who promised peace and regulation if the world became convinced that AI was about to destroy everybody. But I think to get to that point, you need one of the accelerationist apocalyptic scenarios to start to play out. To get your peace and safety, Antichrist, you need more technological progress.
Starting point is 00:51:07 One of the key failures of totalitarianism in the 20th century was it had a problem of knowledge. It couldn't know what was going on all over in the world. So you need the AI or whatever else to be capable of helping the peace and safety totalitarian rule. So don't you think you need, essentially you need your worst case scenario to involve some burst of progress that is then tamed and used to impose stagnant totalitarianism. You can't just get there from where we are right now. Well, it can- Greta Thunberg's on a boat in the Mediterranean, like protesting Israel. I just don't see the promise of safety from AI, safety from tech, even safety from climate
Starting point is 00:52:00 change right now as a powerful universal rallying cry, absent accelerating change and real fear of total catastrophe? Man, these things are so hard to score, but I think environmentalism is pretty powerful. I don't know if it's absolutely powerful enough to create a one world totalitarian state, but man, it is. I think it is not. It is. In its current form.
Starting point is 00:52:30 I want to say it's the only thing people still believe in in Europe. Like you know, they believe in the green thing more than Islamic Sharia law or more than in you know, the Chinese communist totalitarian takeover. And the future is an idea of a future that looks different from the present. The only three on offer in Europe are green, Sharia and the totalitarian communist state. And the green one is by far the strongest.
Starting point is 00:52:59 In a declining, decaying Europe that is not a dominant player in the world. It's always in a context. And then I would, I don't know, we had this really complicated history with the way nuclear technology worked and okay, we didn't really get to a totalitarian one world state, but by the 1970s, one account of the stagnation is that the runaway progress of technology had gotten very scary and that Baconian science, it ended at Los Alamos. And then it was, okay, it ended there and we didn't want to have any more.
Starting point is 00:53:38 And you know, when Charles Manson took LSD in the late sixties and started murdering people, what he saw on LSD, what he learned was that you could be like an anti-hero and Dostoevsky and everything was permitted. And of course not everyone became Charles Manson, but in my telling of the history, everyone became as deranged as Charles Manson. But Charles Manson did not become the Antichrist and take over the world, right?
Starting point is 00:54:04 We're ending in the apocalyptic. No, but my telling of the history of the 1970s is the hippies did win. And we landed on the moon in July of 1969, Woodstock started three weeks later, and with the benefit of hindsight, that's when progress stopped and the hippies won. Yeah, it was not literally Charles Manson. Okay, but you're just, I want to stay with the Antichrist just to end, right? And you're retreating, you're saying, okay, environmentalism is already pro-stagnation and so on.
Starting point is 00:54:37 Okay, let's agree with all that, but we're not living under the Antichrist right now. We're just stagn, we're not living under the antichrist right now, we're just stagnant, right? And you're positing that something worse could be on the horizon that would make stagnation permanent, that would be driven by fear. And I'm suggesting that for that to happen, there would have to be some burst of technological progress that was akin to Los Alamos that people are afraid of. And I guess this is my very specific question for you, right? Is that you're an investor in AI,
Starting point is 00:55:12 you're deeply invested in Palantir, in military technology, in technologies of surveillance, in technologies of warfare and so on, right? And it just seems to me that when you tell me a story about the Antichrist coming to power and using the fear of technological change to sort of impose order on the world, I feel like that Antichrist would maybe be using the tools that you were building, right? Like wouldn't the Antichrist be like, great, you know, we're not going to have any more technological progress, but I really like what Palantir has done so far, right?
Starting point is 00:55:50 I mean, isn't that a concern? Wouldn't that be the, you know, the irony of history would be that the man publicly worrying about the Antichrist accidentally hastens his or her arrival? They're all, look, there are all these different scenarios. I obviously don't think that that's what I'm doing. I mean, to be clear, I don't think that's what you're doing either. I'm just interested in how you get to a world willing to submit to permanent authoritarian rule? Well, but, but again, uh, there, there are these different gradations of this we can describe, but is this so preposterous? What I've just told you as a broad account of the stagnation that the entire
Starting point is 00:56:42 world has submitted for 50 years to peace and safetyism. This is a first Thessalonians five three, the slogan of the Antichrist is peace and safety and we've submitted to the FDA regulates not just drugs in the US, but de facto in the whole world because the rest of the world defers the FDA. The Nuclear Regulatory commission effectively regulates nuclear power plants all over the world. You know, you can't design a modular nuclear reactor and just build it in Argentina. They don't, they, they, they, they won't trust the Argentinian regulators.
Starting point is 00:57:17 They're going to defer to the, to the U S. And so it is at least a question about why we've had 50 years of stagnation. And one answer is we ran out of ideas. The other answer is that something happened culturally where it wasn't allowed and then the cultural answer can be sort of a bottom up answer that it was just some transformation of humanity into the sort of more docile kind of a species, or it can be at least partially top down that there is this machinery of government
Starting point is 00:57:54 that got changed into this stagnation thing. And nuclear power was supposed to be the power of the 21st century, and, you know, it somehow has gotten off ramped all over the world on a worldwide basis, you know? So in a sense, we're already living under a moderate rule of the Antichrist in that telling. Do you think God is in control of history? I, uh, man, this is again, like, I think that, I think that there's always room for human freedom and human choice. These things are, you know, they're not absolutely predetermined one way or another.
Starting point is 00:58:43 Right. But God wouldn't leave us forever under the rule of a mild, moderate stagnationist anti-Christ, right? That can't be how the story ends, right? It's attributing too much causation to God is always a problem. There are different Bible verses I can give you, but I'll give you John 15, 25, where Christ says, they hated me without cause. And so it's all these people that are persecuting Christ
Starting point is 00:59:15 have no reason, no cause for why they're persecuting Christ. And if we interpret this as a ultimate causation verse, they want to say, I'm persecuting you because God caused me to do this. God is causing everything. And the Christian view is anti-Calvinist. God is not behind history. God is not causing everything. If you say God's causing everything, then God is, but wait, but God is, you're scapegoating God. But God is behind Jesus Christ entering history because God was not going to leave us in a stagnationist decadent Roman Empire, right? So at some point, God is going to step in.
Starting point is 00:59:55 I am not that Calvinist. That's not Calvinism though, that's just Christianity. God will not leave us eternally staring into screens and being lectured by Greta Thunberg, right? He will not abandon us to that fate. It is. There is a great, I don't know, for better and for worse, I think there's a great deal of scope for human action, for human freedom. If I thought these things were deterministic, you might as well, you know, maybe just accept it.
Starting point is 01:00:32 The lines are coming. You should just have some yoga and prayerful meditation and wait while the lines eat you up. And I don't think that's what you're supposed to do. It's no, I agree with that. And I think on that note, I'm just trying to be hopeful and suggesting that in trying to resist the Antichrist using your human freedom, you should have hope that you'll succeed, right?
Starting point is 01:00:54 We can agree on that. Good. Peter Thiel, thank you for joining me. Thank you. And thanks to all of you for listening. As a reminder, you can watch this as a video podcast on YouTube, and you can find the channel at Interesting Times with Ross Douthit. Interesting Times is produced by Sofia Alvarez-Boyd, Andrea Batanzos, Alisa Gutierrez, and Catherine Sullivan. It's edited by Jordana Hokeman. Our fact check team is Kate Sinclair,
Starting point is 01:01:48 Mary Marge Locker, and Michelle Harris. Original music by Isaac Jones, Sonia Herrero, Amin Sahota, and Pat McCusker. Engineering by Isaac Jones. Mixing by Sophia Landman. Audience strategy by Shannon Busta and Christina Samuelski. And our director of opinion audio is Annie Rose Strasser.
