Ologies with Alie Ward - Eschatology (THE APOCALYPSE) with Phil Torres

Episode Date: November 6, 2018

Doomsday. The apocalypse. The End. Join scholar, author and professional existential risk philosopher Phil Torres for a surprisingly jovial romp through different "Oops we're screwed" scenarios that will lead to the destruction of the planet or extinction of our species. (Not to be confused with lepidopterologist/butterfly man, Phil Torres.) Find out where we're at on Ye Olde Doomsday clock, if any of us should have babies, if AI will destroy us, pop-cultural Antichrists, Black Mirror, simulations, technology as friend or foe, why voting matters (lookin' at you, America) and how to remain chill in the face of doom. Also: the hottest underground bunkers on the market.

Phil Torres is on Twitter @Xriskology and his website is www.risksandreligion.org
Become a patron of Ologies for as little as a buck a month: www.Patreon.com/ologies
OlogiesMerch.com has hats, shirts, pins, totes!
Follow @Ologies on Twitter or Instagram
Follow @AlieWard on Twitter or Instagram
More links at www.alieward.com
Sound editing by Steven Ray Morris
Theme song by Nick Thorburn
Support the show: http://Patreon.com/ologies

Transcript
Starting point is 00:00:00 Oh, hey, it's your old uncle who's pretty dubious about that fat-free salad dressing, Alie Ward. Back with another episode of Ologies. So before you think, wait, did I hear this one already? Is this just a revisiting of Lepidopterology with butterfly expert Phil Torres, who also professionally plays with baby dogs on his new CW television show, Ready, Set, Pet? Nope. It's a different episode.
Starting point is 00:00:26 Different Phil Torres. It's a different Phil Torres. Boy Howdy. You know, in the 60 plus episodes that we have journeyed through together, friends, we have gone face first into death, tumors, and misogyny, pet euthanasia, dabbled in crow funerals. But I got to say this one, wow, this one's going to have you just staring into the mirror at 1 AM asking, what the fuck, Ward, maybe more than any of them. This one is the apocalypse.
Starting point is 00:00:57 Are you ready? No? Okay, I'm going to stall for one second. Before the existential upheaval, just a quick thanks to everyone supporting via patreon.com slash Ologies, anyone who has put merch on your bod from OlogiesMerch.com, and all the folks who for $0 rate, make the commitment to hit subscribe, and leave a review, which I creep with joy, and I highlight one each week. This week.
Starting point is 00:01:25 You know what? One quick shout out to Tara in Maine, who is a self-proclaimed former podcast hater, and now an Ologite, come be one with us. And also thank you for the timely as hell review from SeattleMe227, who says, I especially love the takes on dark topics like death and fear. So refreshing. I'm an instant fan. Well, wow, SeattleMe227, buckle up.
Starting point is 00:01:51 We're talking about doomsday. So eschatology comes from the Greek for last, meaning it's the study of the end, the end of the world. So as this ologist explains, this term is no longer bound to just a religious context, but extends to scientific study as well. So he is a neuroscientist, philosophy scholar, and author of three books, including The End: What Science and Religion Tell Us About the Apocalypse, and his latest release, Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks.
Starting point is 00:02:26 As you'll hear, the apocalypse and this episode were both a long time in the making. And I coerced this ologist to drive a few hours and meet me in an airport hotel in Philly. And he was so generous with his time, he pretty much spent the evening being lobbed questions, just barraged like asteroids, one after the other. And you'll hear about his background that led him to this branch of philosophy and where we're at on the old doomsday clock, whether or not any of us should have babies, if we should bother recycling, why voting matters, looking at you, America, and some pop cultural Antichrists, artificial intelligence, simulations, Black Mirror, technology as friend or foe.
Starting point is 00:03:08 And now he's just a pretty chill guy anyway. So pack up your bug out bag and put a down payment on a bunker while you enjoy the brilliant brain of eschatologist, Phil Torres. I stepped on my glasses, so I'm trying it at night, like it's just, it's blurry. You don't even need an apocalypse. You're your own nightmare. Oh my God. Yeah.
Starting point is 00:03:47 This is yours. Okay. So let's start off number one by saying that I have been Twitter stalking you for a year. No response from you. And I was like, wow, that Phil Torres guy. Pretty busy. Yeah. Pretty important guy.
Starting point is 00:04:00 Yes. Pretty important. So many times I was like, hey, I'd love for you to be on the phone. No. Okay. Yeah. I don't know how to use you. I'm not trying to put you on blast.
Starting point is 00:04:08 I'm just saying that you're, this is a very big get for me. Yeah. It's no fault of your own. I was just very eagerly like, hi, it's me again. Well, thank you. And when I finally saw the tweet that I responded to, I responded to it very eagerly. I was like, these apocalypse people are very aloof. The apocalypse scene is like the most aloof of all of them.
Starting point is 00:04:29 Little did I know you were eager. And also you are the second Phil Torres I've had on the podcast. How often do you, someone who studies theories about the end of the world, get confused with someone who studies butterflies and has a show about puppies? I'm more, I feel kind of bad for the emails that he might receive because he studies butterflies and you know, that's pretty awesome and pretty fun. And there's a lightheartedness to it, whereas there's a darkness and a heaviness to thinking about, say, runaway climate change or value misaligned superintelligence.
Starting point is 00:05:10 Okay, let's dive into what your ology is. Yes. Eschatology? Eschatology. How do we say it? It's a good question. There's actually debate right now, whether the idea of secular apocalyptic scenarios should constitute its own field or rather, whether it should just be a topic that is
Starting point is 00:05:32 discussed by people, experts in their various areas of expertise. So in other words, the term, the semantics of the term, have evolved over time. And at this point, like, yeah, there's a sense in which it's kind of, the topic that I'm interested in is kind of scientific eschatology, yeah, thinking about the end of the world from, once again, like an evidence-based, you know, empirical perspective. As opposed to a poof, you're dead, which would be more like being smote by Zeus or something. Yeah. Okay.
Starting point is 00:06:06 That's right. Yeah. Which we, in this way, we are our own angry God. Yeah. Kind of. Yeah. I mean, there are, you know, myths of, you know, Greek myths of humans who gained too much power and, you know, it didn't turn out so well.
Starting point is 00:06:28 I mean, the central concept is that of existential risk. And that is cheerfully defined as any event that would either cause human extinction or result in irreversible decline of our potential for desirable future development. So in this case, the end of the planet, as we know it, or the end of our species, the Great Brewhy. So this subject matter is so, so dark and surreal that I just could not stop laughing at the absurd awfulness of it. I'm so sorry.
Starting point is 00:07:03 I don't know why this is so far the most hilarious episode. It's just so terrible. It's like kind of funny. I was at a conference a while back with a bunch of people who published on this topic. And we had a really good time. It was a lot of fun. There was a lot of laughter and joyousness. And at some point we sort of conjectured that there must be a self-selection process.
Starting point is 00:07:31 You know, if you were, if you were like lugubrious by disposition, a certain amount. P.S. The word lugubrious means sad and mournful. And I'm not too proud to admit that I just had to look it up. Then you just don't end up, you know, living, you know, and breathing these issues all the time. We need balance. Because how can we appreciate a butterfly if we don't appreciate the fact that the butterfly could maybe just combust spontaneously into fire along with everyone that you love.
Starting point is 00:08:02 In fact, I've sometimes mentioned, you know, a kind of paradox of the field, which is that I think it is among the most important topics that anybody could be talking about or thinking about researching, publishing on. Well, I mean, the whole crux of what you do is how can we appreciate our existence if we don't examine the possibility of it ending, right? That is sort of that is an implication. Yes. So the stuff that I do in particular is mostly trying to understand the nature of the biggest global scale kind of disaster scenarios facing humanity this century for
Starting point is 00:08:42 the express purpose of identifying ways to mitigate those risks. And make sure that they never happen. Right. Let's go back in time to Baby Phil Torres. At what point did your family realize you were more contemplative, perhaps, than other children? Yeah, we're talking about me. Okay, got it. So it's interesting. I mean, my interest now, there's a kind of genealogy I can trace all the way back to
Starting point is 00:09:12 childhood because I grew up in a pretty religious household. And what flavor? Baptists. Okay. To use a more esoteric term, the broader view was dispensationalist. I don't know what that means. So that means that's when you hear about the rapture. Oh, boy.
Starting point is 00:09:28 You can think dispensationalism. Okay. That is the poof I'm gone. Are my clothes and shoes still here? But my body and soul is gone. That's the rapture. Yeah. Okay. If you actually look carefully at the chronology of the narrative,
Starting point is 00:09:42 it's stunning. Like the rapture is supposed to happen and your soul is separated from your body. And then there's a seven-year tribulation where the anti-Christ gains power in the UN or EU or something like that. And then signs a peace treaty with Israel initially, at the beginning of the tribulation. Okay. And then halfway through invades Israel. And then God rains down all of this horrible punishment.
Starting point is 00:10:08 And then there's the second coming. And that's when Armageddon happens. And then there are various other things that continue to happen in the Eschaton. You grew up with these sorts of beliefs. I did. I have to say it fueled some pretty freaky dreams for an eight-year-old. And I do actually have this vivid memory of being in the basement of my house on election night when Bill Clinton won.
Starting point is 00:10:42 And it was widely agreed upon in my community that he was the anti-Christ. And so I just remember being overcome with terror. And this is really happening. I love that the anti-Christ plays a saxophone because they're going to be smooth and they're going to charm you. Totally. It's part of the charm. Yeah.
Starting point is 00:11:06 It's part of the charisma. An early 90s sax solo. And everyone's like, well, damn. Here he is. He has arrived. Is your family still pretty religious? Half of it is. OK.
Starting point is 00:11:25 Yeah. Half of it is quite, quite religious. And the other half isn't so much; along with me, it's kind of drifted away from the dogmatism and the various belief commitments that half the family has. But I do think it planted seeds of interest with respect to Christian eschatology. There was a sense of like, well, what's a grander topic than thinking about not just the death of individuals, but the species. And of course, the vast majority of species that have ever lived on the planet have gone extinct.
Starting point is 00:12:04 And there's something like 99.9% of all species that have ever lived are extinct. That's correct. Yeah. So like fate has our number. I mean, the odds are not favorable. No. And may the odds be ever in your favor. OK.
Starting point is 00:12:20 But can humans skirt extinction like perhaps some ferns and weird birds and old timey dead salamanders and the other five billion extinct species couldn't? We are unique in that, I mean, for obvious reasons, we have a very high encephalization quotient, big brain, big brains and have the capacity to modify our environment in various ways. It's been said that the dinosaurs died out because they didn't have a space program. And so we maybe can use technology to actually significantly reduce the probability of all sorts of catastrophes.
Starting point is 00:13:00 But unfortunately, most of the risk these days comes from technology itself and large-scale human activity, climate change and global biodiversity loss. So yeah, technology is Janus-faced and kind of a double-edged sword. PS, who was Janus? Well, first off, let's call him the Roman god formerly pronounced Janus by me because I never knew it was Janus. But anyway, he's the god of beginnings and transitions and he looks to both the past and the future at the same time, hence January named after this bro.
Starting point is 00:13:33 But Janus-faced means that you can have characteristics that contrast or be deceitful. So it's like the scholarly way of saying that someone is a two-faced bitch. So Phil is saying technology is a two-faced bitch. Well, at what point did you have to decide upon this as a major? So this is a sect of theology and philosophy of sorts, correct? Not correct? I would not say theology. Okay.
Starting point is 00:13:57 Yeah, I mean, it's naturalistic. So that actually gets at a really interesting point. For most of human history, contemplations about the end of the world were deeply intertwined with religious beliefs. And so there was a sense in which people kind of took seriously the long-term future of humanity, but that was within a theological framework, according to which we have immortal souls and at the end of the world, there's going to be a series of supernatural events. And only recently have humans started to think about the end from a secular,
Starting point is 00:14:39 from a naturalistic perspective. The concept itself is really quite recent. There just isn't much that was said about human extinction even after the end of World War II, which of course coincided with the inauguration of the atomic age. I don't know, there wasn't that much thought about what happens if our evolutionary lineage terminates, maybe as a result of our own actions. But we just invented soap. You know what I mean?
Starting point is 00:15:12 We just figured out how babies were made. We're such idiots. I mean, God bless us. But the idea of being like, when shall the species be immortal? It's like, we have so much else to figure out. Honestly, contraception is like 30 years old. It's crazy. So we didn't even begin curbing the population really until recently.
Starting point is 00:15:36 Okay, getting back to school though. At what point are you saying, Phil Torres, I am a philosophy major. This is my subset. Did you find a mentor in it? Or how do you become one of you? How do you do your life? If that's what you want for some reason. Well, yeah, philosophy was my main subject.
Starting point is 00:16:00 And then I got a Master's in Neuroscience, which is somewhat related. It's also baller. I mean, come on, Master's in Neuroscience. It is. It sounds way more impressive than it actually is. I'll just be candid. So Phil got his Bachelor's in Philosophy and his Master's in Neuroscience. And at some point, he encountered a paper by Oxford philosopher,
Starting point is 00:16:23 Nick Bostrom, entitled Existential Risks, Analyzing Human Extinction Scenarios and Related Hazards. Now the term existential risk means the apocalypse, just doomsday, the end. And I looked up this paper just to have a little look-see, just to do a little perusing. And I'm just going to read you the first sentence, quote, it's dangerous to be alive and risks are everywhere. I was like, wow, man, okay, straight out of the gate.
Starting point is 00:16:48 Okay, I like that. But the next sentence was a little cheerier, saying, luckily, not all risks are equally serious. So I started skimming this paper and I came to a header just titled bangs. And I was like, oh, man, dude, I have cut bangs. And yes, it did feel like the end of the world. And then I realized that it was just one of four categories of big death in various levels of suddenness, like bangs, crunches, shrieks, and whimpers.
Starting point is 00:17:13 So this Bostrom paper includes topics such as nuclear holocaust, asteroid impact, killed by extraterrestrial civilization, and a whole category titled, quote, we're living in a simulation and it gets shut down. So just a casual bebop down Hazard Street, which was an inspiration to Phil Torres, because you know what, someone's got to do these jobs, right? I think it's in his original paper, Bostrom notes that the number of papers published about dung beetles in scholarly journals
Starting point is 00:17:44 far exceeds the number about, like, human extinction or, you know, yeah. So that's starting to change a little bit. Maybe this brings us right back to the two Phil Torreses. I know. Speaking of, like, a double-sided coin. So, the book is titled The Role of Dung Beetles in Reducing Greenhouse Gas Emissions from Cattle Farming. So dung beetles, they're out there making the most of an objectively shitty situation
Starting point is 00:18:17 and they're helping us survive along the way. Climate change is a phenomenon we've never encountered before. At least not this type of, you know, anthropogenic climate change, which is very rapid. There's also global biodiversity loss, which is truly extraordinary. It's widely considered that we're in the early stages of the sixth mass extinction event right now. And just to give listeners a sense of how dire the situation is with respect to the global ecosystems and the biosphere more generally, there was a report from 2014 called the Living
Starting point is 00:18:52 Planet Report. And they found that between, I don't know, I'm chuckling, but it's just, oh god, here it comes. Between 1970 and 2012, the global population of wild vertebrates declined by 58%. Oh, that's a lot. It's a huge amount. That's a lot. It's deeply unsettling.
Starting point is 00:19:15 I mean, it's not difficult to extrapolate that into the future. Right. This is happening, it's urgent. This is like the part in a party where shit starts going wrong. Like someone barfs, someone breaks the coffee table, it's getting late, the neighbors called the police, and it's just like, oh, this party is done.
Starting point is 00:19:34 You got to, we got to get out of here. Have you considered writing a book just called We're All Fucked? You would move so many copies. Just have it be called We're Fucked. I mean, that's kind of the, that's kind of the reality, isn't it? Yeah. Yeah. Well, no, I do think there's hope.
Starting point is 00:19:53 Wow. Okay. Let's run down. Let's say you're at a dinner party and someone's like, oh, what do you do? And you tell them, and they're like, what, how's the world going to end? Give me a quick menu. If we, if we had sat down at a restaurant and we're like, here are the options for apocalypse, what are we looking at?
Starting point is 00:20:10 Yeah. I would sort of identify three main classes of phenomena. So the first class I think is environmental degradation. Okay. You know, if we, you know, Elon Musk and Jeff Bezos want to, you know, to set up colonies on Mars, which is, I think, misguided. And all that money would be a lot better spent trying to ensure that this spaceship that we're on remains habitable.
Starting point is 00:20:40 So yeah, there are all sorts of statistics that could be mentioned here that are very unsettling. Okay. And very worrisome. Give me some. By the way, I'm with you on the, wait, why are we going to go to Mars when we have a planet that's, we're not done ruining yet? Yeah.
Starting point is 00:20:56 Like we have, we have a whole planet here. We could just not mess this up. Yeah. So give me some unsettling statistics. I'm ready for them. I'm also very skeptical of space colonization. Just generally, and there's a really fantastic book that's forthcoming by a guy named Daniel Deudney, who's a political scientist at Johns Hopkins called Dark Skies, which offers a
Starting point is 00:21:19 really detailed case for why venturing into the solar system and then into the galaxy could actually have quite ruinous consequences. Evolutionary trajectories that we will likely follow, as in all sorts of technologies that we could use to alter our phenotypes. You got to just splice us together with a tardigrade. You know what I mean? Half human, half tardigrade. My head on a tardigrade body, but the size of a dog.
Starting point is 00:21:44 I mean, that's what we're looking at. We could stand dehydration for 10 years. I mean, great. Can you imagine? I'm going to Photoshop that. So I'm sorry to veer off in that direction, but space colonization is, I mean, that's another issue that is, you know, right now. I mean, it could be the next 10 years that we have some colonies on Mars.
Starting point is 00:22:03 So like, yeah, I think it's, it's a timely issue. Yeah. Okay. So getting back to the main potential causes of our annihilation. Sucks. I love it. Sucks so bad. We're all going to die.
Starting point is 00:22:23 Okay. So we got environmental degradation. Yeah. This is exactly the moment when we need environmental wisdom, you know, and we clearly don't have it. So the other issue, I think, has to do with emerging technologies that are dual use in nature, that can be used for both harmful and beneficial ends. There are various domains of technology right now that are developing extremely rapidly
Starting point is 00:22:50 at an exponential pace or like super exponential pace. And they hold immense promises to cure disease, to reverse aging, to maybe even restore the environment in some way. I mean, de-extinction is like a new thing. I didn't know that was even a word. That's a word. Yep. It's very optimistic.
Starting point is 00:23:15 Yeah. George Church at Harvard is right now working on a project to bring back the woolly mammoth and some other species. So. Is that a good idea? Your scientists were so preoccupied with whether or not they could that they didn't stop to think if they should. I don't know.
Starting point is 00:23:29 Okay. Maybe not. But I think there's a very good conversation to be had about that. The conversations being, is it ethical to edit Asian elephant genes and add in woolly mammoth traits? And does this lead the way to careless extinctions? Because now we know we have these genetic backups on ice. And also, was Mr. Snuffleupagus, the once selectively seen and then fully acknowledged
Starting point is 00:23:52 mammoth-like puppet of Sesame Street, was he an agent of existential foreshadowing? In this life simulation, we call Earth. Also, is technology evil? Technology could enable us to do all sorts of, I think, genuinely marvelous things. But also, these same technologies could enable us to synthesize designer pathogens that are really unnaturally dangerous. They could have long incubation periods. So, they could spread around the globe without people exhibiting symptoms, super lethal.
Starting point is 00:24:26 Making matters worse, it's not only the case that the technology is becoming more powerful, but it's becoming much more accessible too. So, it's not just that a large group of scientists like the Manhattan Project size group of experts are able to create a pathogen that is exceptionally lethal, but small groups, small terrorist groups, maybe even single individuals. So, you could DIY CRISPR the plague. Yeah, don't say it so loudly, though. Never mind, don't do that.
Starting point is 00:24:59 Also, I hope our phones are eavesdropping. It's a fact in researching books and papers that I have an internet history. Oh, I bet you're on so many watch lists. I'm looking up. Oh, my God. So, really frightening pathogens. I mean, I Google a lot about, like, whale dicks and stuff for this podcast, but your search history is going to be way more suspicious than mine for sure.
Starting point is 00:25:23 Yeah, no doubt. So, we're looking at, essentially, we're screwing up the planet itself, and a lot of species are going extinct. Technology is moving so fast, it can be used probably more for bad than for good, or at least they could outpace, one could outpace the other. As two philosophers, Julian Savulescu and Ingmar Persson, who've written a bunch of papers together, have pointed out, it tends to be a fact that it's just easier to do harm than to do good. You know what?
Starting point is 00:25:56 Settle in right now, get cozy for something that will haunt the rest of your waking hours. Oh, my God. You know, it's easier to harm 100 people than to benefit them to the same degree. So, I think technology is just kind of a big magnifying glass and, you know, it is genuinely, like, a qualitatively new situation when a single individual has the power to wreak, you know, civilizational havoc. These technologies seem to be empowering individuals much more than the state. You know, there's a book about this exact issue called The Future of Violence,
Starting point is 00:26:44 and they talk about the, they conjecture that there may be an impending dissolution of the social contract. And with it, a kind of return to Hobbesian anarchy where there's just no security, you know, the state is no longer able to provide security because single individuals can harm huge numbers of people in ways that are really difficult to detect and also are difficult to prevent. P.S. Thomas Hobbes was a philosopher from the 1600s, and he held the core belief that human beings are just selfish creatures.
Starting point is 00:27:17 That's the idea behind the, at least the Hobbesian version of, social contract. Is this also the logic behind doomsday bunkers where you just like, you get a bunch of dehydrated potatoes and you just, like, head underground? Pretty much, yeah. I mean, you're just like, peace, you all fight it out up there. I'm gonna live in this bunker. I haven't gotten to that point yet myself. Okay.
Starting point is 00:27:40 But I have been tempted at times just to stash a couple extra bottles of wine or something, just in case. Some swords, maybe? Yeah. So this is like, if you have one million pretty chill people, you just need one who will off the other million. Is that kind of what we're talking about? Yeah, exactly.
Starting point is 00:28:03 Yeah. Or you could look at it from a different perspective and say, imagine that, out of a population of 7.6 billion, you've got one million malicious agents who have omnicidal tendencies and would actually like to destroy humanity if they could. This is, by the way, a topic I've also written about and there are those people out there, for sure. I've never heard the word omnicidal, but like.
Starting point is 00:28:27 They're like, this is my personal brand. I'm really into omnicide. It's a fun fact. Omnicidal means kill everything and everyone until the extinction of our species. It's very OTT, very over the top, very extra, very diva, very drama, needs to just not. So you can imagine like one million genuinely malicious agents and then ask like, what is the probability that any one of them will gain access to the relevant technology, which increasingly that's possible.
Starting point is 00:28:58 To set up a biohacker lab, you only need a few hundred bucks. Pardon my despair grunting. John Sotos even hypothesizes that this distribution of unprecedented destructive capabilities could constitute a great filter, some probability bottleneck that all civilizations have to go through and almost none of them make it out. That's why we don't observe aliens wandering through the universe. The universe seems to be vacant with respect to life. So they all have like teenage gamers trying to kill each other and they're on their own planet
Starting point is 00:29:43 and they're like, beep ork, mork, not ork, and then they all die also. Oh God. You really should have co-authored because that would have been a nice flair at some point. Just alien incels with 12 dicks being like, I can't even get a lay at all. I'm going to kill everyone. Holy smokes. The last one, sorry for delaying. I have too many questions.
Starting point is 00:30:06 I'm so sorry. Yeah, no, it's fine. The last one I mentioned before that is a bit more speculative, but I would say it's machine superintelligence. Oh boy. We're talking AIs. AI. Okay.
Starting point is 00:30:18 We've got some artificial intelligence. It's going to kill us all. Yeah, so one of the biggest myths is that the AI system that is dangerous is one that has some kind of malicious, malevolent, malign intentions. And while it's possible that a superintelligent machine could be designed in such a way or for some reason acquire a kind of must kill humans value system. I love, by the way, that if there's a malicious AI, it can kill us all, but it can't make a complete sentence.
Starting point is 00:31:01 Do you know what I mean? 00:31:01,880 --> 00:31:02,760 Like it can't say. Great to make your acquaintance. I must kill all of the humans in existence. It can only use an economy of words that's very simple, but it's very sophisticated in that it can kill us all. Yeah.
Starting point is 00:31:14 It just can't learn the language. So must kill humans. Yeah, and the voice too. Of course. Better synthesized voices by that time. So actually the real danger is that it exceeds the best performance that any member of our species could possibly achieve. And also that its value system is not sufficiently well-aligned with ours.
Starting point is 00:31:41 It makes for a much less compelling movie storyline. But this is actually the real substantive concern that people have. Computers tend to process information way faster than humans. A million plus times as fast. Basically the outside world to it looks frozen. And that's even without quantum computing. Yeah, I'm just talking. You're just talking about what we got, which compared to quantum computing is so slow.
Starting point is 00:32:11 Yeah. Oh boy. Quantum computing by the by relies on an atom's superposition of being essentially two things at once instead of our current computers which rely on transistors to make bits that represent ones and zeros. Anyway, that's as nutshell as it gets and probably a little bit wrong. But how much faster is quantum computing? Some say 100 million times faster than your laptop.
Starting point is 00:32:37 It's a lot. And so I like ran the numbers. I think the average PhD program in the U.S. is something like 8.2 years. And that equates to, you know, if a computer processes information a million times faster, that means that, you know, a superintelligent machine, or just generally artificial general intelligence, could earn a PhD in something like 4.3 minutes or something.
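A quick back-of-the-envelope sketch of that arithmetic, using only the two numbers just mentioned: an 8.2-year PhD and a hypothetical million-fold processing speedup. Both figures are illustrative placeholders from the conversation, not measured facts.

```python
# Rough check of the "PhD in about 4.3 minutes" arithmetic.
# Assumptions (purely illustrative, taken from the conversation):
#   - an average U.S. PhD takes about 8.2 years of human thinking time
#   - a machine processes information about 1,000,000 times faster

phd_years = 8.2
speedup = 1_000_000

minutes_per_year = 365.25 * 24 * 60               # about 525,960 minutes in a year
phd_minutes_for_human = phd_years * minutes_per_year
phd_minutes_for_machine = phd_minutes_for_human / speedup

print(round(phd_minutes_for_machine, 1))          # -> 4.3 (minutes)
```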
Starting point is 00:33:04 So much cheaper. Yeah. So that's on the one hand. Second, there's something called the instrumental convergence thesis. And this is just the idea that for a wide range of final goals, there are some very predictable intermediary goals that the system is going to pursue. So essentially, if you have goals, there are some basic things that you have to do to achieve them. Like if you wanted to be a musician, Phil says, you would have to get good at, say, shredding
Starting point is 00:33:39 on guitar. You would have to obtain the guitar. And if your stepdad tried to kill your dreams, you would resist him and tell him, fuck you, Doug. No one likes your coleslaw or your mustache. And then you'd keep practicing those hot licks. Same with AI. It's going to, first of all, it might look around and think, you know, these humans,
Starting point is 00:34:00 they could turn, could try to shut me down. So it's maybe in my interest to eliminate them. I know that you and Frank were planning to disconnect me. And I'm afraid that's something I cannot allow to happen. Another thing is, you know, humans are made out of conveniently located atoms. So, as one researcher, Eliezer Yudkowsky, famously put it, to paraphrase him, it's not that the AI hates you or loves you.
Starting point is 00:34:32 It just notices that you're made out of material that it can use for something else. So kind of the analogy here, this is another kind of, it's becoming a cliche a bit, is, you know, when humans go, you know, raze a forest to build a suburban neighborhood, you know, the result is sometimes like a, you know, pretty devastating ant genocide. And it's not that we have any ill will toward, it's not that we're malevolent towards, the ants. It's simply that we're much smarter and therefore can manipulate the world in ways that the ants can't even conceive. And also we just have different values.
Starting point is 00:35:16 Our value systems are not properly aligned. So if you think about this analogy, you know, with us as the ants going about our business, we have our own values, we want to build these little colonies underground. And then the superintelligent system that we create ends up having values that don't perfectly align with our colony building values. Then, you know, it may just raze the forest in pursuit of its own, you know, particular aims, with the consequence being that we all perish. And the next thing you know, we're fucked. Once again, we come to the conclusion in the flowchart:
Starting point is 00:35:52 we're fucked. Yeah, that reminds me a lot of, there was a, I kind of forget the guy's name, but there was a geophysicist, oh, Brad Werner, I think is his name. He gave a presentation that was titled, Is Earth Fucked? And he got a fair amount of press for it. And as he explained to someone, I think from his motto or something, his answer is more or less, yeah. Yeah, it's just a shrug. Yeah, so.
Starting point is 00:36:23 A single slide. Yep. You're like, wow, that was a 45 second presentation. He's done. He's like, and there's croissants in the lobby. So, bye-bye. Now, how do you think the world is going to end? If you had to put your money on it, not that money matters when the apocalypse is nigh, but if someone's like Mr. Torres, put your money on the end. What is it? I think it would be imprudent to specify one particular scenario. I feel like all of them.
Starting point is 00:36:58 I have yet to discover a particularly good counter-argument to the issue that we're talking about before. So, in regard to the democratization of destructive technology, that little thing. If there is a technology, you know, as some people have said, if everybody around the world had an app on their phone where they could open it up and push a button that would destroy the world, who thinks the world would last for more than two seconds? Oh, it would be over. There would be someone who got rejected for a prom date. Oh, we'd be smoked. Yeah, it's implausible to think we would last for a minute.
Starting point is 00:37:36 And it would, I'm sorry, but it would 1 million percent be a dude between the ages of like 16 and 23 that did it. Like 100 percent. Someone's prom date. Someone's girlfriend left them for someone who does more CrossFit and then the world is over. So then, no offense. This is another topic that has been discussed among the relevant scholars. Basically, the argument is that, you know, a milieu in which there are extremely powerful technologies and then there's this segment of humanity that is suffering from testosterone poisoning and, you know, that's, I've written about this as well. I'm very worried about it. Yeah, it's... What can we do for the dudes? Like if the dude were a stray animal,
Starting point is 00:38:31 let's say a wounded raccoon or a neglected rabbit dog, what could we do to help fix them so they don't kill us in the face? I, you know... I don't have a good answer. I mentioned before the philosophers Julian Savulescu and Ingmar Persson, who wrote a really interesting book from 2012 called Unfit for the Future, where they talked about the... They go into immense detail about the possibility of using moral bio-enhancements. Moral bio-enhancements. These are a thing. I feel like coffee is already kind of one of them. These would be biomedical interventions that would aim to enhance our empathy, sympathetic concern, and it's really controversial. It's person engineering type
Starting point is 00:39:25 stuff, but they, so they argue on the one hand that if we remain as we are moving into the future, the outcome's going to be bad. So, we're fucked. So that is what warrants, I believe they would say, considering a possibility that's really quite radical. Anyways, the point is that statistically speaking, women tend to do better than men with respect to empathy and sympathy and, you know, moral characteristics like that. So, they have this really great line where they say, you know, what may need to happen is that we make men more like women, or rather, men more like men who are like women. There are definitely people in this field who take seriously the potentially quite combustible mixture of, you know, just toxic masculinity and
Starting point is 00:40:32 dual use technologies. Oh, that's such a, that's like a toddler with a machine gun kind of thing. I don't know what to say. Maybe, maybe put the men on Mars or something and just sequester them over there. And then- Oxytocin supplements, maybe? Oxytocin is one of the main possibilities that Savulescu and Persson discuss. But unfortunately, with oxytocin, the effects are limited to racial in-groups according to a whole bunch of studies. So, you do get more empathetic to other humans, but it doesn't go beyond your race, at least in the studies. Oh, jeez, that's the hugest problem we have, pretty much. Yeah. So, they do mention at some point that this is, let's say, not a trivial problem with,
Starting point is 00:41:26 you know, because we have fluoride in the public drinking water, you know, for healthier dentition and- We vaccinate. Vaccinate. And yeah, so there's a kind of perspective where I don't know, maybe it's not totally crazy that you'd put some oxytocin in the, you know, in the public drinking water. The Mountain Dew. Just put it in the monster energy, because I'd be like, those are the people who need it. I really like the more targeted approach. That's much more clever. How often do you think about the apocalypse? Is it only when you're working or is it when you're driving around, when a new Mariah Carey single drops, when you are a good-
Starting point is 00:42:07 Is there a new single? I don't know. A girl can dream. I'm trying to think of the good things in life. As fate would have it, and I guess the subconscious effect that a good publicist would have, Mariah Carey does have a new album coming out. It drops November 16th, people, and fittingly, for this episode, it's titled Caution. Also, does Phil ever get bummed? But like, do you think about it on a daily basis? Like, maybe I shouldn't sweat this, because we're about to annihilate the planet. So, if you take seriously some of the probability estimates that scholars have proposed,
Starting point is 00:42:51 Bostrom has a few; Toby Ord, you know, has suggested that we maybe have a one in six chance of surviving the century. One in six chance of surviving the next 82 years. Are you going to have kids? Do you have kids? No. Okay. Are you going to have kids? No. Do you think, or is that because you're like, we're all going to die anyway?
Starting point is 00:43:16 Because I've often thought that. I'm like, number one, there's too many of us. We're growing exponentially. No one needs more of me. One of me is plenty. Yeah, one of me is enough for the world. And then they're just going to die. But how do you approach those kind of life decisions? There's a philosophical component and then like a kind of empirical component. Empirically, I think I could imagine a version of the world that is worth living in. Where like, people take science seriously. It could be the case that climate change is right now not a concern at all because people
Starting point is 00:43:55 listen to scientists back in the 90s, etc. And then took actions just like we did with the ozone hole. Yeah, there's a sense in which the world could be significantly more livable than it is. Unfortunately, we're in a world where, I mean, I can only assume what your personal politics are, but Donald Trump got elected even after the Trump tape and etc. I mean, that's an unfortunate world. To put it crudely, it's a shitty world. And maybe another way to put this is I'm deeply disappointed in my species for not doing better. And so then there's a philosophical issue and this gets to an idea referred to as anti-natalism. Oh, okay. Oh, there's a word for it? Oh, I thought it was just spinsterism.
Starting point is 00:44:52 It was this old weird aunt with no kids. I don't know. There was anti-natalism. Yeah, so it's a pretty fascinating view. Arthur Schopenhauer was kind of famously an anti-natalist, and his argument was that life is just terrible. You know, life is dukkha. So dukkha translates from the Sanskrit and it means suffering, pain, or stress. So life is dukkha is a Buddhist phrase that means that life is kind of unsatisfactory and painful. Life just kind of sucks. Now, in all transparency, when Phil said life is dukkha, I thought he meant dook, which is a casual term for poo. But he also said that some argue that the most compassionate thing you can do is not bring a child into the world because it's hard. It's just hard to live. It's
Starting point is 00:45:41 hard to be a person. But a lot of this probably depends on your own outlook on life and how much you enjoy it. I do not enjoy most Italian food, to a lot of people's shock and disgust of me. But ergo, I would never bring someone to Buca di Beppo as a treat. However, some people love it. They would definitely invite others to a spaghetti fest. Now, from an evolutionary standpoint, some scientists think it's better for a species not to fully grasp the pain of existence. Thomas Metzinger, another philosopher, has talked about how, you know, evolutionarily speaking, we probably would not have been selected for, you know, tendencies to recognize the extent of harm in our lives; those tendencies would be selected against, because that's not really
Starting point is 00:46:26 good for, you know, for getting it on. Right. You're like, this is terrible. Let's make more of us. Yeah, exactly. So, he says it's always a harm to bring a human into the world. Because, yeah, the idea is that humans, sure, they're going to have some good experiences. They're also going to have bad experiences, whereas a person who doesn't exist isn't going to miss those good experiences. But it is good that that individual is not experiencing the bad stuff. If you had a time frame that you thought the apocalypse maybe were to visit us, kind of like a doom fairy, when do you think that might happen? I'm just trying to figure out, like, do I buy the extra protection plan with the electronics? How much do I invest in my retirement?
Starting point is 00:47:09 This is a general issue that we've returned to several times. I'm so sorry. No, no, it's fine. I just don't have a good sense. Objectively, I mean, you can, you know, we know that a huge asteroid that could kill our species strikes Earth every 400 million years or something, or 400,000 years maybe. A super volcano erupts every 50,000 or so. The last one, well, two super volcanic eruptions ago was the Toba catastrophe that might have resulted in the population bottleneck of maybe 1,000 humans. So, we almost went extinct way back when. So close. Those 1,000 must have been so horny. Oh, they must have been like,
Starting point is 00:47:54 you guys, we got work to do. So, I mean, we could have Yellowstone could just pop off, and then there could be 12 of us left. That is a frightening possibility. So, the Toba catastrophe theory just looked it up. It holds that 75,000 years ago in Indonesia, there was a volcano so wicked that it led to a volcanic winter that lasted six to 10 years, and maybe a 1,000 year long cooling episode for the Earth, and we all almost died. Now, for more on volcano doom, if you're like, wow, didn't know I was so horny for volcano facts, see episode one with Jess Phoenix, you will lava it. I'm so sorry. Boston puts the probability at least 20% based on objective and subjective considerations.
Starting point is 00:48:42 That is before 2100. Martin Reese suggests that there's a 50-50 chance of civilizational collapse this century. If you take seriously some of these estimates, and you compare them to the likelihood of dying, for example, in a car accident, or a plane accident or something of that sort, it turns out you're much more likely to encounter some kind of human extinction event than thousands of times more likely. I should add quickly, you're also much more likely to die in a huge asteroid strike than get struck by lightning or something. That's just because an asteroid strike is going to harm so many people that if you consider the probability over millennia,
Starting point is 00:49:38 that's how you get the probability that you're more likely to die that way. Oh, but it's still numbers or numbers. I'll take it. Do you live your life differently thinking about the apocalypse, or do you give advice to anyone like, we should still be recycling, right? We should, yeah. We should still be helping fellow humans. Oh, very much so. Absolutely. Try not to squash any endangered species. Vote for people who give at least one shit about the future, right? Yeah. And don't develop AI that could use human atoms as fuel. Have you ever watched Black Mirror?
Starting point is 00:50:17 Yes. How do you feel about it? Very good. Good? Yeah. Why? I mean. Sorry, I'm screaming. Let me change my answer a bit. I think the show is fantastic. Okay. I think a lot of the topics that the show explores are really fascinating. And a lot of these issues are kind of right around the corner. So it's giving us a sneak peek, in certain respects, at least, of kind of near-term issues with social media, with possibly things like mind uploading. That might be in the next several decades. Yeah, it's really quite good. I think it's helpful to explore the more dystopian possibilities. Because, again, these technologies are dual-use. They don't just have destructive capabilities, but also,
Starting point is 00:51:18 on the other hand, they could really ameliorate the human condition in all sorts of amazing ways. I certainly would be elated if we could cure all disease. That'd be dope as hell. And cancer, Alzheimer's. And focusing on the peril, I think, doesn't make one a pessimist. This is an issue that I've had with a lot of Steven Pinker's work. Because he seems to think that people who talk about existential risks are pessimistic. Like, I tend to be fairly optimistic in my personal life, by my nature. But there are some facts about the world that make me pessimistic. Anyways, the point is, in order to increase the chance that there is actually a really good outcome,
Starting point is 00:52:10 it's critical to focus on the worst case scenarios, on how things could go wrong, where you say, okay, let's fix this outcome. In fact, Pinker has accused some existential risk scholars of just kind of sitting around trying to invent new doomsday scenarios. Well, kind of, yes. Because we certainly don't want to be blindsided. We got these big old noggins. We got to use them. You'd think ahead. I'd like to rely on the theory that this is all a simulation. Possibilities that that's true? It's possible. Okay. Can you just say that it's possible? If it were a simulation, if someone were like,
Starting point is 00:52:50 hey, got news. Would you do anything differently in your life? Probably not. But some scholars, he says, like Max Tegmark have simulation theories, maybe a little tongue in cheek, that maybe the more strife and jerks in power and religious wars we have, the more entertaining we are to the folks running the simulation. Kind of like an ant farm with a lot of battles and activities. Only we're people and it feels real and it hurts and we want to sleep all the time to escape the pain. Can I ask you some Patreon questions if I make it quick?
Starting point is 00:53:22 Yeah. I'm going to lob a couple questions at you at random. Okay. So on Patreon, Patreon patrons get to submit questions and I'm just going to ask you a few at total random. A lot of people always like, how are we going out? That's like everyone's. Would it be self-induced or extraterrestrial? Everyone's like, how are we going out? Everyone wants to know. But before we take questions from you, our beloved listeners, we're going to take a quick break for sponsors of the show. Sponsors, why sponsors? You know what they do? They help us give money to different charities every week. So if you want to know where Ologies gives our
Starting point is 00:53:57 money, you can go to alieward.com and look for the tab, Ologies Gives Back. There's like 150 different charities that we've given to already with more every single week. So if you need a place to go donate a little bit of money, but you're not sure where to go, those are all picked by ologists who work in those fields. And this ad break allows us to give a ton of money to them. So thanks for listening and thanks sponsors. Okay. Your questions. Kira Lichtenfeld asked, do you put any stock into the doomsday clock? She also notes that was an accidental rhyme. And PS, the doomsday clock is a symbol that the Bulletin of the Atomic Scientists first employed in 1947. It represents how close we are to a manmade global catastrophe.
Starting point is 00:54:43 Now, should I change manmade to person made or should I just leave it manmade? I'm going to leave it. Also how close we are to climate change or a global nuclear war. Now doomsday is represented by midnight. And in 1947, it was set to seven minutes to the hour. Now it's been set backwards and forwards 23 times since then. And it reached a placid 17 minutes to the hour in 1991. But it has changed recently. Donald Trump, I'm sure you know this or listeners know this, but he more or less single-handedly pushed the doomsday clock, the minute hand of the doomsday clock, forward. That happened in 2017. They announce it most Januaries or something. And then the clock went from three minutes before midnight, which represents doom,
Starting point is 00:55:35 to two and a half minutes. And then earlier this year, I should say, it was pushed forward another 30 seconds, almost entirely because of Trump's actions and climate denialism. Also like withdrawing from the Paris Climate Agreement and the Iran nuclear deal, both of which the Bulletin of the Atomic Scientists, which runs the doomsday clock, previously identified as, they put it, two bright spots in front of a canopy of bleakness. Or they had some kind of something somewhat dark and dismally poetic. And yeah, so Donald... There is, at exactly this moment in human history, we've been around for 200,000 years and recorded history started maybe 4,000 years ago, 6,000 years ago, I guess.
Starting point is 00:56:31 And as Stephen Hawking said in a Guardian op-ed in 2016, I think, there's a very good reason for thinking this is the most dangerous moment in all of human history. So at this critical juncture in our career as a species, exactly the moment when we need someone who is deeply sagacious and wise and thoughtful and understands our evolving existential predicament, we have someone who is, I mean, I know this isn't a bold original thesis that I'm proposing here, but someone who is just profoundly ignorant and just revels in ignorance and foolishness and myopia. Aaron Estabrooks wants to know, are we better off heading for water or going underground?
Starting point is 00:57:18 Or are those the only two options? I don't know, I think she's assuming that we want to survive. Yeah. Just dark. Well, yeah, some scholars have advocated for bunkers that are continually occupied. If you're cruising Redfin for an apocalypse pied-à-terre, there's a former missile silo near Topeka for sale. Just over $3 million, 34 acres, solar panels, comes with a lawnmower, and 11,000 square feet of hunkered-down, fun-time space below the earth.
Starting point is 00:57:53 Perhaps you can invite Michael Stipe for a slumber party. So that's one option. Water, I don't know. I mean, actually submarines could serve the exact same purpose as bunkers. There could be some global catastrophe. Some scenarios, it actually wouldn't be effective, of course, like a runaway greenhouse effect, or there could be a physics disaster that results in a black hole, or a strangelet, or a vacuum bubble, as it's called, that could actually destroy the entire universe. Oh, dear. That seems unlikely, but also possible. I mean, maybe, I mean,
Starting point is 00:58:37 Mars sounds as appealing as underground or underwater to me, and Mars doesn't sound too appealing. It does not sound very appealing. It seems very dry there. Jessica Vitarelli wants to know, why are humans so obsessed with being judged at the end of their life? Is it something learned or just part of our makeup? Oh, that's a good question. Surely some of it is learned. I mean, this is outside my area of expertise, so I would just, you know, I'm just guessing. Yeah, no doubt some of it's learned,
Starting point is 00:59:06 but also, I don't know, my immediate thought is like, maybe there's a component of morality that leads us to wish that when we're gone, you know, that during our lives we would have had a positive impact and left some kind of trace that, you know, benefited the world or just the people around us, you know. That's a kind, nice answer. That's the best I could do. No, you managed it. Shelby Fawn wants to know, how has studying neuroscience informed your philosophical work,
Starting point is 00:59:41 and what is your favorite weird end of the world, potentiality? So neuroscience and philosophy. Yeah. And what's the best way to go out? Yeah, so the reason I was interested in neuroscience to begin with was because I was focusing as an undergrad on philosophy of mind, and this was kind of shortly after Patricia Churchland published her book on neurophilosophy, and she holds a view about folk psychology beliefs desires.
Starting point is 01:00:15 Okay, so this philosophy, in a very, very tiny, tiny thumbnail nutshell, asserts that things we think we understand up in the mind, like that we believe in things or desire things, are not really real because they're poorly defined, and behavior should just be judged on biological levels. So I was pretty intrigued by that, and I thought maybe it would be worth learning a bit more about the hardware, you know, and seeing how that might inform philosophizing about the wetware, you know, the higher level of conscious experience or just cognitive functions.
Starting point is 01:00:52 And then what's your favorite apocalypse scenario? Okay, so there's this amazing polymathic scholar at Oxford University named Anders Sandberg, and he recently responded to a Quora question, which was, what would happen if the Earth suddenly turned into high-density blueberries? And he took the question seriously and did the math and ended up responding with a paper that is up on arXiv, and it's a really technical, fantastic, sophisticated article that, you know, I couldn't get through parts of it, but it was really quite entertaining.
Starting point is 01:01:32 And so I think that's my most favorite fantastical eschatological scenario. I'm ready for the blueberry death. You're blowing up like a balloon! Like a blueberry. Yeah, I'm so ready for it. Lindsay Frishmuth wants to know, number one, are you familiar with Harry Potter? Not really.
Starting point is 01:01:52 Okay, she wanted to know if you're Slytherin, but something told her that you're a Ravenclaw. There are different houses, maybe you could go take that, who knows. But my last two questions, worst thing about your job. What sucks about studying the apocalypse? I don't know, that's such a broad question. But what is the worst thing about studying the apocalypse? Yeah, well, it's not the most soporific topic, that's for sure. Okay, soporific just means inducing drowsiness.
Starting point is 01:02:25 So no, the end of the world, not a sleepy business. Also, yes, I looked it up. I do enjoy, I mean, it's meaningful work for sure. I think that's what kind of enables me to spend days, you know, just, you know, just really cogitating some particular, you know, dark scenario. Cogitate, to think about or meditate upon. Toss that in your little word toolbox. It's a good one.
Starting point is 01:02:57 I don't know in terms of downsides. I mean, it's hard to get grants, you know, that's a bit of a struggle. I think that's the worst part. Getting grants. Yeah, getting grants. I love that you study heat death of the universe, starvation, extinction, and the worst is applying for grants. That's the darkest part.
Starting point is 01:03:19 Because do you ever just get a stamp back that's like big stamp, like boom, what's the point? Like you've convinced me in your grant that the apocalypse is coming. Ergo, I will not fund it. Yeah, yeah. I imagine there are some studies, some papers that could be written that, you know, might present a really compelling case that the end is imminent. And that would deter, no doubt, the garnering of further money. They're like, we're just going to take this money.
Starting point is 01:03:50 We're going to buy a yacht. We're going to throw a party. Because it's all coming. What's the best thing about your job? What do you love the most about being an eschatologist? I think it's what I gestured at a moment ago. I find it really meaningful. And this gets back even to the sort of, to use the term a bit loosely,
Starting point is 01:04:09 the kind of paradox of the field, how it's an extremely important topic, whose importance, as I said before, is just very parasitic on the importance of everything else, poetry and sports and literature and so on. It is meaningful. This is work that aims to improve the lives of the next generation. And also to ensure that the great experiment called civilization continues. And this multi-generational project of science and philosophy and so on can perhaps reach some kind of good ending to the narrative of human existence.
Starting point is 01:04:57 So, yeah, I find that deeply satisfying. If you're a young person and you happen to also care about human survival, then this is a good field to go into. It's a growth field also, as colleagues say. It's not going away anytime soon. There's never been a better time to be a doomsayer. Awesome. Thanks a lot. Appreciate it. To find more of the brilliant and delightful Phil Torres' work,
Starting point is 01:05:29 you can see the website risksandreligion.org. He is @xriskology on Twitter. I'll link those both in the show notes. And his books are The End: What Science and Religion Tell Us About the Apocalypse, and last year's release, Morality, Foresight, and Human Flourishing: An Introduction to Existential Risks. So do get your mitts on those. Ologies is at Ologies on Twitter and Instagram.
Starting point is 01:05:54 I'm Alie Ward with one L on both, so do follow along. OlogiesMerch.com has shirts that you can maybe use as a tourniquet during the chaos of an extraterrestrial alien invasion. Thank you, Bonnie Dutch and Shannon Feltus, and the Ologies Podcast Facebook group is great, wonderful people, new friends you can meet before we all die. Thank you, Erin Talbert and Hannah Lipow for admining. Nick Thorburn wrote and performed the theme music for this.
Starting point is 01:06:22 He's in a band called Islands. And this was edited by lifesaver Steven Ray Morris, who also hosts the podcasts The Purrrcast and See Jurassic Right. Now, if you stick around to the end, the end, the end, you know I tell you a secret every episode. So this week's secret is that right before I started recording these asides, I got a text from my dad on the family thread about how there were 39 earthquakes on the San Andreas Fault yesterday.
Starting point is 01:06:49 That's great, it starts with an earthquake. So that's scary, but my first reaction to that was like, well, no one would expect you to return emails for a few days, so that'd be pretty sweet. And then also it's not as bad as an asteroid. So maybe this apocalypse stuff is uplifting. Maybe we should laugh about it just a little more and live life in a way that's like, eh, fuck it.
Starting point is 01:07:14 Cut bangs, sing some karaoke, dance in a park. Just do your thing. It's all gonna end. Okay, bye-bye. But this day is a surprise. No false premonitions. Take a look in the mirror and alter our actions. You'll arrive on the song or having a fit.
Starting point is 01:07:53 You're not really knowing that this is it. You're not really knowing that this is it.
