Cognitive Dissonance - Episode 630: The Skeptics' Guide to the Universe and the Future

Episode Date: May 23, 2022

Thank you to our guest Steven Novella for joining us. Find out more about his new book The Skeptics' Guide to the Future at ... Follow him on Twitter ...

Transcript
Starting point is 00:00:00 This episode of Cognitive Dissonance is brought to you by our patrons. You fucking rock. Be advised that this show is not for children, the faint of heart, or the easily offended. The explicit tag is there for a reason. Recording live from Glory Hole Studios in Chicago and beyond, this is Cognitive Dissonance. Every episode we blast anyone who gets in our way. We bring critical thinking, skepticism, and irreverence to any topic that makes the news, makes it big, or makes us mad. It's skeptical, it's political, and there is no welcome mat. This is episode 630.
Starting point is 00:01:06 And Cecil, we are joined by a special guest. Special guest. A special guest. A kind of an inspiration. Yeah, absolutely. If we're being genuine. Absolutely. Right, which we sometimes are and mostly aren't.
Starting point is 00:01:17 But today, we are. And really an inspiration. We are joined by Dr. Stephen Novella of Skeptic's Guide to the Universe. Dr. Novella, thank you so much for being here today. Thank you, guys. Pleasure to be here. We want to start to talk a little bit about this brand new book. Can you tell us why your team decided to put this book together?
Starting point is 00:01:37 Talking about the second book now, the Skeptic's Guide to the Future. Yeah, the future. Yeah. Yeah, yeah. So that one, yeah, that's coming out September 27th. You can pre-order it now, but but it hasn't yeah yeah fully come out so um you know we're techno nerds we love science fiction though thinking about the future this has been a lifelong fascination of ours uh you know especially you know bob jay and i so um you know we definitely want to continue
Starting point is 00:02:02 writing books you're you have a lot of ideas about like what the next book in the lineup would be. And, um, you know, I came up with the idea for this book and I knew that this had to be our next book. It was just, it was a no brainer. I pitched it to Bob and Jay. They're like, yeah, absolutely. This would be, it was a ton of fun to write, a ton of fun to research, you know, just whenever we would get together to talk about the chapters in the book, it was just, you know, just whenever we would get together to talk about the chapters in the book, it was just, you know, we, we had so much fun writing it. We hope that, you know, it's, it's the same experience when people read it, but, uh, you know, there's a,
Starting point is 00:02:34 in generally speaking, people have done a horrible job of predicting the future. And so we said, oh, well, this is, there's a lot of room here for us to take a skeptical approach to futurism, you know, to just thinking about futurism itself, and then, and then we'll see, maybe we could do a little bit better, so we'll put our nickel down and try to, you know, predict the future as well, and see how it goes, so. So how, I guess, how much psychic power do you plan to use? Every year. Every year we see the psychic predictions. Yeah, we used to do it.
Starting point is 00:03:09 God, we've been doing this for a long time. Cecil and I, for years, we would, at New Year's, we'd have an episode. I don't know if we did it last year. We did it last year, but yeah. Last year was a disaster. Yeah, it wasn't funny anymore. No, it's not funny anymore. But for the longest time, for a decade plus, we'd be like,
Starting point is 00:03:22 all right, let's look and see what they predicted last year. And then it was inevitably so wrong. So when you're thinking about futurism, why do you think it is that futurists, and outside of psychics, that's obviously nonsense, but futurists, why do you think they get it wrong so often? So that's a big chunk of the book as we talk about that exact question. And we actually outline what we call futurism fallacies, right? What are the common mistakes that futurists make? And then just by looking at the past attempts at projecting technology into the future, the farther back you go, the more laughable, you know, the futurist predictions become. But the themes are still there. Like one, as an example,
Starting point is 00:04:10 is taking some current trend and projecting it indefinitely into the future. As, oh, cell phones are getting smaller. So they're going to continue to get smaller. Right, yeah. Until they're teeny, tiny, teeny. Remember Minority Report,
Starting point is 00:04:24 they had the really teeny cell phones. Right, Yeah. interact with technology. So I think that's why. I mean, when you think about it also, you know, if you go back 100 years, the futurists of that time didn't have a lot of previous futurists to learn off of. So they were just winging it, you know, they were just doing what made sense to them and not realizing that they were falling into all these pitfalls. But now we could look back and say, oh, yeah, look at all, you know, all these different ways in which the futurists fail. But we're not going to do that. You know, we're going to learn those lessons and try to account for them.
Starting point is 00:05:17 So in all honesty, we do give glimpses of the future, which are more fun than anything else. We do give glimpses of the future, which are more fun than anything else. But mainly, we don't try to predict details that are unpredictable. It's like the difference between predicting the weather and predicting climate. You can't tell me if it's going to rain a year from now on a very specific day. That's like trying to predict a very specific future technology. But we do know that things are generally getting warmer. And so we can sort of predict these broad brushstrokes.
Starting point is 00:05:50 And also we try to paint out possibilities. Like these are the possible futures depending on these variables. So we are therefore to hedge our bets a lot, but that's the only way you can reasonably extrapolate into the future by saying these are the possibilities
Starting point is 00:06:06 depending on these variables. I mean, obviously, the goal is to achieve full Jetsons. I think that is the, that's it, right? The Jetsons is it.
Starting point is 00:06:17 Yeah. You got to have a robot made that balances on a single wheel. Yes. Zipping around. A robot dog. With a sassy attitude. I think all of these things
Starting point is 00:06:27 are great. Because my Roomba has no attitude at all. Just absolutely flat. Terrible. Boring. Literally boring.
Starting point is 00:06:34 Unless you put googly eyes on a Roomba, which is amazing. Which I have done. I do have googly eyes on my Roomba. Still, second worst conversationalist
Starting point is 00:06:40 in the house. Just mediocre. Can I ask? Can I ask, we want to talk about, we want to talk about your, your, your, the current book that's currently out too. But I want to ask this question. When you look at the future, do you think you look at it half full or half empty? If we're talking about like glass half full. Yeah, is it so bright you have to wear shades or no? So actually we give the full spectrum. We talk about if everything works out perfectly well,
Starting point is 00:07:15 this is the techno utopia we will create for ourselves or we could have the techno apocalypse. So with many of the technologies we talk about, we give a range of scenarios. So like nanotechnology could cure death or could turn the surface of the earth into three feet deep of paperclips. You know, one or the other, you know, it will completely destroy civilization or will or will cure all of our problems. Probably it's going to be somewhere in the middle. I mean, so realistically, we say, yes, a foot foot and a half of paper clips and we live six years longer. The range of possibilities.
Starting point is 00:07:51 And most technology, and that's also one of the fallacies, you know, is like you think in these black and white terms, like this technology is going to only be good or only be bad. And the vast majority of technologies, you know, cut both ways. They may be disruptive. They may have some downsides. They can be bad. And the vast majority of technologies, you know, cut both ways. They may be disruptive. They may have some downsides. They can be abused. They can, you know, look at social media and the internet. It's been wonderful and horrible at the same time. That's the way most big technologies are going to do that. They're going to, they will give us great things and also completely
Starting point is 00:08:23 screw with our society, you know? So that, this is interesting. So I just finished reading, and I don't, I don't know that I recommend it. I read, I read Homo Deus. Have you read Homo Deus? Kind of a bigger book out there. I've read that one. It's, it, by the same guy who wrote Sapiens, kind of one of those bigger pop-sci kind of
Starting point is 00:08:40 books that, that's out there. And there's a lot of discussion. There's a lot of futurism in, in Homo Deus. It's very much a futurist kind of books that's out there. And there's a lot of discussion. There's a lot of futurism in Homo Deus. It's very much a futurist kind of a book. And I think it's riddled with all kinds of these sort of absolutist flaws. So when I read it, I'm sort of dubious of a lot of these assertions. But one of the things I'm curious to get your take on is one of the discussions that's kind of been had and had and had again is about this idea of conquering death. And you just brought it up very briefly with respect to nanobots,
Starting point is 00:09:09 or nanotechnology, not necessarily nanobots. And I'm curious what your thought is on that in general, because I've always thought myself, like, it doesn't sound desirable, much less achievable. But I'm curious what your thoughts are on that sort of thing. So we discuss that directly in the book. And it's like, all right, if our goal is to become immortal, what are the different pathways to that? Which, again, is another fallacy in that we think there's only one path to any outcome when in fact there's often multiple
Starting point is 00:09:47 different ways you know different technological pathways where we might get to an end and they will probably exist side by side and complement each other etc so you think you know what are we going to uh just get really better and better and better at like stem cell technology, where we could basically just infinitely regenerate our tissue? Or are we going to get, is it going to be more of the nanotechnology pathway, where we are using, you know, hard machines to manipulate our biology? Or is it going to be genetic technology? Or are we going to say, just forget these meat suits. Let's just just you know somehow
Starting point is 00:10:26 migrate our consciousness into silicon into something uh you know into an artificially intelligent matrix and then that will be the form of our our immortality the the like for one of the problems is is the brain right because you you are your brain, basically. Like you can't, there's no, like I'm going to upload my conscious, it doesn't exist. Like you are your brain, the firing of those neurons, that is you. And you can't move it anywhere
Starting point is 00:10:55 or take it out or do transfer it or anything. That's why I had to go very carefully. You might, there are theoretical ways you might be able to slowly migrate it, but we won't go down that rabbit hole just yet. So the ultimate limiting factor is going to be the longevity of your brain. Even if you can regrow your body every 20 years or 30 years, you can't regrow a new brain because then you won't be you, right?
Starting point is 00:11:22 It'll just be some new person. It'll be a clone, right? It won't be you. it'll just be some new person could be a clone right it won't be you um so what's that's the ultimate limit and there's really no way around that you know complicated systems are just not infinitely uh renewable like that um right you just you just build up junk and mistakes and whatever you might be able to prolong it for a really long time but never forever um you know, without literally undoing all the information. There's only one, you know,
Starting point is 00:11:50 animal that's immortal. It's like this jelly, but it actually like gives birth to itself. There's no neuronal continuity, you know, from one generation to the next. So it's not neurologically immortal.
Starting point is 00:12:02 So our brains are ultimately going to be the limiting factor on our immortality so that's where it's like that's it unless of course you accept the some method of migrating to a machine and then we have to have the discussion of continuity and is that really you or not and what are there any methods that are acceptable? But not going to happen anytime soon, that's for sure. I think we're going to be seeing some significant life extension in the next 100 years or so. But nothing approaching immortality.
Starting point is 00:12:36 But it's going to miss us. A lot of times when people ask if something's going to happen, what they're really asking is, when is this going to happen? Right. Because if you go a million years in the future, whatever, depending on how far you want to go into the future, you know, we can't really set limits on anything. Because there's a point beyond which we can't really even speculate anymore about what the technology is going to be like. Now we're just talking about physics, just theoretically. What are the limits of physics, but not the limits of technology?
Starting point is 00:13:16 So when somebody says, will we ever achieve this? Will we ever get to a fully general artificial intelligence? It's like, yeah, of course we will. The only really question is when, because it's not impossible. And if it's not impossible at some point, we'll figure out how to do it. But will it be a hundred years, a thousand years, a million years? That's the question is how long is it going to take? Yeah. It's funny. I think about that idea of, of, of this like infinite extension. And it's like, if we don't figure out like quality of life, there's, you
Starting point is 00:13:45 know, I, I get out of bed too fast. My back hurts all day. You know, if I feel a sneeze coming on, I got, I'm 44. I gotta feel a sneeze. I gotta brace myself and make sure I don't throw my shit out. Like I can imagine a world and I really can't, I can imagine a world where we figure out a, a, an answer for most of the life endingending disease and degenerative processes, but it's still like, yeah, but you know what? You still have a vertical spine and a horizontally spined body. And so you're just like, okay, my God, I'm going to live forever. But I can't open a pickle jar without screaming. So why? Exactly. You have to combine quality of life with duration and again that brings up another sort of way you can you're predicting the future can go awry if you think
Starting point is 00:14:33 about simplistically like we're going to have one goal like the goal to live longer it's like yeah but no that's not how that's not how things are always interacting with each other and the other thing is you say you're trying to predict one technology. Okay, what if it takes us 300 years to get to the point where we could live for thousands of years? We won't even talk about infinite, just a really long time. Oh, it's going to suck because we will be frail and everything. It's like, yeah, but by the time, anytime you're predicting the impact of a future technology, you have to think simultaneously about how all the other technologies are going to have advanced in that same amount of time. So if it takes us a couple of hundred years to get there,
Starting point is 00:15:15 we're also going to have 200 years of advancement in every other technology and how will they be interacting with each other? So you can't just look at one technology and project it forward. It's always how are all of these technologies going to be interacting with each other? So that, you know, again, gives you a glimpse as to what the answer might be. It doesn't tell you because there's too many possibilities, you know. You're trying to predict the future of hundreds of technologies simultaneously. You can't do it. But you could think about, you know, you could think about how they might be interacting with each other.
Starting point is 00:15:51 Yeah. So, okay. So what about the public's idea of science in this sense, right? So we think about how sometimes the public, like right now, like doesn't think that maybe NASA funding is worthwhile or anything. That clearly slows down or speeds up whether or not how quickly we move forward, right? So do you think, do you talk about that at all? Yeah. So when you get, when we're trying to predict like the 2050 year timeframe, that's where that's really important. It's like, sure. And in fact, so let me back up a little bit. We look at the history of technology and, you know, Ford, for example, famously chose to market a gasoline engine, internal combustion engine as the first real, you know, consumer automobile. you know, consumer automobile. And his plan was that the second one would be an electric vehicle. Now, what if he had reversed that order and did the electric one first?
Starting point is 00:16:54 Would that have changed the next hundred years of technology of the automotive industry? And then it turned out he didn't mass produce the electric car because of patent rights over batteries and fighting over this. It was like all political stuff. It wasn't technology. It was political. But then there's also this from another angle. It's like, well, if we were a little bit ahead of the curve on the electrical grid, then electric cars may have just beat out the internal combustion engine.
Starting point is 00:17:31 And electric cars may have just beat out the internal combustion engine, but because it was just easier or if we discovered gasoline 10 years later, whatever, there's all these little variables. So there's one point in the book I say, so when you go back 100 or 200 years and try to pull those threads forward to today, you can't even predict today. today, you can't even predict today. Like you can't, you can't put yourself back a hundred years and predict what current, what today's technology would be like. So forget about going a hundred years in the future. So in the short term, you know, the really short term, it is kind of easy because you're just extrapolating existing technology, that next step. But even if you go even as much as 50 years, it's all about the choices that we are making both individually and collectively. And that's the one thing you can't predict is what choices are we going to make? You know, there are individual people like Elon Musk, for example, who has a tremendous amount of power to alter the course of future technology with quirky individual decisions.
Starting point is 00:18:26 Right. And so that 50-year realm, it just becomes really hard to predict for that reason. And so we have to say things like, if we choose to push this technology forward, this is where it could go. You know, the one thing that I've been fascinated by for 20 years is the coming hydrogen economy. Are you guys old enough to know that we've been talking about how long have we been talking about the coming hydrogen economy? Right. Why don't we have it?
Starting point is 00:18:54 Why don't we have it? And there's political reasons and there's technological reasons and there's quirky reasons, you know. But in 50 years, are we going to have it? What would need to happen for that to be the case and it's not just what would need to happen with hydrogen technology but all the competing technologies it doesn't matter if it gets better if if battery technology kicks its ass yeah then it's why would we double down on hydrogen? And so you always have to talk about if then, like if we decide to go this route, this is the potential of where it could lead,
Starting point is 00:19:33 but we have to, you know, think about all these other things, you know, that, that would be competing with it, or maybe people won't like it. That's the other thing. Like the Segway, the Segway was great technology. Yeah. Right? Fantastic. The technology was, it worked. That's, again, another fallacy.
Starting point is 00:19:51 If the technology works, people will use it. It's like, no. No, no. It's not necessarily. People might decide. Like on silly things. Like it looks dorky. I'm not going to use that technology because it looks dorky.
Starting point is 00:20:04 And then electric cars on the vine or our current electric car system owns owes that to tesla in some ways right i mean like they made a car it didn't look stupid a car that didn't he made a car that looks good and people want to own something yeah it looks sharp looks futuristic i love it yeah yeah no absolutely there's a lot of this marketing is involved you know know, absolutely. It's, it's, it's, uh, you could really only talk about again, like the potential and, and, and try to, you know, imagine how things might play out, but there's just so many individual variables there. variables there. It's funny, like on just kind of a tangential note, I do a lot of hiring. So I look at like different interview questions and try to do a better job doing that time to time. I usually don't succeed. But one of the sort of classic questions that I don't ask, but I have thought about, and everybody's heard it, is like, where do you see yourself in five years? And everybody's
Starting point is 00:20:59 heard that question. I think it's an asinine question. But it is kind of funny to think back, and I've used this as a thought experiment personally many times, and thought if I were to go back and say five years, just five years, and say, okay, predict out the next five, I would have gotten everything wrong. Sure.
Starting point is 00:21:17 Like if I were to go back in time to 2017, and in just five years, and say, predict the world in 2022. I would have got, I can tell you with certainty that every reasonably important thing, I would not have predicted the pandemic and any of the migration and social changes that came with that. I would not have predicted the war in Russia with Ukraine, although maybe I should have. There's so many things I would get wrong just at a very basic macro level on a five
Starting point is 00:21:53 year time scale. And then it's also a fun game to play with your personal life because it's also always been wrong for me too. But it's kind of funny that five years is an impossibility to get right in most meaningful ways. So futurism has a high bar to cross. Absolutely. But a couple of thoughts on that. One is past, future, all futurists, when you're predicting the future, you're also creating a time capsule, right? you're also creating a time capsule, right? So you're essentially describing your current time. And when you predict the future, you're really reflecting the thoughts and ideas and biases and perspective of the day. It really says more about your time than actually the future.
Starting point is 00:22:41 So you do create a time capsule. And then it's fun to go back and say, what did the people from 50 years ago predict about the future? And that gives you this window into that time period. Right. Yeah. It offers historical perspective. Yeah. So we hope that our own book will be a fascinating document for future futurists. But when you talk about something like the pandemic, interestingly, we interviewed an infectious disease specialist on our show, Mark Klisop, about 15 years ago, and he completely nailed it. He predicted the pandemic to a T. Now we didn't say it was going to be specifically a coronavirus and he didn't say it was going to be like 2020 to 2022 or whatever. He didn't, but he said, yeah, sometime in the next 10 to 20 years,
Starting point is 00:23:25 there's going to be a major respiratory pandemic. And he said, stock up on toilet paper. It's going to disrupt the economy. People are going to have to lock down. There's going to be tens of millions of deaths since blah, blah, blah. He basically completely nailed it. So experts knew this was going to happen. They didn't know the details, but they knew something like this was probably going to happen. And that's climate versus weather. That's climate versus weather.
Starting point is 00:23:52 Yeah. We know what's happening. We may not know when Antarctica is going to fall into the ocean, but we know that it's going to happen at some point. So that's why it's a little bit easier sometimes to predict two or 300 years in certain ways.
Starting point is 00:24:10 Because by then, we know all this stuff's going to happen. We're not sure exactly when between now and then it's going to happen, but we're going to be at the other side of some transition by that point. We know we're going to have artificial intelligence, like general artificial intelligence, in a couple of hundred years. When between now and then we're going to get there, it's a little bit harder to say,
Starting point is 00:24:31 but you know, and then we can say, well, once we do, what's that, what's the world going to be like, at least from the point of view of that technology. But of course it's interacting with all the other technologies. Let's shift gears and talk about your current book that's out. Yeah. Let's talk about that book. What was the impetus to write that book? A publisher threw a whole bunch of money at us. Nice.
Starting point is 00:24:56 Nice. That's the best reason. Good. Genuinely the best reason. That is motivating. It was a very good motivator. Tremendous motivator. tremendous oh yeah absolutely money and a deadline and you will accomplish a lot uh but i mean you know i've uh written book chapters
Starting point is 00:25:15 obviously write a daily blog i've been doing a lot of writing always had like this idea i really want to write a book uh you know and you know lots of ideas about the first two or three books that I would write. But then we were contacted by a publisher. And what they wanted to do is they wanted to write a book to our podcast listeners. And then we learned this is what publishers are doing now. Because book publishing definitely took a hit with everything digital, but they're finding ways to survive. That's why when you publish a book, it's a paper book, it's a hard book, it's audio, and it's a digital format. So they're selling it in all three formats. That's part of what they do. But they also realized that they could tap into people who already have a built-in social media audience.
Starting point is 00:26:07 Sure, sure. And that kind of sets a floor for them. They can calculate, okay, how many listeners do you have? They want to know all our social media stats. And they plugged in those numbers. And they figured out pretty accurately how many books we would sell, at least at a minimum. And also, it's free advertising. They don't have to put any money in marketing. Yeah. We do all the marketing for them, right? It's like built in marketing, built in audience.
Starting point is 00:26:37 They wanted it to be named after the podcast, you know, the skeptics guide to the universe. Like that was not even negotiable. Like that's what we want the name to be. Cause that's, you know, we're like, oh good. We can reach out to a greater audience. Like, yeah. Okay. Sure. But Hey, settle down, son.
Starting point is 00:26:54 We got, we already got this figured out. Now. And so what they, what they're doing is, and they, they, they've done this for a while, but what, what they're doing is they, they just just publish 100 books and they only expect one to hit. They don't know which one it's going to be, but one will hit and then they'll put their marketing might behind that one. And the rest, they just want to break even. Wow. So that's like they're playing the lottery and they just say, I'm going to publish a bunch of books.
Starting point is 00:27:22 We know how much of an advance to get based upon this and that. So we'll break even, but one will break out. We'll put our money behind that one that will make millions, right? That's, that's their, their strategy. So, but we're, we actually fell kind of in the middle because we exceeded the advance, which is always great. That's why we got a second book because they're happy with us. But we didn't break out, you know, like. We didn't get beyond an order of magnitude beyond the advance. We just extended it a little bit. So it's good and it's still selling. And it's mostly the audio version.
Starting point is 00:27:54 The audio version is like 80% of our sales for the book. That's how people are used to consuming your product. Yeah, exactly. I mean, it shows you that that's... I think they absolutely knew that. There was no question that they were going to do an audio book. I mean, it shows you that that's, you know, they, and they, I think they absolutely knew that. There was no question that they were going to do an audio book. I was going to record it.
Starting point is 00:28:09 They wanted, and that was it because they knew what they were doing. So yeah, so I learned a lot about the industry and everything. So, which is good. And I'm glad that they're surviving and this is what they're, this is how they're doing it.
Starting point is 00:28:22 You know, they're tapping. Social media how they're doing it. They're tapping. Social media was presenting a problem for them. So they just use it now and they're surviving that way. Absolutely. Content-wise, let's talk about content-wise. Let's talk about the book itself. Why did you put all this together in this? In the way you did.
Starting point is 00:28:41 In the way you did. Yeah. So we wanted to make basically the book of our podcast, right. And we talked about a lot of ideas about how that would exactly look and how it would work, but the core of it was always going to be just a primer on skepticism, critical thinking, scientific literacy, the, the you know the core knowledge base of skepticism and of our podcast all the things we refer to you know it's like this is essentially a reference book for listening to our show and also a primer for critical thinking and you know
Starting point is 00:29:21 media savvy and you know and scientific literacy um so you know it was it's a and, you know, and scientific literacy. So, you know, it was, it's a tome, you know, it came out really big, but it's like, but, and we, we had this conversation with the, with the publisher. They're like, yeah, 92, 100,000 words. We came in at 137. So when I, yeah, we were blowing past their limit. We had this conversation like, you know, do what you got to do. This is your magnum opus.
Starting point is 00:29:54 The book kind of needs to be complete. You know what I mean? Like, what are we going to leave out? Let's leave out logical fallacies. Like, you can't do it. The book doesn't work. It only works if you cover everything. And they got that.
Starting point is 00:30:06 So, like, just do whatever you got to do. Speak it as long as it has to be in order to be what it is. So that's what we did. And we're all very happy with the result. It is everything we wanted it to be. It's a fun read. It's very personal. We try to carry through the personality
Starting point is 00:30:23 of all the people on the podcast. personal. We try to carry through the personality of all the people on the podcast. But it's also like a primer for critical thinking and scientific skepticism. That's exactly how I'm about halfway through it. I've started listening. And it's funny because I am listening to it. So I drive a lot and I'm like, oh, of course I'm going to this. This is how I consume your product. So this is definitely the way I'm going to go with it. And it does very much feel like that sort of necessary 100 level introduction to the basic rules of logic, the basic rules of critical thinking. Like get this stuff right and you're doing a fair way along in your epistemological journey.
Starting point is 00:31:10 These are good tools to have in your toolbox. So your audience, obviously, is your podcast audience. I mean, from a marketing standpoint, that's your audience. But your audience knows most of this. We made a specific effort to include stuff in the book that was extra, that we've never spoken about on the podcast, whether it's just a new idea or a new example, or we take stuff we may
Starting point is 00:31:42 have referenced on the podcast, but go much deeper. And we asked that of every chapter: all right, what is in this chapter that you would never have heard listening to our show? And so, yeah, we knew it was going to be... First of all, on the podcast you get bits and pieces, like we might talk about one logical fallacy in an episode or whatever. For years, people have been writing to us: I need a place where I can get this all in one place, so I know all the logical fallacies that you talk about, all the cognitive biases, etc. So it's more thorough. Yeah, it is
Starting point is 00:32:22 sort of a narrative that walks you through that. It's not a reference book, really. It has a narrative. It reads like a story. But it also had to have new information that you would not have heard just from listening to the podcast. That was by design. I think it's a real boon, to your point. I think it is a real boon to have it all in one place. This is, as I go through it, it's like, it's important information.
Starting point is 00:32:48 But as I was thinking about it, when I was listening to it, I was like, you know, I think I know most of this, but I don't know that I've ever encountered all of it in one place. I've gleaned it from here, I've picked it up from there. A little from Carl Sagan, a little from, you know, I mean, just a hundred different sources. To put it all in one place, and to have it be well organized, and to have a natural flow. It's funny, because your show's title suggests this book, that there should
Starting point is 00:33:21 be a guide. Yeah. Right. That there should be a skeptic's guide. Where the hell is the guide? It took you 15 years to get to that. Absolutely, and we toyed with that, because, you know, it's based upon The Hitchhiker's Guide to the Galaxy. That was the inspiration for the title, which is an actual book in the universe of The Hitchhiker's Guide. And that was one of our ideas: let's make this a book in the world of the Skeptic's Guide to the Universe. But it was like too clever to execute. You know what I mean?
Starting point is 00:33:54 Right. Until you got a great big check in front of you. And then all of a sudden, we all get a little more clever when you get a check and a deadline. Right, right. Absolutely. But you know what I mean?
Starting point is 00:34:04 It was just, there were lots of ways to have different layers of story being told in the book. And that was one that we had, which we didn't execute, specifically making it... We did keep elements of that. So, like, the chapter headers and the way
Starting point is 00:34:27 it's organized, it kind of was supposed to be like The Hitchhiker's Guide to the Galaxy book in the universe itself, like it pretends that this book exists, you know. So it's funny that you mentioned that, because that was actually one of the ideas that we had. But the other layer that we tried to put in there was, so a lot of these things that we talk about are not static. These are ongoing research paradigms by psychologists, etc. So there's all these ideas. We all read, you know, The Demon-Haunted World by Carl Sagan, and it was fantastic. It's still a great read. I still recommend it as a good entry into scientific skepticism, but it is dated. We just read it. Yeah. He talks about things that are just no longer considered to be true when it comes to the way we consume information and the way
Starting point is 00:35:22 you know, the best way to address certain issues, and how conspiracy theories work, whatever. It's just, it's 30 years old. So this book is partly an updated version of The Demon-Haunted World. You know, it's like, what has the literature shown in the last 30 years to inform this project that we're doing? Like, the Dunning-Kruger effect didn't exist, you know, when Sagan wrote his book. So part of it was an update. And it's like one of those things. That's funny to think about. Yeah, even more so for the current book.
Starting point is 00:35:51 But it's like, there's constantly studies coming out. Like, ooh, ooh, I got to update that chapter to reference this new thing, which adds a new layer or nuance to this topic. And at some point,
Starting point is 00:36:02 like the editor had to cut us off. It's like, no more updates. That's it. You can't keep sending us updates, because you've got to lock in the text at some point. But we're still collecting these studies. And so there's probably going to be a second edition at some point. When 10, 15 years go by, there's going to be 15 years of research that we've got to update the book with. And the future one, forget about it. Every week, it's like, oh shit, this is going to...
Starting point is 00:36:30 It's the future again today! It's slowly becoming obsolete even before it gets published. So on our show, we recently, as a book club, sort of week after week read a chapter of Sagan's Demon-Haunted World, and we went through it.
Starting point is 00:36:48 And you're absolutely right. It's super dated. And some of the things that he's talking about... it's funny when he talks about how he's sort of appalled at the 10-second attention span people have. And then I started thinking about TikTok, and how quickly you can just scroll through a TikTok page, how quickly your eyes can just capture all this information. I think when we read it, we tried to be very forgiving about him not being able to predict the future, but also seeing if there's some way what he said could still match our current technology. And there's some places it just doesn't work. But in other places, you can sort of see he knew what was coming. And there's
Starting point is 00:37:33 a quote that's actually being passed around very recently about the foreboding in America. I don't know if you remember this quote, but there's a quote from the book where he says, I feel a foreboding in America. And it's, you know, maybe about a paragraph long. And several people have quoted it recently. And, you know, a lot of people say he nailed it, he nailed it so many years ago. We just read a quote from a book from 1939 by a scientist, like a popular science book from 1939. And the author described, you know, society at the time and the challenges they were facing. And it could have been written yesterday. It was like a perfect description of our current
Starting point is 00:38:17 political and social climate. So in a way, you know, it's kind of reassuring, because the world survived. I mean, we had to fight a world war, so we may have some bad stuff coming our way. So I think when you're talking about human nature, whether it's 1939 or 1991 or whenever The Demon-Haunted World came out or today, there are some universal truths, and things go through cycles, so they're always going to be true. But there's just so much new stuff. Think about when Sagan was writing. I mean, like the
Starting point is 00:39:01 anti-vaccine movement was not really prominent at that time. Climate change denial was just a whisper at the time. All the science denial stuff was minuscule. Those were kind of simple issues, where now we're wrestling with all these, you know, very weighty anti-science or pseudoscientific issues, and issues of misinformation and weaponized misinformation. And we're at a whole new level. Sagan may have seen, you know, the antecedents to that.
Starting point is 00:39:43 Like he knew that this kind of thing happens, but back in the early 1990s it was like nothing compared to what we're facing today. Talking about aliens. And of course, social media, you can't even... That was some X-Files shit. Yeah, he was talking about aliens back then; a lot of the stuff that he was talking about was about aliens.
Starting point is 00:39:59 Do you think... you know, clearly your book is about thinking critically, about scientific skepticism. Do you hope that your book winds up... Because, you know, you look out at the information landscape, and it's a little scary. It's a little scary, especially when you think about how many people believe in QAnon in the United States, how many people didn't get vaccinated, things like that. Do you hope that your book will get into people's hands
Starting point is 00:40:26 that might not agree with everything and maybe change their mind? And do you think that you wrote it in a way that will be able to do that? I mean, that was the goal. Like we're always writing simultaneously to multiple audiences, you know. We're writing to somebody who has no idea
Starting point is 00:40:41 what any of this is, and this is their first entry into it, and we don't want to scare them off. But we want to be a sort of a friendly entry into that. We can't pull every punch, right? And so there's, oh, you dissed acupuncture. I hate you. It's like, okay, well, I have to call it like I see it.
Starting point is 00:40:58 I justified it. But, you know, I can't not call pseudoscience pseudoscience. But we try to creep up on it and give them the tools to think about it. And you can't take a belief away from somebody. You have to give them the tools to surrender it voluntarily. Some people will do that. Some people will not do that. So we definitely wanted it to be that. But also we wanted it to be like, if you're already a skeptic, already a listener, you already are pretty much there. This will take you to the next level, right? There's
Starting point is 00:41:29 enough to go deeper, more information in there for you. And, you know, we occasionally get emails from people who listened to the show or whatever, encountered the book, and they were hostile. Initially, they were true believers or whatever. Some of them still are, and they hate us, but some will occasionally say, you know, that we changed the way they view things. They stopped believing in alternative medicine or whatever, got out of some, you know, pseudoscientific cult or religion that they were in. It happens, you know. So, obviously, what author doesn't want their book to change the world, right? I mean, so we put it out there and hope for the best.
Starting point is 00:42:11 Yeah, I only ask because one of the difficult things, especially just seeing how things are going in the world, is how do you change somebody's mind? And I think you kind of touched on it a little bit there, by making them sort of surrender to it. Can you expand a little bit on that? Can you talk a little bit about that? So people do change their mind. We all change our mind all the time, but it's usually a step-by-step process, right? People don't change their mind like the scales fall from their eyes and they
Starting point is 00:42:39 suddenly have an epiphany. It's very rare. Not that it never happens. Sometimes you do get to this place where you do have a sort of epiphany, and people usually remember those one or two times in their entire life where something just crystallized for them. But that's very rare. Most of the time, our beliefs just sort of slowly morph from one thing to the next. And so you have to be engaged in that process. So the primary goal of the book is to get the reader,
Starting point is 00:43:13 to get people to engage in metacognition, right? It's just thinking about thinking, because that is kind of a binary thing, where there are some people who are just existing in an intellectual realm where they do not think about their own thinking. There's never any introspection, you know, analytical introspection about how they are thinking. And so they're just going with the flow, cognitively speaking. And then other people are introspective.
Starting point is 00:43:42 They will question how they think about things, how they know things, you know, is their memory accurate, et cetera, et cetera. And so we are trying to get people into that camp, partly by just surprising them with stuff, you know, just showing this is how easy it is to fool you. This is how easy you are to fool. That's one piece of it. Once you realize you can be fooled, you realize that you need to have a check in place. You need to have some kind of way of evaluating whether or not you're being fooled, you know? And once you do that, then you're there. Now we've just got to give you more and more ways in which
Starting point is 00:44:20 to do that, more and more tools in which to do that. So one of the things we recommend in the book is to, when you're trying to help somebody, I mean, we're all on this journey, right? No one's there, right? So we're on this journey. We're trying to maybe help other people who are not, maybe not as far along on this journey. How do you do that? One of the ways to do that is to engage with somebody about a topic that they are skeptical of. Like, tell me something you don't believe. Why don't you believe that? Oh, that's interesting.
Starting point is 00:44:51 That's interesting. And you're engaging their own skepticism, because there's got to be something you don't believe in. I mean, if you literally believe in everything, including mutually exclusive belief systems, then, you know, you're probably hopeless. But if you can get,
Starting point is 00:45:04 if you can find something they don't believe in and then just start to basically nurture that, tell me why you don't believe in that. Oh, because it doesn't make sense. Oh, because there's no evidence. Oh, because it contradicts this other thing or whatever. You start to develop these basic rules of metacognition, of skepticism from there and they're already invested
Starting point is 00:45:30 and now you're just trying to get them to extend that a little bit to something else that maybe they just haven't thought about, but that they don't really have any stake in. You don't go right
Starting point is 00:45:40 for their core identity, their vested interest, right? That's not what you do. Just rip away the very thing that makes them them. You're not... I don't even know if I love my wife anymore. What is happening? They're just like a crumbled mess weeping on the floor.
Starting point is 00:45:57 I never should have read that damn book. Yeah, they're just going to cling all the harder, you know? So you just got to take the pathway of least resistance. I want to say, so you mentioned your book is a tome, and it is. It's a significant, it's a weighty read. It takes some time. And I was thinking about when I started listening to it, I was like, oh, it's a longer book.
Starting point is 00:46:22 That makes me happy, right? So I enjoy that. But I was thinking, what the heck do we do in a world full of anti-intellectualism? Your book is this series of sort of intellectual tools, but we are living in a world of anti-intellectual forces that are actively pursuing anti-intellectualism as a new national value. And how do we bring this tome, or which ideas from that tome, I guess, might be the most important? Like, if you had to go SparkNotes, you're going to talk to somebody. If I had 15 minutes with Matt Gaetz and Madison Cawthorn and Marjorie Taylor Greene,
Starting point is 00:47:11 and I wasn't going to use it to berate them and make them feel small inside, how would I SparkNotes them? What are the most important pieces? I mean, we don't have the magic formula, obviously. Like, we can't account for other people; some people are just so locked into their way of thinking, you know, I don't have the magic way of getting them out of it. Again, it's more like you just create the opportunity for it to happen and hope for the
Starting point is 00:47:34 best. But the intro chapters, that was exactly the thought process, because, you know, you have to capture somebody right from the get-go. And if you don't, if they are not vested in reading that book by the third page, you've lost them. Yeah. So, you know, that was exactly the thought process we had when we were trying to craft those introductory chapters to the book. And again, the approach is, you know, people are deceiving you. That's sort of my opening line: Spock lied to me, right? As a child, Spock was an authority figure, you know, almost of mythic quality. And at some point I got old enough to realize that In Search Of was not a documentary. You know, Leonard Nimoy was hosting this In Search Of program and it
Starting point is 00:48:25 wasn't a science show. It was something else. I was deceived. And then, you know, as you get older, you realize that at some point, pretty much everyone has lied to you. Your parents told you a whole bunch of shit that wasn't true. Your teachers told you stuff that wasn't true. Everyone's telling you stuff that isn't true. Why is that? Well, they may believe it, or they may have nefarious purposes. They may be trying to sell you something or whatever. So we kind of need some way to get through this life. We need to have some kind of tools to figure out how to know what's real and what isn't. That's why the subtitle of the book is How to Know What's Really Real in a World Increasingly Full of Fake. Right. So that's basically it. So if you get people to want those tools as a self-defense mechanism, and also at the same time as an adventure and a journey.
Starting point is 00:49:17 So we try to map it out both ways. These tools will protect you from misinformation and fakery, but also send you on a journey of discovery to cool shit, you know? And so that's always a thing. And Sagan really pioneered that duality of, you know, protecting from pseudoscience but reveling in the wonder of science at the same time. And we definitely tried to use that inspiration and model that, but, you know, again, just updating it for a new generation, a new world. When you talk about thinking about the past
Starting point is 00:49:53 and looking into the future, you know, when we first started our show over 10 years ago, we even said, I think on one of our first few episodes, maybe within the first 50, we had mentioned, you know, we don't think this show is going to be needed in a few years. We think the world's going to become rational and there's not going to be a need for this show. Did you have the same thought when you first started podcasting?
Starting point is 00:50:18 Could you have possibly envisioned a world that we're in now, this world that is really very anti-science? Yeah, I mean, not surprised at all. But we had been running a local skeptical organization for almost 20 years before, no, I'm sorry, 10 years before we started the podcast. And we had the experience of working with colleagues who, in our organization with us running a chapter or whatever, who did go into it with that attitude. Like, I'm going to change the world, you know, and then they burn themselves out in two years. Because if you think that you're going to rapidly alter reality, you know, with your awesome information and you just tell people what they need to know and they're going to be like, oh, of course, of course we have to be skeptical, then you're not going to have longevity in this world in doing this. So we knew
Starting point is 00:51:17 from early on, this is a generational struggle. We are picking up the torch from those who came before us, and we'll be passing it on at some point, and the world will still be just as crazy as it was. But we just hope, and this is the one thing that we can't really ever know. It's like It's a Wonderful Life. What would the world have been like if we weren't here? How much worse would it be? We just have to imagine it would have been a hellscape of misinformation if we weren't here. But, you know, the thing is, we can't know that. So you have to say, all right, well, either I'm going to go down fighting, that's what this is, I'm going to do it, you know, or maybe it's like a little bit of a parachute. The parachute will make the world not as bad as it would have been if we weren't here.
Starting point is 00:52:08 Maybe we might make some actual gains in some narrow areas. And I think we have done that. So we will be a resource for people who want it. I can't force this on anybody. And we take our victories sometimes one person at a time. Like somebody emails me and says, oh, I, you know, whatever. I had cancer and I was going to give all my money away to this quack, but you convinced me not to. It's like, okay, great. I saved one person, you know, that kind of thing. So you got to take your victories where you can and you got to enjoy
Starting point is 00:52:41 the struggle. You got to really believe in it. But if you think that you're going to change the world, you know, you're going to be disappointed. Yeah, I don't want to come off as saying like we're going to change the world. We thought the world was not going to get crazy, I guess is what I mean to say. Yeah, you were naive. I get it. I was, I was absolutely.
Starting point is 00:52:57 Dr. Novella, here's the thing, man. I look back 10 years ago and I think, even five, six years ago, when we first heard of QAnon, I thought it was the funniest thing I'd ever heard. I thought it was a joke. I thought it was an absolute joke. And if you were to tell me five years ago that 29% of the people in the United States or something would believe it, that members of Congress would, I would be incredulous. I wouldn't believe you. I literally wouldn't believe you.
Starting point is 00:53:25 Yeah. I mean, so I have absolutely been surprised over recent history. Not QAnon, surprisingly, because I think after the Flat Earthers, you're like, yeah, there is basically nothing so stupid that lots of people won't believe in it. You're right. And the capacity for motivated reasoning is unlimited. It's unlimited. There's nothing so stupid that somebody won't believe in it, apparently.
Starting point is 00:53:52 So I was already there. So the idea that there's lots of people who believe in QAnon, I was like, oh yeah, of course there is. Especially when you tie it to identity, like political identity or tribalism, like your political tribe. Absolutely.
Starting point is 00:54:03 And we happen to know people in our personal lives who are like this. So it's like, we're there. We completely understand that. What has surprised me is just how fragile our institutions were. I did have this naive belief, this is where I was naive, that our institutions were robust enough that they would prevent, you know, anything serious from happening. You know, like the idea that our democracy would fail because of it was like, of course not, because, you know, there are at least a few adults running the country, and the institutions will
Starting point is 00:54:40 prevent abuse and they have absolutely helped. You know, absolutely. We do have robust institutions in this country. But I was shocked at how absolutely quickly and completely, you know, certain political parties just caved to the mob.
Starting point is 00:55:00 Like, no shame, no self-respect at all. Like, the utter lack of any need for internal consistency at all. I mean, I knew that that existed in politics, but this is an order of magnitude beyond what I imagined was possible. And now I just have to accept the fact that, yeah, there are just people out there who will completely lie, like shamelessly lie, if they think they can get away with it. Yeah. And lie in the middle of a global catastrophe. Lie to the detriment of tens of millions of human beings' lives.
Starting point is 00:55:41 Our institutions held, but when you look at some of that, it's like they held because one or two people held a line, right? And you're like, that's about how fragile it was. That is a hair's breadth from disaster. Yeah, that's just like one guy who woke up and drank some coffee and took a shit and decided we still had democracy on Wednesday. And you're like, what? What do you mean? And what's crazy is that happened multiple times. That wasn't just one time. There were multiple times throughout the last four years that that happened. That was crazy.
Starting point is 00:56:11 But the really scary thing is that now the anti-democratic forces have a roadmap for, oh, okay, so all we need to do is get our people in these key positions and the whole thing collapses. Right, yeah.
Starting point is 00:56:26 They know where those pinch points were now. They get to just disrupt them. Let's make that happen now. Yeah. I have one more question. Yeah, go ahead. So we talked about this recently on the live stream we did.
Starting point is 00:56:37 There's a guy who does this thing that he has a joke conspiracy theory called birds don't exist. Yeah. Okay, so he does this thing on 60 Minutes. They did a whole thing on it. It's actually really funny. The guy is very funny.
Starting point is 00:56:49 He's a good improv artist. So he's actually clever and funny. What do you think about this sort of troll culture? Is this damaging to our information ecosystem? Or is it just good fun? Where do you fall on that birds don't exist thing? Because, you know, for us, we walk a fine line. We see some of these very equally absurd conspiracy theories
Starting point is 00:57:13 that have the same, pretty much the same weight that are very damaging. Where do you fall on this? I mean, we've confronted this question for our entire skeptical career, going all the way back to the beginning. It's like, should we do satire? Should we, you know, fake Bigfoot and then say, aha, we faked it, you know, or UFO or whatever. You know,
Starting point is 00:57:34 James Randi had some success doing that, although he's an outlier because, you know, he was a fabulous magician and entertainer, and I wouldn't try to replicate that lightly. But I came to believe that I shouldn't do that. I don't have the skill set to pull that off. And also, I think that it degrades your trustworthiness, you know. So the thing is, Randi's a magician. He lies and cheats for a living, you know, or he did. And so it kind of goes along with being a magician and an entertainer. His job is to deceive you for entertainment. My job is to be a reliable expert and an honest broker of information. And so I think it's incompatible with what we do. Like if you're doing anything that's journalism,
Starting point is 00:58:25 that's science communication. People should never be questioning whether you're being for real or not. You know, you always kind of have to be above board. And whenever we do dabble into satire, it's always labeled, this is satire. This is satire. Like make absolutely no mistake. And then you could use it as an art form. But yeah, for my organization, for me personally and the people who work with me, no, we will not do hoaxes. We will not do deception. We will not do trolling.
Starting point is 00:58:55 We think it's anathema to our science communication and journalism. Well, a quick follow-up to that, just out of curiosity, just an opinion. Do you believe, in our current day and age, is satire meaningfully possible anymore? Yeah, that's the other thing. It's become so hard to do satire. First of all, a lot of the stuff out there, like QAnon, how could you satirize QAnon? It's beyond satire. How could you satirize flat-eartherism? That's what I would have used to satirize something else. It's beyond satire. So it kind of took that away from us
Starting point is 00:59:34 because we can't even do it anymore. And because of that, because there are QAnon believers and flat earthers, you can't come up with something so ridiculous that people will automatically know that it's satire. Yeah. Right. It won't prove the larger point. Right. Like, that's where I get stuck. I think of great satirists of literature, you know, like Swift. And it's like, all right, well, let's tell this absurdist tale and the absurdity will highlight these important social issues,
Starting point is 01:00:05 you know, like, well, it's all, we're all going to eat children, you know? And it's like, holy shit.
Starting point is 01:00:09 Yeah. That's QAnon now. Yeah. Like, that was literally Swiftian satire, the eating children thing. Now we're just there. Yeah. So 25 years ago, I wrote a satirical article about alternative engineering.
Starting point is 01:00:22 Right. And it was a satire on alternative medicine. And I used all the alternative medicine tropes, but applied them to, like, civil engineering. Oh, that's amazing. Like, are you going to drive over that bridge that's floating with magic? You know, of course not. And then I got contacted by a producer from 20/20
Starting point is 01:00:37 And then I got contacted by a producer from 2020 who thought it was real. Oh, I don't even. Yeah. And so that's like, holy shit. I can't do this. Yeah. No,
Starting point is 01:00:49 it's too dangerous. I thought it was blazingly obvious. I made it so ridiculous that nobody could possibly believe it. But I was like, okay.
Starting point is 01:01:00 So that's when I came up with it: I will have to put the satire label on it. But I'm like, I just stopped doing it. It's like, reality is beyond satire now. Amazing. Amazing. Dr. Novella, if people were going to find your podcast
Starting point is 01:01:13 and your book, where would they look? Just go to theskepticsguide.org. Everything is there. Excellent. It's a pleasure to have you on the show again. Thank you so much for joining us. Thank you very much. A lot of fun, guys.
Starting point is 01:01:25 Thanks for having me. So we would normally thank our patrons here, but we didn't. This week we wound up taking a little time off; we wound up recording a little early. We recorded with Dr. Novella last week. So that's why you aren't going to hear
Starting point is 01:01:45 any sort of current event stuff, like the replacement theory shooter that was this weekend and things like that. We're not going to talk about that this week because we just, we recorded stuff last week and we also didn't put up a document. So we didn't get our patrons added by Ian. Can it still be Ian's fault though?
Starting point is 01:02:00 It is absolutely 100%. Even though I did not create the document because we were not producing a show. It's still his fault. I mean, let's be honest. I mean, he really, he could have done it proactively. Right.
Starting point is 01:02:09 I mean, Ian, why do you have to react to everything? You know? I think that's fair. I really do feel like this is his fault. So what, what, so we have, we will talk about our patrons next week.
Starting point is 01:02:20 We want to thank you, of course, everybody for being patrons. We will mention you. If you haven't been mentioned, we promise we're going to mention you. We have a couple of things we want to talk about. We have a couple of voicemails we want to play and a couple of emails we want to thank you, of course, everyone for being patrons. We'll mention you if you haven't been mentioned and we promise we're going to mention you. We have a couple of things we want to talk about. We have a couple of voicemails
Starting point is 01:02:28 we want to play and a couple of emails we want to talk about. So this message is from Katie and we got a couple of messages about your Lysistrata comments a couple of episodes ago, Tom. Yes.
Starting point is 01:02:38 We talked about this sort of a sex strike and then I talked about an actual, like a real strike, like going out on strike and a couple of people sent in some messages. This one's from Katie and Katie says, you know, she, she basically says women have sex for pleasure and anti-choice legislators
Starting point is 01:02:53 want to prohibit them from doing so. Body autonomy is the point, not doing what we want to do with our bodies is not an effective strategy is what she says. And then, uh, another thing too, that came up was like sex as transactional is bad for relationships. So those two things, I know that it was like sort of tongue in cheek. It is. Like it is tongue in cheek and we talked about this after the show
Starting point is 01:03:15 or outside the show too. It is tongue in cheek, but I do think that there is a sort of like corollary to the idea that we should check the politics to some degree of even our casual partners. Casual partners, yes. And if you're some fucking dude bro on Tinder or whatever, and people aren't hooking up with you because they're checking your politics on this issue,
Starting point is 01:03:38 it's not a sex strike. I get it. And there's also safety reasons. There's a lot of reasons that that's like a play in a Greek comedy. Yeah. And not a real thing. And not something people do. Yeah.
Starting point is 01:03:48 Right. But, you know, I also think like, you know, to a certain extent, we should check and be like, oh, wait, you know what? Like, even if I'm just looking for like a quick hookup and, you know, like on fucking Tinder or whatever the kids are using. Yeah. You know, like check and see who you're hooking up with. Yeah.
Starting point is 01:04:03 Like don't, maybe fucking shitty dude bros shouldn't get laid. You know, and I, and I think're hooking up with. Like don't maybe fucking shitty dude, bro. Shouldn't get laid. You know? And I, and I think probably people, a lot of people do that. A lot of people look at me like, Oh, that's a MAGA hat. Swipe whatever way is the way you don't want. Right. So I get that.
Starting point is 01:04:14 I also think too, like. This could be a dick mistest. But, but I think like, you know, we should be doing that. Not just in the casual relationships we have, but in the real relationships we have too. You know, like, there's a sense that,
Starting point is 01:04:31 you know, that the side that always seems to lose is the side that always seems to give up their rights to other people. Right? That's the side that always seems to lose. And I think at a certain point, we need to say,
Starting point is 01:04:43 you know what, if you're just not for human rights, why am I around you? Like point, we need to say, you know what? If you're just not for human rights, why am I around you? Like as a friend or as a, you know what I mean? In any relationship. In a relationship. Why am I around you?
Starting point is 01:04:51 And so it's not just sex. It's human relationships. It's familial relationships. It's friendly relationships, whatever it is. It's the no quarter for bigots thing, right? It is like, like misogyny is a kind of bigotry.
Starting point is 01:05:06 Yeah. And why should we allow people in our lives who are misogynists? Yeah. Like why in the world? I mean, it's 51%. You're like, yeah, you know what? I don't respect half the people that are on the fucking planet. Fuck you.
Starting point is 01:05:19 And I also like, I know that a while back we talked about, we talked to Dave Warnock. And I remember we mentioned something similar where he was saying, you know, I think you should talk to people. I think you should go out of your way to talk to people. And I had said something like, you know, there's some subjects are just off limits, right? Some subjects, and this is sort of where this idea lays
Starting point is 01:05:37 with us is like, you know, the same thing with no quarter for bigots. It's the same thing with like, this conversation is useless, right? There is no use in this, having this conversation. I'm not saying that no one should ever try to have that conversation with that person, right?
Starting point is 01:05:51 I'm just saying you don't feel obligated to do it. Exactly. Yes, that's exactly. Because I think like, I think like there are Nazis out there that can change their mind. I'm sure there are, but a Jewish person isn't obligated
Starting point is 01:06:05 to try to tell them that they're people. That's what that argument means, right? That's what a trans person shouldn't have to look Tucker Carlson in the face and be like, I'm a human fucking being, period. They don't need to expend their fucking energy doing that. But there might be a human being out there that is probably from a different,
Starting point is 01:06:26 maybe even from a different class that isn't being attacked, right? And also somebody who comes from a level of privilege that the other person might see in a different light, and they may be able to convince that person, and maybe they have the energy to do that. But that obligation shouldn't fall on the shoulders of the people who are the ones who are being attacked. Absolutely. Yeah. Absolutely. The marginalized and the oppressed do not have a responsibility to their oppressors. Right. Absolutely. Absolutely.
Starting point is 01:06:49 Never is that the case. And I think it's entirely reasonable and fair for all of us to establish what in our lives are deal breakers. Yeah. Ethical and moral deal breakers. And I guess like, like the core of my suggestion is I do think that fundamentally this issue should be, you know, you should consider whether or not this is a deal breaker. Yeah. Whether this is a moral and ethical deal breaker. This is one of those deal breakers.
Starting point is 01:07:17 This should be, you know, if it affects you in that way, you should think about it like that. And I think it should affect you in that way because I think it's a dire, dire time. Yeah. And people are being oppressed. You know, half the population. Yes. Half the population. It's unreal.
Starting point is 01:07:35 So we want to play a couple of voicemails. This one is also about the Lysistrata. And this one is a little longer, but it's from Poland. So it took a while to get here. Hello, Tom and Cecil. I've just listened to the episode 628. It was fun, as always.
Starting point is 01:07:52 Anyway, Tom, you mentioned Lizistrata, that not fucking with guys would make them think twice about abortion laws. It was a funny part, and I know it wasn't an educational slash literature part of the show. However, just to recall, remind you that the comedy part of Liz Strata was that it was women that could not stand not fucking. The guys were okay. Women had problem with not fucking.
Starting point is 01:08:32 So that makes this whole funny situation. Nonetheless, yeah, I love your show, guys. Your show. And cheer you around. Glory up. You had mentioned too that it also ends tragically
Starting point is 01:08:47 it's like it's like it's it's just it's Greek yeah no but thank you it's funny it's a tragedy
Starting point is 01:08:53 and also it's a comedy so but like I mean it's Greek so nothing ends well yeah right okay that's fair alright
Starting point is 01:08:59 we also got another message now this is about Tom and I's dream of having a five guys freestyle machine and so here we go this is another message. Now, this is about Tom and I's dream of having a Five Guys freestyle machine. And so here we go. This is another message we got. Hey, y'all.
Starting point is 01:09:09 This is Phoenix. And I used to work at Five Guys and I had to service those machines. And there's actually not a lot of shit behind there because they don't pump in individual sodas the five guys soda machines they have little cartridges that you stick under the machine and then they pump in like the carbonated water and then add the uh syrups from the cartridges as the water passes through based on your selections. So just an FYI. Toodles. Okay, Phoenix. All right.
Starting point is 01:09:47 Okay. So Phoenix, I know that it's just syrup. They use the water from the tap and carbonate that water. I understand that. But we also think like, okay, so Tom and I, back in the day, both of us worked in food and food service. And those old packages, you call them cartridges, they were syrup boxes.
Starting point is 01:10:06 Yeah, back in the day, unless they've changed something dramatically. They were boxes of syrup. They were 50 or 60 pounds a piece. They were heavy. And you would plug in these individual boxes and every single stream of soda on there got a different stream of syrup.
Starting point is 01:10:22 Because it had to have a flavor. It's a flavoring. Now, don't get me wrong. I do think like with those soda machines, there might be like, you know, the vanilla could be used for the vanilla barks or the vanilla Coke. Right. But you still need a barks and a Coke. And then you need a diet barks and a diet Coke.
Starting point is 01:10:36 And you need a fucking orange. And then you need a Dr. Pepper and you need an orange. And you need, you know, all the other weird stuff. A prune for the pepper, Dr. Pepper, I guess. But you know what I mean. I'm wondering though, when she said cartridge. I wonder if they changed it. I'm wondering if they changed it.
Starting point is 01:10:49 I wonder if it's super concentrated. It might be very, very concentrated now where it wasn't before. It was like kind of watery. So I don't know. Cartridge sounds different to me. Cartridge sounds like what you'd put in like a K-cup machine. Right. Right?
Starting point is 01:11:01 Where you make the coffee and you ruin the environment. Like one of those. Yeah. All I know is, we are getting one for the students. Do you know how much those things have to weigh? Can you imagine getting it in here?
Starting point is 01:11:15 You got to pay a person to bring it. They're looking at you like, I'm not bringing that in. Fuck you. At my old work, we had vending machines. Yeah. And I had no idea
Starting point is 01:11:25 how much vending machines weigh. Well, they weigh an absolute fuck ton. Fucking unbelievable. They weigh an enormous amount. And I'll tell you, I know why they weigh that much. It's so that you don't rock them.
Starting point is 01:11:35 So you don't fucking steal shit out of them? So you don't rock them because they're so easy to rock. Then you could just knock that shit out of there. But they put like fucking lead bricks
Starting point is 01:11:44 on the bottom. Because like they wheeled it out. Yeah. And they ruined the floors of the, like they- No shit. Yeah, no shit. So when they wheeled it out, it was so heavy. It was wheeled on cardboard over vinyl flooring,
Starting point is 01:11:57 commercial vinyl flooring. And it carved ruts from the wheels of the mover right into the vinyl. The thing weighed like 900 pounds or some crazy shit, man. All right. Uh, so that's going to wrap it up for this week. We want to of course thank Dr. Novella for joining us. Uh, you can check, you can pre-order his new book, The Skeptic's Guide to the Future or you can buy his book,
Starting point is 01:12:18 The Skeptic's Guide to the Universe. You can find it at theskepticsguide.org. We want to thank him for joining us. He's always a wonderful interview, really smart guy. And so we want to thank him for coming on. No stream this week. And like we said, we'll do the patrons next week. But we're going to leave you like we always do with the Skeptics Creed. Credulity is not a virtue.
Starting point is 01:12:41 It's fortune cookie cutter, mommy issue, hypno-Babylon bullshit. Couched in scientician, double bubble, toil and trouble, pseudo-quasi-alternative, acupunctuating, pressurized, stereogram, pyramidal, free energy, healing, water, downward spiral, brain dead, pan, sales pitch, late night info-docutainment. Leo Pisces. Cancer cures. Detox. Reflex. Foot massage. Death in towers. Tarot cards.
Starting point is 01:13:09 Psychic healing. Crystal balls. Bigfoot. Yeti. Aliens. Churches. Mosques and synagogues. Temples.
Starting point is 01:13:15 Dragons. Giant worms. Atlantis. Dolphins. Truthers. Birthers. Witches. Wizards.
Starting point is 01:13:21 Vaccine nuts. Shaman healers. Evangelists. Conspiracy Double speak Stigmata Nonsense Expose your sides
Starting point is 01:13:31 Thrust your hands Bloody Evidential Conclusive Doubt even this The opinions and information provided on this podcast are intended for entertainment purposes only. All opinions are solely that of Glory Hole Studios, LLC. Cognitive dissonance makes no representations as to accuracy, completeness, currentness, suitability, or validity of any information and will not be liable for any errors, damages, or butthurt arising from consumption.
Starting point is 01:14:14 All information is provided on an as-is basis. No refunds. Produced in association with the local dairy council and viewers like you.
