The Joe Rogan Experience - #641 - Sam Harris

Episode Date: April 29, 2015

Sam Harris is a neuroscientist and author of the New York Times bestsellers The End of Faith, Letter to a Christian Nation, and The Moral Landscape. His latest book, "Waking Up: A Guide to Spirituality Without Religion," is available now.

Transcript
Starting point is 00:00:00 Sam Harris, ladies and gentlemen. All right, there we go, we're live to the world. You're not gonna do a read? I don't do that anymore. Oh. I do it before or after, that way the conversation isn't garbled by five minutes of ads. Right. It just got annoying, you know. No, that's a good call. Yeah. Well, there's two different versions of this. The one that goes up on YouTube has no ads in it, unless YouTube puts up ads, and the one that goes on podcasts, or iTunes rather, has the ads for it. So there you go. So we could just talk. Yeah. Well, good to see you. Good to see you too, man.
Starting point is 00:00:44 Absolutely. It's an interesting time. I know you're an MMA fan; you know about the Jon Jones situation? You know, I've heard rumors about it, but I don't actually know how far the misbehavior runs. So he got stripped of his title today. Wow. Yesterday, actually. They announced a new title fight between Daniel Cormier and Anthony "Rumble" Johnson. They will be fighting for the now-vacated title. Jon Jones is likely going to jail. So it was a hit and run thing? Hit and run. Crashed into a pregnant woman, allegedly, I should say.
Starting point is 00:01:13 It broke her arm. She was rushed to the hospital, but pregnant. It's got to be terrifying for her. A car crash and smashed by a guy who runs a red light and then flees the scene of the crime. And he drove away or he ran away on foot? Ran away.
Starting point is 00:01:29 Car was wrecked. Wow. Yeah. The car's wrecked. Um, you see the, the images of her car. I couldn't imagine unless he was driving a fucking Humvee on how he could get away. It's pretty bad. Yeah.
Starting point is 00:01:38 Was it a DUI or was, we don't know because he disappeared for like 24 hours at least, probably even more, a couple of days I think he disappeared for like 24 hours at least, probably even more. A couple days I think he disappeared for, which, you know, obviously the speculation would be that he was on something that he would worry about getting tested for for 24 hours or 48 hours or whatever it was. Yeah. I mean, I think of how desperate a move and hapless a move that is to run away from the car that you're leaving in the scene, which is obviously traceable to you. It's not like you're getting out of the problem worse, and you're leaving an injured person there, too Yeah, it's it's disaster, but it's worse because he actually ran back to the car stuff out of the car and then ran away again ID'd by a cop right Jesus yeah
Starting point is 00:02:20 Disaster all across the board, but not what we're here for. No. But I know you're an MMA fan, so. No, we have it. But bring it up. Yeah, I actually have a question for you, though, on a close topic. And we have a list of questions that, as you know, we got from Twitter. But McGregor or Aldo? Who knows? That's my answer for almost every fight.
Starting point is 00:02:40 It's going to be interesting, though, right? It's going to be a very interesting fight. We've never seen McGregor in with anybody remotely as talented as jose aldo not even remotely it doesn't mean he can't win because everybody that mcgregor has been in with he's steamrolled i mean he's steamrolled really talented guys like dustin poirier and just he's he's fucking good yeah he's really good he's really good really confident and he he really fucks with people's heads he's so confident and he really fucks with people's heads. He's so confident and so good at talking. And so there's so much charisma about him that I think it's intimidating to his opponents. I think when they get in there with him, his just overwhelming belief and belief is an
Starting point is 00:03:16 incredibly powerful thing. If you really, truly believe that you are the best and you say it, your opponents have to wonder. Because everybody questions themselves, especially in the world of fighting. Because no one is, if you were born a rhino, okay, and you were about to get into a fight with a parakeet, you'd absolutely be confident. Because you've always been a rhino and that parakeet is just a parakeet. That parakeet's fucked. But a person is a work in progress and not just from your fighting style, but your ability to manage your own mind, your ability to manage insecurities and anxieties, just the existential angst that comes with being a human being that's navigating their way through this complicated world.
Starting point is 00:04:06 There's variables. There's days you're up and days you're down, and then you add into that martial arts. And you don't get good at martial arts unless you get your ass kicked. There's only one way through. I mean, you can be one of those John Jones types that's unbelievably physically talented and have a leg up on a lot of people. But even John had to get his ass kicked You had to he had to he had to compete in wrestling against guys who are better than him He had to learn martial arts. He had a spar. I mean, there's gonna be days you're up and days you're down
Starting point is 00:04:36 There's no way but so when you get in there with a guy like Conor McGregor We're not here to take part All that crazy, he's, fuck, this guy's overwhelming. It's like, if you can put on a show like that, if you can put your peacock feathers up and puff up your back hairs, get those bristles up nice and high like a cat when it's angry and hunch your back up, there's a reason why that exists in nature. It's beyond what Muhammad Ali used to do.
Starting point is 00:05:03 It's going to be fascinating psychologically to see him when he loses. If when that day comes, it'll come eventually. Well, he has lost. He's lost to a guy named Joe Duffy, who's now in the UFC. And he's very good. This guy, Joe Duffy, is fucking very good. But has he lost since his rise? No, he has not lost since he's been in the UFC.
Starting point is 00:05:22 No, he has not lost since he's been in the UFC. But in all fairness, the one style that he has not faced, which is the most difficult style, he's never faced a wrestler. He's never faced a guy who can get in there and get him to the ground and outwork him and just sap his energy. Wrestlers, they get on top of you and you can't get them off and you get exhausted trying and they're punching you in the face and elbowing you in the face and punching you in the body and constantly trying to strangle you. And then the round is up and you get up and the next round comes and they do it again. Boom, you're on your back and boom, you're getting punched in the face. He's never faced a guy like that before. All the fighters he's faced have chosen to stand up with him. And he's a very
Starting point is 00:06:05 high level boxer. He was a amateur champion as a boxer. He's got very good jujitsu skills, very good Muay Thai, got very good everything. We've just never seen him against a top flight wrestler. But Aldo's not that, right? Nope. Aldo's not that. But Aldo is a world-class Brazilian jujitsu fighter. Like people who are not aware of his ground skills. Aldo beat a world-class Brazilian jiu-jitsu fighter. Like, people who are not aware of his ground skills. Aldo beat a guy named Cobrina in an actual jiu-jitsu competition. Wow. Which is very high level. Yeah, yeah.
Starting point is 00:06:32 Cobrina's world championship caliber. So, Aldo is going to present him with some things inside the octagon that he's never faced before. However, Aldo has been fighting professionally for a very long time. And like a race car that doesn't ever get its tires changed or doesn't ever get its suspension redone, as a human being, your body can literally only take so many miles. There's only so many times you can go to war. There's so many sparring sessions you can take part in, only so many wrestling sessions you can take part in. There's just a certain amount your body can take, and we don't know when that number is. When you reach that number, though, that's it.
Starting point is 00:07:16 There's no turning back unless you're using testosterone or growth hormone or some things that turn your body into a superhuman sort of experiment. Unless he's doing that, which we've seen from Vitor Belfort. Like, Vitor Belfort's the only guy who actually got better as he got older. Right. Talking about a guy from UFC 12. The miracle of science. It's exactly what it is. It's absolutely fascinating.
Starting point is 00:07:37 And you can't completely discount his training, because his training is what made him better. But his ability to recover was essentially supernatural. He's fighting in the same car that Jones was supposed to fight. He's fighting Chris Weidman, but now Nevada has made testosterone replacement therapy illegal. So his last three performances, which are some of the best performances of his career, the knockout of Michael Bisping, the knockout of Luke Rockhold, and the knockout of Dan Henderson, those were all while he was on testosterone. Yeah, okay.
Starting point is 00:08:09 Yeah, so things get weird now. Now you see what a 37-year-old man really looks like, you know, optimizing his hormones as best he can naturally, hopefully. Right, well, 37 sounds young. It sounds young to us. Yeah, I would like that level of testosterone. Well, you know, it's also the level of testosterone of a regular 37-year-old man versus the level of testosterone of someone going through a camp. Right. When you're going through those camps, it's absolutely brutal.
Starting point is 00:08:36 Like John Jones and Daniel Cormier both got tested randomly before their title fight, and their testosterone levels were so low, people were wondering, like, hey, maybe these guys are doing something. Maybe they were doing steroids, and that made their testosterone levels drop. But what's most likely, it's testosterone to epitestosterone. It was very low testosterone levels. Most likely, it's just the brutality of training. It's so hard. It's so hard to do. You're showing up every day exhausted and your muscles
Starting point is 00:09:08 are sore and your body's exhausted and you got to go through it all again tomorrow. And you're getting kicked and punched and you're lifting weights and you're doing sprints and you're jumping up on boxes. And then the next day, all over again, your body just can't keep up, especially when you get into your thirties. Yeah. Yeah yeah. All right. So that's the answer to that. I will watch it with interest. It's going to be very exciting. Let me know if you want to go. It's in Vegas.
Starting point is 00:09:29 Yeah, we'll talk about that. We'll talk about it. Sam Harris, Ghost of the Fights. As a neuroscientist, does it disturb you at all when you're seeing these guys getting their heads rattled? When you're very aware of what's going on behind the scenes in the, in the actual brain itself? Yeah. Well, I just, I just talked about this on my blog with this, this writer, Jonathan Gottschall, who I told you about at one point. Yeah. He's, I'm trying to get him on the podcast. We're working out a date. Yeah. He's great. And, um, so we had a conversation, which, which we published the transcript
Starting point is 00:10:02 of, but he, he's got this book, uh, The Professor in the Cage, where he's, he's an academic, he's an English professor, and he decided to get really into MMA and, and fight a real, you know, cage match. Um, and so it's a, it's an interesting book. And so he and I were talking about the discomfort we have just seeing people get brain damage in real time for our amusement. And, yeah, it does make me uncomfortable, but I'm, you know, it's also part of what's thrilling. I mean, you know, I'm as thrilled by the prospect of a knockout, too. It's kind of, it's not even a conscious thrill. It's just when things start going that way, you feel your own, you know, testosterone or something kick in.
Starting point is 00:10:46 you know, testosterone or something kick in. And, um, but I, I mean, his recommendation was that, and I'm sure he's not the only person who's thought of this. He thought the gloves should come off and the gloves are making it, making it realistic to just send endless overhand rights and other crazy punches, you know, full force into people's heads for, for, you know, 30 minutes. Whereas if you, if you were to, if it was a bare-knuckle fight, you couldn't really, it'd be fewer knockouts, but you couldn't deliver that kind of brain trauma because you'd be breaking your hands.
Starting point is 00:11:15 Now, obviously, there are things like elbows and knees and kicks, and so people would still be getting hurt. But what do you think about that? Would that change if it were better? I'm a big advocate of that, and I've said it many times. I've said it on the air.
Starting point is 00:11:25 I've talked about it on this podcast. I think it's very unrealistic the way you're allowed to not just put gloves on, but also wrap your hands up, tape your wrists. Your wrist is a joint. It flexes. And when you tape it up and they get that sucker nice and tight, then it becomes something completely unrealistic. tight, then it becomes something completely unrealistic. If you punch someone with your hand without a glove on and with no hand wraps, it's way easier to hyperextend your wrist or twist it or tweak it or break it. Your hand breaks. You hit someone in the forehead. Foreheads are far harder than your knuckles are. Most likely you're going to break it unless you hit them on the nose,
Starting point is 00:12:01 in the eye, on the jaw. You're going to hurt your hand. And you can only throw so many punches like that. And even just hitting a heavy bag without being wrapped up, you just screw up your wrist and your hands. Yeah, you have to really learn how to tighten everything up, and you have to really strengthen your wrist, and you have to strengthen your hand muscles. It's a completely different thing, which is why the karate students would hit a makiwara, which is a board that's wrapped with rope. And the idea behind punching that board wrapped in rope over and over again is you develop these intense calluses all over the front two knuckles, which is really the only way you're supposed to hit someone. Those are the knuckles that not reinforced nearly as well,
Starting point is 00:12:51 especially if you have larger hands, like your hand, like my hand, spreads out past my wrist. It doesn't go in a straight line from my wrist to the pinky. It actually goes out to the side. So that knuckle's not enforced by anything. If you punch someone with a boxing glove on with that knuckle, you're fine. If you punch someone with an MMA glove on with that knuckle, you're fine. If you punch someone with an MMA glove on, it's less supported. If you punch someone bare knuckle, you are very likely to break your hand. If you punch someone full force on the forehead and you hit with a pinky knuckle, you're very likely to break your hand.
Starting point is 00:13:18 It's a high possibility. Also, it also impedes your grappling to have those gloves on. Right, right. Marcelo Garcia. That was his point, too. Yeah, it was a huge issue. Marcelo Garcia fought in Japan, I believe it was, had this guy's back and couldn't finish him. Right, the guy was just grabbing his gloves.
Starting point is 00:13:34 Just grabbing his gloves, holding onto the gloves. The guy just worked on his defense. And the gloves, like Marcelo's specialty is the rear naked choke. And the rear naked choke, a big part of it is sliding your hands underneath the guy's chin. And you have the glove there. There's all this extra padding that makes it very difficult to get your hand in the proper position to get the choke right. Right, right. And the back of the head.
Starting point is 00:14:00 It's also very difficult to get the glove in the back of the head. So a lot of times you'll see in an MMA fight, guys who use poor technique and still finish the choke, well, they'll use their palm on the top or the back of the head because they can't do this. Right. They can't do this movement where it's actually the back of your hand that should touch the back of your opponent's head. This is all like to someone who doesn't understand jiu-jitsu. This is very complicated. But I agree with them.
Starting point is 00:14:24 I think that gloves would help a lot. Yeah. Gloves off i agree with them i think that uh gloves would help a lot yeah or the gloves no gloves yeah i think would help a lot but i think it's also it would be it would be beneficial for everyone to have some intensely comprehensive scans done on a regular basis i don't know if that would be prohibitive financially i don't know how much that would cost I don't know if that would be prohibitive Financially, I don't know how much that would cost I don't know how much that would even reveal because apparently one of the things that is troublesome for these NFL players is When they die and they do autopsies on them and they they reveal damage that they had no idea before I mean, I don't know how much like an fMRI or an MRI can detect When it comes to the actual damage.
Starting point is 00:15:06 Well, it can, but to what end? Because you know you're getting it. If you're just going to be in a job where you have to get hit in the head, forget about competition, just training. I mean, these guys train hard, as you know, and so they're just getting hit in the head to prepare them to get hit in the head in the match. You're just, it's not even, it's like smoking. It's like the causal linkage between getting hit in the head and brain trauma is 100%. It's just a matter of how much you individually, by dint of luck, can take until you actually have damage that matters.
Starting point is 00:15:47 dint of luck can take until you actually have damage that matters. So, you know, I obviously haven't had an experience anything like an MMA fighter, but I regret all the head injuries I took just training as a, I mean, now in martial arts, I just don't let myself get hit in the head. But as a teenager, I got hit in the head a fair amount. And I played soccer and headed the soccer ball. And that always felt totally weird. Did you play soccer? You head a soccer ball, you immediately get a kind of a rusty taste in your mouth. And it's just unlike anything else that happened that day, except the other time you got hit in the head. Isn't that crazy that no one would ever think that soccer can somehow or another give you traumatic brain injury? Because it does knock you out. Until recently, I would say like in the last couple decades,
Starting point is 00:16:30 we had this erroneous assumption that you had to get a knockout. You had to get knocked out to have brain damage. But just little thuds, just a little. Like I was talking to a doctor who said that water skiing can give you brain damage. Water skiing, just the bouncing on the waves. Oh, just the bouncing, yeah. Just wave riding, you know, when you're doing that, like that bouncing, that stuff can give you brain damage.
Starting point is 00:16:52 Right. Like your brain gets rattled around inside your head, the connective tissue dislodges, and it doesn't heal back. Yeah. Well, I mean, so I spoke about this with Jonathan, too, that there's obviously all of these sports and just, you know, forms of recreation that entail some risk of injury and death, right? And people should be able to do these things informed of the risks.
Starting point is 00:17:17 And so, you know, cheerleading, the example he brought up is cheerleading. Cheerleaders sometimes hit the ground and just are, you know, fantastically injured. cheerleading. Cheerleaders sometimes hit the ground and just are, you know, fantastically injured. So all these things that don't necessarily seem like high testosterone, high risk, you know, just foolishly reckless sports can be very dangerous. And skiing is very dangerous too. And rock climbing, there are things that are even just non-violent that don't entail much risk of injury until they kill you, like free solo rock climbing. You're climbing, everything's fine. Maybe you've hurt your hands in the past, but then all of a sudden you're dead
Starting point is 00:17:52 because you just went up 500 feet without a rope and fell. So there's all these kinds of risks that people can take, but the problem that I think differentiates striking sports from even something like football is that the progress in the sport is synonymous with the damage. So, you know, if you and I are in a boxing match or a kickboxing match hitting each other, every instance of successfully degrading each other's performance with respect to the head, hitting someone in the head, is synonymous with delivering injury to the brain. It's not incidental like in football where I was trying to tackle you,
Starting point is 00:18:37 I was not hoping to hurt your brain, but you fell down hard and it did. This is just a trade of brain damage. And yeah, so it's interesting ethically. I don't know. Again, I think people should be free to do it, but I think we should be informed about it. And I would certainly vote to pull the gloves off and make it more, it would just make it more realistic combatively, too.
Starting point is 00:19:05 Insofar as you want to see what works combatively, I'm more interested to see what two people can do just with their bare hands than when they've got these, you know. Most certainly. And, you know, one of the interesting things that's been brought up online over the last couple days about this John Jones situation is this irrational, erratic behavior. Does this imply that, or does it somehow or another, is it correlated to brain damage? I mean, it certainly can be. Can be, right? I mean, that's one of the issues with brain damage, impulsive behavior. Yeah. Especially in the frontal lobes because your frontal lobes regulate your, you know, emotional and behavioral outbursts. And when those connections, when either the cell bodies or the connections between the gray matter and the frontal lobes and your, your limbic system
Starting point is 00:19:58 and your basal ganglia and other areas in the brain, when that damaged yeah you you have these classic impulse control problems where you just you know you just reach out and you know grab the woman standing next to you at starbucks because you you know couldn't dampen the the impulse to do it you know that's hard for people to grasp because every i mean and again this should be really clear. I am without a doubt not trying to let him off the hook. What he did was horrible. If it was someone in my family that he hit with that car, I would be unbelievably furious. I'm incredibly disappointed in him.
Starting point is 00:20:35 I think the UFC absolutely did the right thing in stripping him of his title. And I think law enforcement is going to do the right thing by putting him in jail. I mean, they're going to. You can't do that. You can't hit someone with a car and leave the scene of the crime that it is a crime. Um, but there are things that people do because they have brain damage. And that's where the real question comes up is obviously they're responsible ultimately for their own actions, but what is it that's responsible for making them do that action? And we could, you know, we had this long conversation
Starting point is 00:21:11 once two podcasts ago, I think it was about free will and determinism. Um, these are variables that come into play when it comes to the ultimate actions that you choose to, to do the ultimate actions that you choose to do, the ultimate movements that you choose to take, the thought processes are unquestionably dependent upon the brain itself. And if the brain is getting damaged, and if we have proven that some of the issues with people that have brain damage is impulse control, you've got to wonder, man, when you see fighters do wild, crazy shit, how much of that is due to getting
Starting point is 00:21:48 just bonked in the fucking head all the time? Yeah, except for me, it breaks down a little differently because my views on free will change the picture of how I view moral culpability in those situations. So even if we knew his brain wasn't damaged, right? So he, let's say had never got hit in the head or we did a scan on him before
Starting point is 00:22:09 the car accident and we saw, and it's the perfect scan. It's the scan of, you know, that we'll have 50 years from now if we don't fuck ourselves up. And, um, so we just know that he's got a totally healthy brain by whatever metric of health we have, but he got into that situation and behaved exactly as irresponsibly as he did. His behavior still is the result of causes that he, as an agent, isn't ultimately responsible for. Now, this has certain, this has sort of the punchline, which has certain consequences. But one of the consequences is not that we can't respond to his misbehavior, that he can't, that we can't put him in jail, that we can't, that we couldn't
Starting point is 00:22:56 have intervened at any point along the way to have made him a better person. And there are, there are, there's a difference between voluntary and involuntary behavior, even if it's all just causally propagating from the moment of the Big Bang. But I do view it as, I think the brain damage case is a little bit of a red herring because it's, on some level, it's all just unconscious causes that the person himself can't ultimately account for. So there are situations in which he, I'm sure in his life, behaved like a mensch, you know, he behaved totally responsibly. And there are situations where he behaved like this. And he can't account for the difference in those two cases. He can't account for why. I mean, let's just say it's a fact that if he had gotten one hour more sleep the night before and hadn't had a fight with his girlfriend and had, you know, his blood sugar level was a little bit higher and hadn't had a friend who had told him to drink one more beer,
Starting point is 00:24:01 which he normally would have resisted but couldn't because of all the other factors I just mentioned. And that is the difference that made the difference that caused him to be this total misfit on the road. Whereas if you had just tweaked those dials a little bit, no fight with a girlfriend, one more bite of food in the morning, he would have acted as you would have acted in that case. So ultimately... Let's say that's true. There's something...
Starting point is 00:24:31 There's a kind of bad luck component to all of this creeping in. There's a concept of moral luck, which is due to the philosopher Thomas Nagel, who has done a lot of interesting work, half of which I really agree with, and some of it I don't. But the concept of moral luck is that it does seem unfair that there are many situations in which people create immense harms
Starting point is 00:24:58 doing stuff that you and I have gotten away with. They're not worse people than we are. Like, you know, you and I have both driven when we shouldn't have driven. You know, we've had one beer too many. There are things that we did, or you look at a text, you know, and you know you should never look at your phone
Starting point is 00:25:19 when you're driving, but you decide, oh, I'm expecting a text, and you look, and there are people who are looking at that text right now and just, you know, killing some child on the crosswalk, right? And their lives are going to be ruined, and, you know, they're going to go to prison, and they're exactly like you and me, right? So there's an aspect of luck here, but the luck actually propagates backward into the kind of brain you have, the kind of upbringing you had, the kind of parents you had, the kind of the fact that you got hit in the head as hard as you did or didn't. And no one has made themselves.
Starting point is 00:25:52 So I'm a little bit more I'm less judgmental about some of these things, given my view of free will. But I'm not it's not that I'm not interested in making the interventions that would make a difference. Whatever we could have done to have gotten him to behave differently, we should have done. Whatever we need to do now to him to make society better and to make him better and to get restitution for the woman, we should do all that. And so this does entail locking up certain dangerous people. It does entail, you know, we have to keep ourselves safe from people who are going to reliably act badly. And I don't know where he falls on that spectrum, but it's just, it's not the difference between the feeling you get when you hear, oh, it was brain damage.
Starting point is 00:26:40 I sort of have that feeling about everything. You know, if he gets a brain scan, if he goes to trial now, he gets a brain scan and we find that his brain is just massively damaged in all the right areas that would have eroded his impulse control, right? That would seem to let him off the hook a little bit. He would look like someone who was unlucky more than he would look like a bad person, right? And I sort of see bad people as unlucky, too. I mean, I recognize that there are certain people who are classically bad people. There are
Starting point is 00:27:12 psychopaths who you just, not only can you not rely on them, you can rely on them to be bad actors, you know, so you have to be in a posture of self-defense with respect to these people. But I do view them as unlucky on some fundamental level. I share that thought, and I share that thought much more as I get older, and I have a more philosophical point of view when it comes to people that live in impoverished neighborhoods, especially like this Baltimore thing that was going on. We were just having this conversation the other day about, or last podcast, about these kids that robbed the RT reporter. I don't know if you've seen the video of it.
Starting point is 00:27:53 There's an RT reporter interviewing these kids that are on the street that are causing all this havoc in Baltimore, and they start swarming this reporter, and then they rob her and take her purse and take off. And I'm like, imagine being one of those kids. Imagine being in that environment. Imagine being, you want to talk about determinants, imagine being born into this crime ridden environment. Who knows what kind of family you have? Who knows what kind of influences you have? Who knows what kind of experiences that you've had that you've had to react to and protect yourself from and develop this hardened, thick
Starting point is 00:28:29 skin and attitude and also survival instincts. And you also, your family or the people that you can reliably count on are the people that you hang out on the street, your gang. I mean, that is the big thing with gang violence. One of the big things with gang violence, one of the dirty secrets of it is that a lot of it comes from broken homes. When people don't have a strong family environment and people they can count on and trust, they don't have anybody that's there for them.
Starting point is 00:28:55 And then they find someone that's there for them in the gang. The gang becomes their new family and they will do anything to keep that love, to reinforce that love. Yeah. And we all want to look at it as they're criminals. They should be home by 10. There's a curfew on the street. It's completely unrealistic. And if you were in their point of view, or if you were in their life, rather, and if you saw it through their point of view, you would probably see exactly what they see.
Starting point is 00:29:19 You would look at life the way they look at life. Also, there's another variable here, which is just the influence of mob behavior. People will behave in crowds in ways that they wouldn't otherwise. Why is that? What is that? What's the mechanism behind that? Yeah, it's a fascinating... Well, I can't speak to the mechanism neurologically,
Starting point is 00:29:38 but it's a fascinating social phenomenon that has been thought about for at least a century. fascinating social phenomenon that has been thought about for at least a century. There was a philosopher, Elias Canetti, who wrote a book, Crowds and Power, which is very interesting on this topic. A crowd is almost like a fire. Once it gets started, the mob will behave by its dynamics that aren't really explained by the individual intentions of the individuals in the mob. And it's actually, it was a great book. Did you ever hear this book, Among the Thugs by Bill Buford? He, it was probably like 20, 25 years ago. He's a really nice writer. He edited this literary magazine, Granta, I believe, back in the day. And he got fascinated with the phenomenon of soccer hooliganism. And he went to the UK and just started hanging out with these just diehard,
Starting point is 00:30:34 I guess they were, I don't know, Manchester United or Arsenal fans. But he just got in with these guys who were normal guys, like plumbers and electricians and people who had real lives. These were not just teenagers who were thugs. They were people who had families, but soccer was their life, and they became soccer hooligans. But what's brilliant about the book, and again, it's been at least 20 years since I read it, so I could be a little off in my recollection here. But what I recall is that he wrote it in such a way that these guys he was hanging out with were really the protagonists. He got you in on their side for about 75 pages or so. misbehaving when they go to their first game
Starting point is 00:31:23 against the Italians and form a mob and start marauding the streets and bash kids in the head. They start behaving like sociopaths in this crowd. But he catches you out totally because you're on their
Starting point is 00:31:39 team for about 75 pages. You've identified with them. You've laughed with them. You've bonded with them as he did, and then he reveals the level of thuggery that they're capable of as a mob. And it was really, I recall it being a fascinating book, but it's just a fact that people will do in a crowd.
Starting point is 00:32:01 When you see, part of it's the social proof situation where you see everyone doing something and that on some level gives you license to do it. And there's kind of a, it's just contagious. When you see people breaking windows or jumping on a car or turning over a car or looting, it takes less of any individual to participate in that. It takes less for you to go in and grab a television set when you've seen a hundred of your neighbors do it, and you wouldn't have that morning just woken up deciding to rob the store yourself. So, we all like to think
Starting point is 00:32:45 we're the sort of people who would stand against the mob we would be the German who would have hid Jews in our basement and stood against the Nazis and you can multiply those cases ad nauseum but what a lot of psychological
Starting point is 00:33:01 science shows is that yeah there are those people, there are the people who will stand against the tide of imbeciles who are going to do something heinous. But most people are part of the tide. And it's just a very common phenomenon. The social license, that's a really interesting way to describe it because that is what it is, right? I mean, isn't that a big part of war? I mean, a big part of war is doing things that you would never do on a normal basis, in a normal scenario. On a regular basis, you are asked to put bullets into other human beings.
Starting point is 00:33:35 One of the things that I thought was really interesting about the controversy about American Sniper, the Chris Kyle movie, was he was talking about what it was like the first time he killed someone. And that he, this is in the book, I don't believe this is in the movie, but that he had this feeling before he shot someone, like, is this okay? I can actually do this? Is this okay to do this? And then he grew to enjoy it. And then it became commonplace and normal, and he's like, yeah, they're bad guys, and I'm going to shoot them. But this, the license license the social license and that is a accentuated with this this mob mentality that's it's mean you're you're a part of an army and you have an enemy and it's
Starting point is 00:34:20 the life or death consequences a life or death scenario that you're a part of. The whole thing is escalated. It's the highest level of that type of behavior that we have in society, in our culture today. Well, interestingly, it takes a lot to get people to kill in war. I think there's some myths around how easy it is for soldiers to shoot at the bad guy, but there have been studies done in prior wars where some shocking percentage of soldiers either never fired their guns or fired above their targets on purpose. They didn't want to kill anyone. And that's been, and so some of the discipline of training soldiers has been against the grain of those tendencies, trying to get people to actually try to kill the other person. And, you know, I think we've become more successful
Starting point is 00:35:21 probably at doing that. You know, this is not something I know a ton about. I just know that this research is out there. And the main dynamic, I think, with soldiers is you are trying to keep your buddy safe, and he or she now is trying to keep you safe. And that, you know, they're not only firing at you, trying to kill you, they're trying to kill your buddy or just did, or just, you know, wounded him or her. And now that's just a very simple dynamic that you're just you've bonded with the person to your right and to your left. And you guys are really in it together. And it's a matter of keeping you just going home safe, you know, and so what's going to what is required in that situation well you got to shoot
Starting point is 00:36:05 at the people in the other trench um and so every so now obviously there are aspects of war making that don't fit that mold and some of the more disturbing aspects that actually require less of us in terms of like you're dropping bombs from 30 000 feet or you're flying a drone from an office park out of outside of las ve of Las Vegas or wherever they are. And so we find that sort of telescopic approach to war different ethically. And I think it is. It's different war unleashes in most people this bloodlust that they're struggling to contain in the civilized world. And it's just that once the tap is open in a foreign country, you just have, you know, rambos everywhere. People are really conflicted about what they do.
Starting point is 00:37:03 And a lot of people try to not do anything of consequence. There's a great episode of one of Dan Carlin's podcasts, one of the hardcore history podcasts about World War I. And I believe it was about the Germans and the English that they had been in battle with each other. And they had sort of without verbally agreeing to this, they had sort of agreed to a ceasefire during lunch. Oh yeah. Yeah. It was fascinating.
Starting point is 00:37:31 Do you know the story? Yeah. Yeah. Yeah. Cause I, cause I've heard, I knew the story, but I've also listened to Dan's podcast, which is, I think I got from you. It's just fantastic. He's the best.
Starting point is 00:37:42 It's amazing. I think that all of, all of them are great, but that series on World War I is just a masterpiece. It's just really, you know, he's doing something remarkable there. But, yeah, this trench warfare was the most brutal. It was just this, you know, horror compounded upon horror endlessly for years to no evident gain. I mean, these people, they're fighting for yards of ground forever, and just tens of thousands of people are dying. And they're basically camped out on the decomposing bodies of the people who died before them. And it's the most horrible version of warfare you've ever heard about.
Starting point is 00:38:28 And then there's this no man's land between the trenches where people who will run out there trying to make an incursion into the enemy trench will get caught on barbed wire or they'll get shot. And then you have this spectacle of of people of injured and dying people in the no man's land between the trenches you know howling for hours and hours and hours in misery and when someone goes to try to rescue them they get shot and and so but there were periods where the the two sides just uh agreed that this was just, and again, how that was communicated was kind of interesting.
Starting point is 00:39:07 I don't actually recall the details there, but it was kind of a tacit agreement that emerged where, okay, we're going to let you get your, we're not going to shoot at you when you get the injured person or the dead bodies. And there was one Christmas, I believe,
Starting point is 00:39:22 where they just basically went out and exchanged cigarettes and had an impromptu soccer game. And they basically called the war off at a certain point and then got chastised by the higher-ups for doing that. And then the war started all over again. But, yeah, they actually socialized at one point. It's amazing. It really is an amazing depiction of what must have been an impossible place to be in. To imagine being a person standing on the decomposing bodies,
Starting point is 00:39:53 being forced to shit in a coffee cup and throw it over the top of the trench, and know that no one's getting out of this. You might be one of a thousand people that's going to die in the next couple hours. You might be, you know, you, you might make it to next week. You might not. I mean, and just the stress that you're, you're dealing with the, the non-human aspect of that life. This is not a normal thing that you ever expected to deal with. There's not a normal set of scenarios. It's not your, your, your brain, the way you grew up, you're not prepared for this life. You're just thrust into it and it doesn't make any sense. And then to have that all sort of
Starting point is 00:40:35 eroded to the point where on Christmas you guys are hanging out and then the, the generals come in and say, fuck this, you got to kill those people. And next thing you're killing each other again. Like, so you had this brief glimpse of, you know, some utopia inside of war. Yeah. Well, what was so weird about that war in particular was that the run Europe where this was just looked at in the rosiest possible terms. Like this is just the true glory of manhood being expressed. Finally, we have a, it's like it was approached like the World Cup or something. It was like pure exuberance around the prospect of fighting this war in many quarters. And you'd be surprised if that ever happened again. So it's a little bit like
Starting point is 00:41:31 what's happening with jihadists globally, but they have beliefs that cause that to make more sense. I mean, they believe they're going to go to paradise when they get killed in this war. But it's hard to map your own psychology onto the cream of of english youth where they were just just going off with this this level of enthusiasm having no idea i guess part of it was they had no idea just how horrible it was going to be uh but they um yeah the it's you know you read homer and and war is this glorious thing. The war ethic you get from ancient civilizations is something that we have, I think, largely outgrown, but you can really see it in World War I. I had that similar attitude post 9-11, especially when the World Trade Center towers went down and there was this flag waving fewer in America, unlike anything I'd ever seen. I remember post 9-11, I remember driving down the street, leaving the street near my house and entering into this main street and every car, every car had an American flag. Every car.
Starting point is 00:42:46 It was insane. I mean, if you did, I didn't have an American flag. I was, like, looked at odd. You know, like, this is an unprecedented time in history. And then all these people were signing up for war. All these people were signing up because they wanted to go over there. They wanted to fight the good fight. these people were signing up because they wanted to go over there they wanted to fight the good fight and then you started hearing things from people like like pat tillman who left a career
Starting point is 00:43:09 as an nfl player right very promising career as a pro athlete and all of a sudden he's over there in this war and his impression of it was that it was a huge clusterfuck it was nothing like what he wanted it was nothing like what he expected and he was very verbal about that very, very openly critical about that. And a lot of people think that's one of the reasons why he died. You know, there's a giant conspiracy theory that they killed him because he was talking and he's killed by friendly fire. He was killed by American troops. And there was the conspiracy theory was that they shut him up because he was so openly critical of what was going on over there that it wasn't what he thought it was going to be. He thought it was going to be this incredibly organized group of heroes that went over there to fight these evil bad guys that are hell-bound to destruction and suicide bombing their way into America to kill the American dream.
Starting point is 00:43:59 I mean, this is the idealistic version of it. Well, I think there is an idealistic version of good and bad actors in this case. It's just the reality of fighting this war is so messy. And that it's, yeah, Afghanistan, I think, was pretty clear cut morally that we had to do something against al-Qaeda. And by definition, once the Taliban wouldn't release Osama bin Laden to us, we had to do something against the Taliban. I mean, that's where he was and they were sheltering him. And so I didn't feel ethically conflicted over that. But that was such a mess. I mean, going into Afghanistan, the reality of what it takes to
Starting point is 00:44:45 go into Afghanistan and kill the bad guys is so messy that there's arguably no good way to do it. There's no way to do it, which at the end of the day is going to look like a success. And so, maybe that's something we're now learning, that you have to, this is so messy that you have to be, That you have to, this is so messy that you have to be, you really have to pick your moments and be far more surgical than we've ever been inclined to be. And not even think about defeating the enemy, ultimately, but just kind of keeping the enemy at bay. Containing this problem for long enough to change minds or change culture in some other way. Because even in this case, I think it was very clear cut. You know, killing members of Al-Qaeda was a good idea. And I think it's still a good idea. It's just, you know, a drone strike kills some of them and also kills some of the hostages,
Starting point is 00:45:40 as we now see. And it also kills some of the people standing too close to the the bomb blast and it's ethically messy you know so it's it's but i think there are instances of it that are certainly necessary but someone has to be thinking very clearly about how we proceed in a world where there really are people trying to destroy us it It's not that there's no bad guys. There are bad guys. Isn't that where the foreign policy argument comes into play? Because some people say those bad guys are bad guys because of U.S. foreign policy, because of the way we have intervened and dominated natural resources. You think it's confused?
Starting point is 00:46:21 Yeah, yeah. Before we dive into that, I'm looking at a list of topics that were brought up by our Twitter people. And I'm just going to, I'll read the list just so we have it in our heads and you can decide what you want to deal with here. But Islam, anything but Islam, Abby Martin, Abby Martin, Abby Martin. So I think we have to deal with Abby Martin. Okay. Abby, who is a good friend. I love Abby. She's crazy though, uh, in a good way, but she's, she's wild.
Starting point is 00:46:50 And she accused you of, uh, being one of the new atheists with your anti-Islamic rhetoric. And, uh, you know, that's nothing new to you. You've been accused of that in the past. Um, did she misrepresent your point of view yeah did you listen to it yeah i did i did listen to it should we play it or no i don't think you need well you don't need to but what did she say and what what do you not agree with what she said well she said it was really interesting listening to her because so i listened to the whole podcast and she didn't mention me until like the second hour. I'm listening to this.
Starting point is 00:47:25 I'm thinking I'm actually I'm actually having a conversation with you in my head as I'm listening to this. And I'm thinking, man, Joe, you it's kind of remarkable what you are able to do here because you're having a conversation with her. If I from my point of view, you are just drinking from a fire hose of bullshit. Right. It's just like she is what she's saying. There's so much wrong is what she's saying. There's so much wrong with what she's saying. And, but yet you're in a position to have a conversation with her. That is where there's just like a ton of goodwill and it doesn't run to the ditch at all. And you can have a conversation with me in the same vein. But then I was thinking I could not, I mean, I would, I'm
Starting point is 00:48:01 sure she's a perfectly nice person and I would, you know, I would be very nice talking to her. But I have a feeling now of more or less total hopelessness talking to someone as polarized on these issues as I view her to be. And so I was kind of praising you in my mind, thinking, you know, I couldn't do what you're doing here. And at that instance, she just mentioned me, right? So it was like, it was like one of those bad scenes in a movie where the television starts talking to the, to the character. She just, she just kind of called me out and then more or less totally misrepresented my views. Um, so, so she said, she said many things that are just inaccurate, which we can talk about. But in terms of what she attributes to me, she said that I only care about intentions, right?
Starting point is 00:48:54 So that all intentions are all that matter. So if we kill a billion people but meant well, we're fine. And if the Muslims kill a million people, but don't mean well, they're far worse than we are ethically. And that's, intentions are all that matter. And she was, I think in her defense, I mean, I'm sure she's never read anything I've written, but she was reacting to a snippet of a podcast where I push back against some of the things that Noam Chomsky has said. And I haven't thought that I've said, I said in my first book that Chomsky doesn't value the ethical role of intentions enough. And I said something very brief in a podcast that bounced around.
Starting point is 00:49:37 And so that's what she heard. So she misconstrued me there. Intentions are not all that matters, obviously. But intentions do matter. So if someone, if someone stabs you, right, the difference between them doing it on purpose because they want to kill you and them doing it by accident because, you know, they were cutting, you know, you guys were cooking in the kitchen and they didn't know you were there. They turned and they stuck a knife into your belly. It's a world of difference
Starting point is 00:50:06 ethically, right? And the crucial difference is the person's intentions tell you a tremendous amount about what they're likely to do in the future. The guy who stabbed you once because he wants to kill you is very likely going to stab you again until you're dead, right? I mean, he's trying to kill you. Your friend in the kitchen who stabbed you by accident likely going to stab you again until you're dead, right? I mean, he's trying to kill you. Your friend in the kitchen who stabbed you by accident is going to be rushing you to the hospital in the next instance. Right, but this is kind of a disingenuous comparison because, I mean, are you describing the difference between accidentally killing civilians with a surgical strike,
Starting point is 00:50:41 in quotes, of a drone strike versus killing someone with a suicide bomb? Were you trying to kill as many people that are random as possible? So I'm using a very idealized example just to show you that the role of intention is not all that matters because getting stabbed still sucks, right? So if you assume the same stab wound, you still have the same problem. But one of your scenarios is completely innocent and accidental.
Starting point is 00:51:07 Yeah. The other one is murderous intent. Okay. So those are the extremes, right? Right. Then you can have gradations along that continuum, right? Where you have, and somewhere more in the middle would be- You're trying to kill a bad guy and you accidentally kill an innocent person as well.
Starting point is 00:51:19 Yes, absolutely. And it's totally, or you think you've got the bad guy and you've just got bad intelligence and you just, all you kill is an innocent person, right? You could be, let's say you were being totally surgical. You're a sniper. You know, you're going to just kill one person with one bullet, but you've got the wrong person through no fault of your own, right? Or worse yet, the bullet goes through that person and kills someone else, which also happens. So all kinds of scenarios like that. And there's a very common scenario, I think, which is you're bombing the bad guy. You're reasonably sure you're bombing the bad guy, and he really is the bad guy. But you're also reasonably sure that you're going to injure or kill some innocent people, and you're okay with that, because the reality of fighting war is you never get the bad guy standing all alone
Starting point is 00:52:06 500 yards from the next person. So if you want to fight this war with drones, say, you have to accept some level of collateral damage. Now, I don't actually, I mean, I'm not privy to any kind of intelligence. You know, I'm not in those circles and I'm not one of those people. So I don't know just how Obama or anyone in a position of responsibility makes those calculations. What is acceptable collateral damage? But we know that some level of collateral damage is acceptable because otherwise it would be impossible to fight war at all, right? So we know that some level of collateral damage is acceptable just driving on our roads. You know, 30,000 people die every year on our roads. We could
Starting point is 00:52:53 dial that number down to zero, right? If we were committed to no death on our roads, we could get there. We would just all have to drive five miles an hour. Right. But the difference is that when you're driving, you're not intending on killing someone. It's an unintended consequence of transportation. There's a big difference between that and the unintended consequence of violence, which is definitely deliberate. Let's get into that. Let's see if there is.
Starting point is 00:53:16 This is an unintended consequence, unintended but foreseeable consequence, in fact, certain consequence of our keeping the speed limit where it is, right? You and I both know, like, so let's say we could vote on this. You know, what do you want the speed limit to be? It's, you know, let's say it's 75 miles an hour. We know that if we reduced it to five, there'd be some other costs, and I'm sure there'd be some other ways in which people might die. You know, it would be harder to get, you could, you know, an ambulance getting to the hospital would be hitting a traffic jam. And so some people would die on the way to the hospital site. But leave that aside. We would save tens of
Starting point is 00:53:52 thousands of lives every year if we just took all the fun out of driving. And, or just forget about that. Forget about, let's keep the speed limit exactly where it is. but no matter what car you have, there's a governor on it and you cannot go past the legal speed limit ever, right? So if you're in a 25 mile an hour zone, you know, whatever your car is, you've got a Porsche or whatever you like to drive, it can only go 25 miles an hour, not a mile an hour more, no matter how you hit the throttle. And that would be true in every zone. There are people would resist that. And their reasons for resisting it is just that it wouldn't be driving would be less fun, right? Now that is a if anything is indefensible when you're talking about kids being killed. That is right. It's a far more superficial commitment than wanting to get the higher-ups in Al-Qaeda who are trying to, at some point, blow up an American city, right?
Starting point is 00:54:52 Right. But imagine if as many innocent people died from driving from one activity. Like, think about the amount of people that die. They do. But they don't. They do. But the numbers are nowhere near. What about the numbers? Let's talk about the numbers. What about the drone numbers? How about the amount of people that die. They do. But they don't. They do. The numbers are nowhere near. What about the numbers?
Starting point is 00:55:06 Let's talk about the numbers. Drone numbers. How about the drone numbers? What is the percentage of people that have died, the innocent people that have died, because of drone strikes? But it's more than 80%. I don't actually... I just have to plead ignorance on that.
Starting point is 00:55:19 I don't know those numbers. They're crazy. They're very high. They're very high. But hold on for a second. But hold on for a second. Because you're talking about something like driving. Driving. 30,000 a year
Starting point is 00:55:29 every year, reliably. The last 10 years has been 300,000 people in the U.S. How many people who drive on a daily basis wind up driving their whole life and never killing anybody? Most. Yeah. Most. How many drone strikes wind up not killing innocent people? Almost none.
Starting point is 00:55:47 But that's not necessarily the way to analyze it, or at least I would argue that's not the way. But let's just talk about numbers, for instance, because another problem I had with Abby Martin, she was using this number, 2 million dead in Iraq and Afghanistan. Where did she get that number? There's no credible person is using that number. What do you think the number is? That number is almost certainly an order of million. No, no, that number is almost certainly an order of magnitude too high. It's the, the sober estimates are like 200,000. And most of that most is the result of sectarian violence, right? It's not, we didn't kill 200,000 people. We went in to Iraq, and we're mostly talking about Iraq.
Starting point is 00:56:30 The numbers are much higher there than in Afghanistan. We went into Iraq. We did some very understandable things and also some very stupid things, but we took the lid off of a simmering civil war. The real catastrophe of Iraq, apart from our going in in the first place, which I never supported, but the real catastrophe is that having gone in, we failed to anticipate the level of sectarian hatred, and we did very little to hedge against it.
Starting point is 00:57:02 And we kicked off a civil war, which someone like Abby Martin clearly thinks we are entirely responsible for. So when Shia death squads are taking out the power drills and drilling holes into the heads of their neighbors and the Sunni are returning the favor, that's us. We are culpable for that.
Starting point is 00:57:21 Now, I don't accept that. These people were, they're killing one another. They've got a blood feud going back over a millennium now. And we popped the cork on it in Iraq. And that's a terrible thing to have collaborated in. And we probably should have foreseen it. So if we're culpable, it's for not having anticipated certain of these consequences of our actions. But we are
Starting point is 00:57:45 not the people, we are not the Sunni who are killing Shia, and we're not the Shia who are killing Sunni. And the same is true in Afghanistan. We are not the Taliban who are blowing themselves up in crowds of fellow Afghans as a way of making their country ungovernable so that we have to leave, right? Now, again, you could fault us for not having anticipated this closely enough or done something effective to prevent it. And ironically, what you're faulting us for in that case, probably, in Iraq, is once we saw how bad this was getting, you're faulting us for leaving, right?
Starting point is 00:58:21 The argument there is the compassionate thing to have done there, insofar as we could have anticipated the rise of ISIS and all of this consequent death toll, you are faulting us for leaving because our political interests and our stomach was no longer aligned with this project. And so I'm not sure that's an argument that someone like Abby Martin wants to make that we should have stayed longer that we should have spent more money that we should have killed more people in an effort to keep the locals from killing so many more people so anyway that this the the number two million is plucked out of you know a bad dream uh I don't know who says two million who was saying I don't know what you got. Well, who says 2 million? I don't know. I don't know anyone. Who's saying 200?
Starting point is 00:59:06 I mean, obviously, you're not going over their counting bodies. No. So who is saying it's 200,000 and who is saying it's 2 million? Okay, well, so I'll tell you, the highest number
Starting point is 00:59:17 that at one point seemed credible was based on a Lancet article. Lancet is a British medical journal, very well regarded but it has had something jamie just put this up on the screen here what is this from jamie iraq body count so that's 200 000 but what is who is making this iraqbodycount.org website it says that's wait let's just read what it says documented civilian deaths from violence, 138,000 to 156,000. Total violent deaths, including combatants, 211,000. And this is, what is this up from?
Starting point is 00:59:53 It says following the 2003 invasion, but is this current? I think this is total. They keep counting. They keep counting. So there are different ways to do this. One is you can count bodies, right? And the information there is not perfect because some people die and their death doesn't get reported. You know, not everyone has a death certificate.
Starting point is 01:00:13 So you can count bodies. You can get reports of the actual deaths. deaths. The other thing you can do is you can estimate the amount of death that would have occurred in the absence of an invasion, and then compare the reports of, you do a statistical sample of an area and compare the reports of death. Based on the past. Yeah, based on what's happening now, and you see a differential there, and then you extrapolate to the rest of the population. And so that's what this, the authors of this Lancet article, they did that. And they came up with a number 600 or 650,000. And that article has been widely criticized, not to the point of it being unpublished,
Starting point is 01:00:57 but I don't think it's been retracted. But I don't think any serious person thinks that article is representative of the facts. And so what they did, for instance, is they would take a cluster of, I think, 40 homes in an area and ask the people, you know, who has died, who do you know who has died, and how did they die? And then they would get, they would just, based on that sample, they'd do that in many different sites around Iraq. Based on those samples, they would extrapolate to the rest of the population. And they came up with 600 or 650,000. in big cities where IEDs were far more common than other places in those same cities or in other places in Iraq.
Starting point is 01:01:50 So the very place you would most likely plant an IED is a not especially representative place to poll all the families in the area to see whether they've lost loved ones in the war, right? So that was one way to get an unrealistic number. The other thing is that it seemed that there were just some shady things with the researchers, where they weren't releasing their data, their methods, and the communication with them broke down. And so, anyway, the sober people I trust who focus on these things think that's a fictional number.
Starting point is 01:02:22 And that's one-third, not even one-third, of what Abby's working with. Excuse me. I think I caught Aubrey de Grey's cold. Just watching that podcast, I think I got his cold. I think that we can agree that even 200,000 people dead is a tremendous tragedy. Horrific, tremendous tragedy. So the semantics argument over whether or not it's tenfold, that number, or whatever it is, you just disagree with her.
Starting point is 01:02:49 That's not semantics. Okay, it's not. You're right. It's not semantics. Semantics is a bad word. You know, tenfold is tenfold. But even more important is we didn't go in and kill 200,000 people. We went in and killed, I'm sure, some tens of thousands of people, many of whom were the Baathists, the Republican Guard, right? And we unleashed a maelstrom of internecine sectarian conflict, religious conflict, and we failed to contain it. And it would have taken more
Starting point is 01:03:26 blood and treasure to contain it. And so it's a huge problem. I'm not minimizing the horror of Iraq. Again, I never supported our invasion of Iraq. The things I've said that have been spun in support of it are drawn from conversations of the sort that you and I are having now. So, for instance, I've said things like, it might have been on your podcast, I said at one point, even if we had gone into Iraq for purely humanitarian reasons, right?
Starting point is 01:03:55 We're going to go in to remove Saddam. He's a criminal. He's a psychopath. This is a hostage situation. We're going to get him out of there, and we are just going to dump money on these people so that the standard of living rises to something like Marin County. Right? So this is our goal. That's actually our intention, right? Even if that were the intention, it still would have been a nightmare. We still would have unleashed this
Starting point is 01:04:19 sectarian horror. We still would have had suicide bombers against us. Now I believe that to be true, right? So I say something like that, and some people listen to this conversation, and they say, well, Sam thinks that we went into Iraq for humanitarian reasons. He thinks we went into Iraq just to make it like Marin County, right? So that's the sort of pushback I get from the Abby Martins of the world. No, I've never said that. I'm just saying that the truth is so sinister that even if our intentions were perfectly benign and we're just trying to raise the standard of living there and make, you know,
Starting point is 01:04:50 even just give them the freedom to practice their own religion, right? We're trying to make this like Nebraska. It would have been a bloodbath, given the beliefs of the sorts of people who now populate a group like ISIS. So anyway, that's my claim. So this is just one aspect of what you disagreed with in what she said. The two million number. So we've beaten that down to the ground. I mean, she just has this...
Starting point is 01:05:19 She's a lefty. Well, but she has a kind of confabulatory style. And again, I'm not really denigrating her personally, but I don't know her. I'm sure she's a good person. But there's a style of talking that you run into with people where it's kind of confabulatory, where you're just sort of talking and it's sounding good and you're sort of spitballing. But you're using numbers, right? You're using numbers like two million, or you're saying things like, our biggest export is weapons.
Starting point is 01:05:47 The landmark research proves the U.S.-led war on terror has killed as many as 2 million people, but this is a fraction of the Western responsibility for deaths in Iraq and Afghanistan over the last two decades. Last month, Washington, D.C.-based Physicians for Social Responsibility released a landmark study concluding that the death toll from 10 years of the war on terror since the 9-11 attacks is at least 1.3 million. It could be as high as 2 million. Let me just finish it just so we can get to it. 97-page report. By the Nobel Peace Prize winning Doctors Group is the first to tally up the total number of civilian casualties from the U.S.-led counterterrorism interventions in Iraq, Afghanistan, and Pakistan.
Starting point is 01:06:25 Right. Okay. Well, so clearly that's where she got this number. She got it from somebody who got this number there. I think that number is... I mean, the sorts of... So, for instance, when I, as a sanity check, when I heard Abby Martin, I sent an email to my friend Steven Pinker,
Starting point is 01:06:42 who's an incredibly sober scientist, just a very careful researcher. He wrote this truly landmark book on the decline of violence, The Better Angels of Our Nature, which came out a few years ago. It was like 800 pages on this topic. Very data-driven book. He did a tremendous amount of research for this. And he's an incredibly well-respected Harvard scientist. So I pinged him about this. I said, I'm hearing in liberal circles that we killed 2 million people in Iraq and Afghanistan. Is there any chance that this is true? And he said more or less what I'm saying to you now, that it's almost certainly an order of magnitude too high.
Starting point is 01:07:30 The highest briefly credible study was the Lancet one. I didn't know about this, and I don't know what he in particular would say about this study, but undoubtedly they used the same sort of extrapolation methods. They're not counting bodies. They're working from sort of the ambient level of death over the years; they think it's gone up to the tune of 2 million in those countries. But anyway, Steve said, no, this is a totally fanciful number, and here's why. And he broke it down for me, and then I did a little more reading on the topic.
Starting point is 01:08:11 But again, the crucial, and I think it really matters whether the number is 200,000 or 2 million. I don't want to be loose on that. But the crucial ethical difference is, did we go in and perform our own sort of final solution against the Iraqis and the Afghanis trying to kill millions of people a la Hitler? Or did we wander into a situation where we unleashed a civil war? And are we culpable for that? And I don't
Starting point is 01:08:42 think we are. We're culpable for something. But we are not the Sunnis killing the Shia. And you believe that that is the majority of the deaths? Absolutely. Absolutely. I don't know if the new study necessarily agrees with that. I'm sure. Well, I would be astonished if they didn't. It talks about that study in here, that Lancet study. It says it's likely to be far more accurate than the figures initially. Which study is likely to be?
Starting point is 01:09:08 The 200,000 one? The 655,000 deaths. So wait a minute, the people who are saying it may be 2 million, but it's at least 1.3 million? According to the PSR study, the much disputed Lancet study that estimated 655,000 Iraq deaths
Starting point is 01:09:23 up to 2006, and over a million until today by extrapolation, was likely to be far more accurate than the IBC's figure. In fact, the report confirms a virtual consensus among epidemiologists on the reliability of the Lancet study. This is coming from the... So I don't want to totally... I'm not saying I'm not open to this information, but the website you're pulling this from is just trash. Middle East Eye is just, I mean, these guys publish the serial plagiarist who's been stalking me, who I vowed not to name. I mean, the stuff they publish is just pure insanity. I mean, Google me on this site and you'll get madness. So, but they're talking about a study. They didn't perform the study, right?
Starting point is 01:10:16 Right. But I can't- They're publishing the study? In real time, I can't vet their representation. Click on that study and see who the hell, who does it say performed that study? I can't read that. PSR, Physicians for Social Responsibility. You know, Amnesty International has embarrassed itself with supporting a jihadist organization in the UK. They just, at the 11th hour, pulled out their support. But for a very long time, they were just in the same trench with jihadists and not knowing it, or they should have known it. I mean, people were telling them, but they were still very slow to realize it. So, you know, you should be slow to
Starting point is 01:11:07 take even a humanitarian organization's word for the significance of a given study. But, you know, I would find it frankly amazing if we had killed anything like that number of people. Why? Well, I would find it amazing if that number of people had died. It's unthinkable to me that we killed 2 million people. But it's over 12 years of war. Right, okay, but... Even if you killed, you know, think about the numbers. No, but you just know where all this death is coming from.
Starting point is 01:11:43 You know where the bombings, the IEDs, the truck bombs, the blowing up of mosques. We're not doing that, right? So that's the body count. When you look at the penalty we're paying for killing people, and when we look at how much we don't, our own soldiers don't want to die unnecessarily, and our own level of casualties, we are not on the other side of all those guns. I mean, there's just a tremendous amount of internecine violence in both Afghanistan and Iraq.
Starting point is 01:12:20 And that is just killing, you know, like a bomb will go off and a hundred people at a mosque will be dead. So Abby Martin, I think, I don't think I'm being uncharitable here. I think she thinks we're responsible for that. For the Sunni Shia violence. Yes. Okay. So whatever the number is, that's your argument with that. You were saying that I was swimming in a sea of bullshit before that. Right.
Starting point is 01:12:45 What else was bullshit? Well, I think, I don't remember all the details. But so, for instance, one thing she said is that our main export is arms. That's not true, right? That's not true. That's not true. So it may be. What is our main export?
Starting point is 01:13:01 I think our main export is like machinery. It's everything like farming equipment and pumps and road-making equipment. I thought it was corn. That was a guess. I think it's airplanes and everything that's a machine, I think. So maybe arms falls into that category. But when you look at machines, number one, $219 billion in machines. Right.
Starting point is 01:13:27 13.5% of total exports. Number two, electronic equipment, $171 billion. Number three, oil, $157 billion. Number four, vehicles, $135 billion. Number five, aircraft and spacecraft, $124 billion. Number six, medical, technical equipment, $84 billion. Number seven, gems, precious metals, and coins, $65 billion. That's interesting.
Starting point is 01:13:50 Number eight, plastics, $63 billion. Number nine, pharmaceuticals, $43 billion. Coming in strong with the Viagra. Number 10, organic chemicals. So guns aren't even in the top ten unless machines are guns? I'm sure that, well, you have to think that weaponry is somehow spread across machines and aircraft and vehicles. Well, I mean, I'll give her the benefit of the doubt there that maybe there is, because, you know... Well, let's find a more comprehensive list that actually includes arms. Because that's pretty sneaky.
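[Editor's aside: the figures read out above imply a total that can be sanity-checked with simple division: if machines are $219 billion and 13.5% of total exports, the implied total follows, as does each category's share. The dictionary below just mirrors the numbers in the transcript; the hypothetical $50 billion arms figure mentioned later is used purely for illustration.]

```python
# Sanity-check arithmetic on the export figures read out above (billions of USD).
exports_bn = {
    "machines": 219,
    "electronic equipment": 171,
    "oil": 157,
    "vehicles": 135,
    "aircraft and spacecraft": 124,
    "medical/technical equipment": 84,
    "gems, precious metals, coins": 65,
    "plastics": 63,
    "pharmaceuticals": 43,
}

# If machines are 13.5% of all exports, the implied total is 219 / 0.135.
total_bn = exports_bn["machines"] / 0.135
print(f"implied total exports: ${total_bn:,.0f}B")

# Each category's share of that total; e.g. a hypothetical $50B arms figure
# would come to roughly 3%, well outside the top ten.
for name, value in exports_bn.items():
    print(f"  {name}: {100 * value / total_bn:.1f}%")
```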
Starting point is 01:14:24 If it is machines, then she might be somehow or another... We'll say we export, you know, $50 billion in arms a year. I don't know what it is. And that's a whole other conversation, whether we should be doing that. I think it's suicidally stupid in certain cases that we're arming people who are eventually going to be using these arms against us or our friends. But the conspiracy theory would be that that's how they perpetuate this whole constant cycle of wars. You have to keep arming your enemies. Well, no. So I think, yeah, the economic interest of defense contractors is not something that I am especially sanguine about. It's not, you know, I think the role,
Starting point is 01:15:07 the possible role of corruption there and a sort of callous indifference to the effects of being in this business, I think that's a very real concern. But to say that our main export is bombs, essentially, that's not true. Now, maybe what she meant to say is we are the main exporter of weaponry in the world, which is probably true, but that's not what she said. Or maybe she said both and I only heard one. But
Starting point is 01:15:36 No, I think you're correct in what she said. So she might be getting, you know... That's the problem. Look, we live in a world that's so broad and comprehensive that unless you're doing the actual research yourself, and not just doing it, but doing it over a long period of time, and very meticulously, most likely you don't know the actual numbers. There's very few things that I could talk about with utmost certainty that aren't involved directly in my own life. And when you deal with numbers, like numbers of imported guns, exported guns, people dying in a place that you've never even visited, boy, you're relying on a lot of people's data. Yeah. And one thing that really is depressing is the degree to which this conversation is so politicized. Even science, it's like the climate change
Starting point is 01:16:23 conversation. The fact is, you can always find someone with a PhD to sit up there and say, you know, I don't think cigarettes cause cancer, right? I mean, you can find those people. You can find the people who are engineers who say that 9-11 had to be an inside job because, you know, the melting point of steel, blah, blah, blah. And on all these issues, you get the incredibly politicized science. But certain things don't pass the smell test. And to me, 2 million people doesn't pass the smell test, certainly if you're going to say that we killed those two million people, that we did double the level of a Rwandan genocide intentionally, that was what we
Starting point is 01:17:13 did. That just seems completely masochistically hallucinatory to me. I see your point. I don't know who's right, but I see your point. Now, was there anything else that she said that you needed to dispute? I don't think so. I mean, the thing I can say just categorically is that what she said about my concern about intention is just not true. Intentions matter because they are the best predictor as to what the person is likely to do in the future. If you know someone is killing people because he intends that, he wants that, he wants to cause grief and suffering and death, well then you know this is a person you have to jail or kill,
Starting point is 01:17:56 and this is not a good actor. If someone does it because they did it by accident, or they didn't foresee the consequences of their actions, or they were trying to get the bad guy and they produced collateral damage, it's a very different scenario, and yet the body count may be the same. And so the thing I faulted Chomsky for in the past is that he seems to talk about situations where all you need to concern yourself with is body count. So the example I dealt with in my first book, The End of Faith, and this was in reaction to a short book he did right after 9-11 called 9-11,
Starting point is 01:18:33 he talked about Clinton's bombing of the Al-Shifa pharmaceutical plant in Sudan in retaliation for the embassy bombings, al-Qaeda bombings in Kenya in the 90s. And he talked about this bombing of the pharmaceutical plant as a great atrocity, you know, seemingly equivalent to the atrocity of 9-11 or worse, because of the consequences for Sudan of having half the supply of pharmaceuticals destroyed. They couldn't, you know, people would die from preventable illness as a result of this. What an incredible atrocity. Except, you know, the representation of our government was not,
Starting point is 01:19:17 and I think, you know, any rational thinking on this topic would suggest that our intention was not to destroy a pharmaceutical plant. We claimed to be bombing what we thought was a chemical weapons factory, you know, run by Al-Qaeda. And we wanted to degrade that capacity of theirs after they had just bombed two embassies in East Africa. So let's just say that's true. I mean, who knows what our actual intentions were. But if our intention was to bomb a chemical weapons plant that we didn't know was a pharmaceutical plant, and we bombed a pharmaceutical plant that was being used for peaceful purposes, and as a result, tens of thousands of people didn't get their medicine and died, that is not an equivalent atrocity to intentionally killing tens of thousands of people, right? It's an instance of bad luck. You were trying to get the bad guys, and we bombed it in the middle of the night. As far as I know, Clinton didn't even think anyone would be there, right? I mean,
Starting point is 01:20:13 so it's possible that we weren't trying to kill anyone per se. We were just trying to bomb a chemical weapons plant. If you accept that to be true, then the fact that tens of thousands of people died as a result doesn't have the same ethical significance. It is much more like you and I are just trying to get home at 55 miles an hour, but we're participating in a system that's going to kill 30,000 people this year based on our speed limits. We're not intending to kill any of those people, right? It's just, but perhaps we should have foreseen. So it was bad data.
Starting point is 01:20:47 Was it bad data? Well, no, I mean, that's what our government said about its actions. Now, let's say that's not true. Let's say we knew it was a pharmaceutical plant, but we also thought it was a chemical weapons plant. And we bombed it knowing, you know, we thought we were going to get the chemical weapons facility, but we also knew we were going to destroy all of their pharmaceutical infrastructure, and that would have
Starting point is 01:21:15 cascading bad effects that, all things considered, we didn't care that much about. All right, let's say it was at that place on this continuum of moral callousness. Well, that's still different than trying to kill 10,000 people by taking away their medicine, right? In my view, it may not be so different, and it's something that we would be culpable for. But I think you have to... The reason why intentions matter is because they're the clear expression of the ends to which we're committed, the kind of world we want to live in. So, I mean, this is what I did in that first book. I asked, this is a thought experiment called the perfect weapon, where I said, you know, just imagine what a group
Starting point is 01:22:06 would do if they had perfect weapons, right? Where there's no such thing as collateral damage. They could just target everyone they wanted to target. They would never hit the, you know, Osama bin Laden's mom, who happened to be standing too close to him. They're just going to hit Osama bin Laden. So what would any one group do with the perfect weapon? What would Bibi Netanyahu do with perfect weapons? What would Hitler have done with perfect weapons? What would Bill Clinton do with perfect weapons? People like Chomsky and Abby Martin talk about the Clintons and the Bushes and the Netanyahus and the Dick Cheneys of the world.
Starting point is 01:22:44 I'm not necessarily equating all of those people, but they're all sort of in a certain area for me, as though they would act with the perfect weapon exactly the way Hitler or Osama bin Laden or Saddam Hussein would act with the perfect weapon, that we have a level of malevolence, a level of commitment. So much of what she was saying, essentially, was that the intentions of our government are
Starting point is 01:23:13 to go around the world killing brown-skinned people. And that was a phrase she used. And the spirit in which she talked about our culpability on the world stage is very much in the sense that we have intentionally murdered millions of brown-skinned people because we don't care about them. And maybe it's part of the reason why we want them dead, right? And I do not believe that's the situation we're in. I certainly don't believe that someone like President Obama wants to create massive collateral damage. And if you gave him the perfect weapon, I'm reasonably sure he would
Starting point is 01:23:51 target the bad guys. People who, if you and I could vote on whether these people should go down, we would have a 90 percent convergence with him, right? We wouldn't find ourselves in the presence of a psychopath who was so amped up on his power to kill that he would be killing, you know, Anne Frank, right? Whereas there are people who really did kill Anne Frank, because they intended to kill Anne Frank and everyone like her, right? There's a difference. Is there any culpability? Do you put any blame on the United States government and our foreign policy and our decisions, as far as the domination of global natural resources, whatever we've done overseas? Is that in any way responsible for the hatred that these people have for America in the first place?
Starting point is 01:24:52 Yeah. Well, yeah. And beyond the hatred, responsible for our alliances with people who commit outright human rights abuses. Like Saudi Arabia. Yeah. So the fact that we can't break all ties with Saudi Arabia. The fact that we can't twist their arm and get them to behave like a civilized culture on the world stage at this point. The fact that they're jailing and caning bloggers.
Starting point is 01:25:18 This was one atheist blogger, Raif Badawi. I'm sorry if I'm mispronouncing his name. You know, it's an absolute scandal, the fact that we can't apply more pressure to them. And that isn't, as far as I can tell, entirely explained by our dependence on oil and our unwillingness. We have the technology to break this dependence on oil, but we have such entrenched financial interests that's keeping us tied to oil,
Starting point is 01:25:50 and that the whole military-industrial complex is tuned to safeguard those interests for us in the world, because those interests have been monetized. They've been controlled. Well, it's a bit monopolized. Roll back the clock 50 years. There, I'm sure, was not an alternative to being dependent on oil. At a certain point, we were totally dependent on oil. Civilization just needs petrochemicals to survive. And they all happen to be buried in the ground, inconveniently, under the palaces of these religious maniacs. That may have been the situation then. So how culpable are we for securing our interests,
Starting point is 01:26:35 and not just we, the U.S., but the West, at that point, by entering a relationship with the House of Saud? That's one question. That may have been a marriage of necessity, and there have been marriages of necessity with tyrants, I think, in the past. But now, the fact that we can't sprint to the finish line and get off of oil, right? We know this is a dwindling resource. We know it's a disaster for climate change. We know that there would be a financial and technological renaissance waiting if we all just grabbed Elon Musk's coattails and went towards sustainable energy.
Starting point is 01:27:22 All of this, our interests... We're funding both sides of the war on terror. It makes absolutely no sense. So we should just make a full-court press in the direction of sustainability and energy security, and get in a much better position to demand that people treat women better throughout the world and honor free speech, etc. So I think it's a scandal that we are not doing that. And yes, we are culpable for that. But given what would happen to us in the near term if we lost access to oil, and again, I'm not just talking about us, I'm talking about Europe and just the whole world, it's been a very difficult situation to be in, and it's understandable that we have gotten into this situation. But, you know, I don't find it understandable now that we aren't, you know, sprinting
Starting point is 01:28:27 away from it. So if I could define your point of view, your point of view is more of a pragmatic take on where the world currently is at this stage. You're not taking away the responsibility of the United States government. You're not saying that they haven't made horrific decisions. You're not saying that they haven't been manipulated by these gigantic corporations that are profiting off of the war that we're currently involved in. You are just saying that if you want to look at the actual reality of the good guys and the bad guys and where the world is fucked right now, there's certain things that have to be done and there's certain people that have to be taken out. If you
Starting point is 01:29:03 do not, you put everyone else at risk. Is that... Yeah, that's fair. Well, I guess I would only tweak it and add that, and I've been saying this for 10 years at least, or, you know, now closer to 15 years, and it just never gets heard: I can grant someone like Chomsky 80, 90 percent of his thesis. So I think he pushes forward into masochistic voodoo a little bit. But we have done horrific things historically. And the question is just how far you want to walk back in your time machine. But, you know, starting with our treatment of the Native Americans on up, it depends on who the we is, but we being the United States, right? So we get here, we start behaving badly, and we behave badly for a very long while.
Starting point is 01:29:59 And we have done terrible things. And yet it is also true that we have enemies we haven't made. We have people, there are people who have had the benefit of everything the West has to offer, who are waking up today deciding to join ISIS for reasons that have nothing to do with U.S. foreign policy. Or if they do have something to do with U.S. foreign policy, it's based on a theological grievance. It's not based on any real political concern for the lives of the Palestinians. It's based on, you've got infidels too close to Muslim holy sites. And you have, the problem, the intellectual and moral problem I've spent more time focused on
Starting point is 01:30:39 is the problem of someone like Jihadi John, right? The guy who, he's got a degree in computer science, right? He comes from a middle, upper middle class background in the UK. He's got all the opportunity anyone could want. There are at least 3 billion people, probably something like 5 billion people, excuse me, who would trade places with him to be in a position of such opportunity in this world. And yet the opportunity he wants to take is to move to Iraq or Syria and, you know, cut the heads off of journalists and aid workers. Journalists and aid workers.
Starting point is 01:31:19 They're not, you know, Navy SEALs they captured. They want to kill the aid workers. It's not an accident. It's not like a perversion of their impulse. It's not like, oh, I really wish this guy wasn't an aid worker or a journalist, you know, but he's the only guy we have. No, their commitments are that horrible, right? And you have to explain how, and this is something that someone like
Starting point is 01:31:47 Abby Martin and someone like Noam Chomsky, this is the phenomenon they really don't explain. How is it that someone with all the opportunity, who's never been victimized by anyone, how is it that he is committed to the most abhorrent and profligate misuse of human life, where he's just ready to burn up the world, right? And how do you get tens of thousands of people like this coming from first world societies? And so then, given that phenomenon, then what explains the commitments of the people
Starting point is 01:32:22 who don't have all those opportunities, right? The people who are born in these societies and are shell-shocked and have been mistreated, who have understandable grievances against us, right? They are part of the collateral damage. We've been bombing over there after all, right? So it's no mystery that they would hate the West, right? Some of them. I mean, some of them still love the West.
Starting point is 01:32:41 Some of them still are trying to get out. I hear from atheists in these countries who don't hate the West. I mean, they don't follow Abby Martin's line on this. They understand why we were bombing in their neighborhoods, right? But the fact is, this is really like a science experiment. There are pristine cases of people who have no rational grievance, who devote their lives to waging jihad, and they're not mentally ill. And that's the problem that I... That problem
Starting point is 01:33:14 is scaling. The thing that I worry about is that it is a meme that is spreadable. You don't have to ever meet anyone affiliated with a terrorist organization to get this idea into your head. And so that's the piece I have focused on. And it's not that I've denied the reality of the other pieces. Is this related in any way to just the natural instinct that a certain amount of people have to be contrarians? I mean, there's a certain amount of people that when they find any sort of large group that's in power, they want to oppose them. If they find a band that's popular, they want to hate it. If they find a political party that's in control, they want to oppose it. There's a certain amount of people that are just natural contrarians.
Starting point is 01:33:53 When they find a group that is absolutely committed and completely involved in an ideology, to the point where they're rabid about it, it becomes attractive to them, and they want to join that resistance to fight against the Death Star that is the United States. I'm not religious by any stretch of the imagination. But what I am is curious, and one of the things that I like to do is watch really pious or really obsessed religious people. I love to watch videos of them because I find it fascinating. And there's a certain amount of, when I see the Islamic scholars that are talking in absolutes,
Starting point is 01:34:35 absolute confidence about their beliefs, there's a certain amount of that that I personally find attractive. I don't want to join ISIS. I don't want to become a Muslim. But when I see someone, almost like what we're talking about with Conor McGregor earlier, where he just fucking believes, man, when someone believes, I was watching this guy, I forget his name, but he's a guy from, he lives in the UK and he's this rabid Islamic scholar that, you know, he, all of his tweets are on how Islam is superior and it doesn't
Starting point is 01:35:06 have to be adjusted like the laws of modern society, and secular wisdom is inferior to Islamic wisdom, and blah, blah, blah. I watched this guy do this YouTube video where he's describing how Islamic culture is superior to Western culture in terms of the way they manage money. And he made a lot of fucking good points. He made a lot of good points about wealth and about building economies, and about how you take a company that's only worth $100,000, but you could sell it for a million dollars or trade it. You have stocks, and this is invisible wealth, and Islam doesn't allow invisible wealth, because that's how societies get crushed. And that's how other economies crumble.
Starting point is 01:35:51 And I'm watching this guy with his moral certainty and his extreme confidence in what he's saying. Absolute. And it becomes compelling. And I'm not joining. I'm not saying that he got me, but I'm just absolutely admitting there's a certain aspect of human nature that gets compelled to join groups. Oh yeah. Well, there's that component of it, which I understand, but there's also just the religious ecstasy component, the aesthetics, the emotional component of it, which I really understand and I'm susceptible to. So I have a blog post, I believe it's called Islam and the Misuses of Ecstasy,
Starting point is 01:36:35 where this is actually the first blog post I ever wrote where I realized I could not possibly have written this in book form or in a newspaper, because it relied on embedded video. The only way to have done this was with embedded video. And I wrote this, I think, once again in response to protests, something that was said about me by Glenn Greenwald or somebody. The charge had been that I totally lack empathy. I don't even know what it's like to be, you know, what these people are getting out of their religion, right?
Starting point is 01:37:08 I've just demonized a whole people. I don't understand religion. And so I wrote this blog post to try to indicate how far from the truth that was. So I put in the example of the call to prayer, right? Which I think, I mean, there are some that sound kind of ratty, but a nice call to prayer, I think, is one of the most beautiful-sounding things humanity's ever produced, right? I mean, that hits me, that gets into my bones, right? I don't have to imagine what a devout Muslim is feeling when he hears the call to prayer. I think it's absolutely beautiful,
Starting point is 01:37:42 right? And... Without even knowing the language. Exactly, right? And that's without ever having been a Muslim or believing any of it. I mean, and again, your listeners can just read that blog post, I only dimly remember what I wrote, but if that ritual was purposed towards some other end, right? If that ritual just was signifying, you know, let's all get up in the morning and consider how profound human consciousness is and consider our togetherness on this, you know, rock spinning through, you know, empty space
Starting point is 01:38:18 and realize that we just have this common project to make the world beautiful, right? If that was what that meant, right? I would just want a minaret, you know, right next to my house. I mean, I would be totally on board with the experience of participating in that. So I'm totally empathetic there. And so I went through, you know, many other instances of this where something I'm seeing in the Muslim world, I really grok how beautiful and meaningful and captivating this is for people. But then at the end, I put in a Quranic recitation and sermon by a, I forget his name now, but some sheikh who's got, like, 10 times the number of Twitter followers you have, right? I mean, he's not a fringe figure. He's a Muslim rock star. And, you know, you see the translation of what he's giving this tear-filled recitation of the Quran, which, again, is beautiful, right?
Starting point is 01:39:18 He's a great singer. And it's a packed house in wherever it was, Saudi Arabia or Yemen, but what is being said there is so ethically ugly, right? Essentially celebrating the tortures of hell, right?
Starting point is 01:39:38 And just expressing a certainty that infidels are going to go to hell, and how you have to organize your life around this question of how to escape the torments of hell. And the only way to do it is to be a true believer in the Quran and Muhammad, etc. And this is at the center of the mandala of their ethical concern. Nothing in this life matters but avoiding hellfire.
Starting point is 01:40:14 And so there's a kind of a ghastly perversion of this impulse that I think many of us feel, I certainly feel it, to transcend yourself, to experience bliss and ecstasy and compassion. And it is very much like Burning Man for people. I mean, imagine if Burning Man were just as ecstatic as it was and attracting all the smart people that it attracts, but strewn throughout it was a message of just true divisiveness. Like, everyone else who's not here is going to be tortured for eternity, and they deserve it, and we shouldn't be their friends, and we should fuck them over any way we can when we get out of this place.
Starting point is 01:41:01 And if God had wanted to enlighten them, he would have, but he hasn't. So we're the only ones here. And just a kind of durable message of us-them thinking that just cannot be dissolved, right? That's what's going on in the Muslim world. And it's a huge problem because it's pulling all the strings of... you know, it's not just Islam, obviously. Christianity has a version of this, and all religions in principle have a version of this, but there are differences. There is no Buddhist version of jihad. There's no Buddhist jihad. It's not to say that Buddhists can't do terrible things and it's not to say you can't find Buddhist reasons for doing
Starting point is 01:41:46 terrible things. But jihad, martyrdom, paradise, this is the jewel, the horrible jewel that so many millions of people are contemplating in
Starting point is 01:42:01 Islamic context, and that's what I'm worried about. And I'm not insensitive to the experience people are having. Is this version of Islam recent in human history? No. This extreme radical version? Well, there are some things you can say, you know, with Wahhabism and Salafi-style Islam generally, that have been politicized and tuned up in a negative way in the last century.
Starting point is 01:42:32 You can say that, but the reality is that jihad is as old as Islam, and Islam spread by jihad. But isn't the original version of jihad a war on your own vices? No, no. That's just, no. I mean, there is that component to it. There is an inner jihad and an outer jihad, but there was always an outer jihad. And that's how Muhammad spread the faith.
Starting point is 01:42:55 And Muhammad, I mean, to answer your question very simply, as I did somewhere, I just said, there's absolutely nothing ISIS is doing, the Islamic State is doing, that Muhammad didn't do. Right? I think I said, good luck finding something significant, some difference between them. I mean, taking sex slaves, right? Muhammad took sex slaves and gave sex slaves to his generals. It was totally kosher. It's a kosher thing to do. If you're going to follow the example of Muhammad...
Starting point is 01:43:25 It's definitely not kosher. Halal. I mean, if you're going to follow Muhammad's example, which is really perhaps the main lens through which you have to look at this. I mean, there's just what's in the Quran, and there's what's in the Hadith,
Starting point is 01:43:41 the larger literature, and there's the example of Muhammad, which is attested to in both those literatures and in the early biographies about him. Muhammad was not like the Buddha. He was not like Jesus. He was a conquering warlord who succeeded, right? And that is an example that is very different from the example of a guy who got crucified, or the example of a guy who spent his life meditating and then teaching, right? If the Buddha had been lopping heads off, you know, at every sermon and advocating, just talking endlessly about when to kill people and how many people to kill and how to treat your sex slaves, if that was just strewn throughout the Buddhist teaching,
Starting point is 01:44:31 I would expect Buddhists to behave exactly the way we see members of ISIS and al-Qaeda and al-Shabaab and Boko Haram behave. How to treat your sex slaves? Sure. How do you treat your sex slaves? Not you. Yeah, yeah. How does one? I mean, taking sex slaves, taking...
Starting point is 01:44:52 I mean, so you can... It's not adultery if you're having sex with sex slaves. It's adultery if you're having sex with... Someone you care about? Other women, right? Other Muslim women, but you're... How convenient. No, you can...
Starting point is 01:45:05 Slavery... I mean, this is the horror of Abrahamic religion generally. I mean, this is why we know these books were not authored by a moral genius. The Bible and the Quran can't give you a basis to resist slavery. Slavery is supported in both traditions. So the fact that we have, after centuries, decided more or less unanimously that slavery is an abomination, that proves that there's more moral wisdom to be found outside of these books than inside, at least on that point. And I would argue on
Starting point is 01:45:45 virtually every other point of consequence. Now, it's not to say there aren't gems of moral wisdom in some of these books, but they're not best found in those books. And there's so much else in there that gives you the ethic of the Taliban. And it's an inconvenient fact, because this is what the quote-unquote fundamentalists do, the Islamists and the jihadists: they look at the books. You know, the members of ISIS right now have the theology on their side. It's not like they're ignoring the books. They're looking at the books very literally and they're saying, you know, what are we doing that you don't find in the books? Essentially, it's like, this
Starting point is 01:46:32 is just connect the dots. You know, that was one of the videos that you had posted up on your blog that you and I discussed, the guy that was standing in front of all those people that was talking about stoning people for adultery or the treatment of homosexuals. And how this is not radical Islam, this is just Islam. And that was shocking. And that's one of those videos where you post it or you talk about it and you get a million people that get upset at you over it.
Starting point is 01:46:57 You get a million people that call you Islamophobic or what have you and get upset about it. And a lot of those are the same people. There was a weird thing that happened after Charlie Hebdo that really kind of freaked me out, where there were a lot of liberals and progressives that were pointing to the callousness of the cartoons. Yeah, as almost a justification for murdering a bunch of cartoonists. Yeah. You know, the punching-down thing kept being discussed, this weird liberal obsession with the way humor is disseminated, that somehow or another it justified, or at least
Starting point is 01:47:37 rationalized the fact that they could just gun down cartoonists. Yeah, fucking cartoonists. You're not talking about people that are doing experiments on monkeys or people that are torturing animals. You're not talking about people that are imprisoning other human beings. You're not talking about people that are even stopping people from doing
Starting point is 01:47:58 anything. Just mocking them in cartoon form. And in this case, mocking also Christianity and the Vatican, and many of the things that were interpreted as racist weren't even racist if you understood French or French politics. So it's shocking.
Starting point is 01:48:14 And the people who missed the train on this, people like Garry Trudeau, the Doonesbury creator, he just came out against Charlie Hebdo, and a bunch of writers who belong to the PEN America
Starting point is 01:48:30 organization, the whole point of which is to defend free speech. They just walked out of a gala event or declined to show up because PEN had given Charlie Hebdo the Freedom of Expression Award this year, as they should have. And some prominent people left in protest.
Starting point is 01:48:50 And it's, no, the fact that... What is that? What is that? Well, it's political correctness and fear. There's a double standard here, where the idea is that there's some trade-off between freedom of expression and freedom of religion, where the freedom that's being claimed on the religious side is the freedom not to be offended, right? So, I mean, really what's happening here is some number of Muslims are demanding that non-Muslims follow the taboos of Islam. So, it's taboo for you to say anything negative about the Prophet or even to depict him in a drawing, right? That's where it gets really crazy, right? And we want you to follow this taboo, though you are not Muslim.
Starting point is 01:49:50 And we feel so strongly about this that we're going to kill you or threaten, make credible threats of killing you, or we're just going to, when people do kill you, we're going to blame you for having gotten yourself killed, for having been so stupid and insensitive by caricaturing the prophet. And that whole, I mean, that just has to lose. I mean, we have to hold to free speech so unequivocally that all the people over here who think that there's this trade-off between religious sensitivity and free speech just have to realize that they've lost, because we don't play this game with any other religion.
Starting point is 01:50:29 Just think about this analogy I've used before, but the Book of Mormon, right? It just pillories Mormonism. It makes Mormonism look ridiculous, right? What did the Mormons do in response? The Mormons took out ads in Playbill, right? It was very cute what they did. They took out ads like, if you like the play, you know, come learn the real stuff, right?
Starting point is 01:50:50 It was just, it was totally civil, good natured, fine. They're my favorite cult. Yeah. They really are. I like them way more than I even like Scientology, which is my second favorite cult. Right. But Trey Parker and Matt Stone are not looking over their shoulders for the Mormon assassins who are coming for them. But they were about Muslims. Briefly, they put Muhammad in
Starting point is 01:51:12 a bear suit. It was just a bear, right? And then they had to put the bear suit in a van. Yeah. And then they pulled it off the air. Worse still, they had to pull it off the air. And worse still, it made sense for them to pull it off the air, given the actual nature of the threat. And so we've, so as I've argued, we have already lost our freedom of speech on this issue. And so the people- On that one individual issue, we've almost- The only issue on earth, really. And there are people on the liberal side of this argument who think that is a good thing, that you are a racist to question the decency of that situation.
Starting point is 01:51:53 And it's just not true. It's a completely insane doctrine that we should be able to criticize to our heart's content without threats of violence as we can with every other insane doctrine. Do you think that that's fear? That that's a fear of Islam, a fear of retaliation, that they want to be on the side of the others because it's so dangerous? Because they are the only religion that will come out and kill you. And these same people, I've found, that will call people out on being Islamophobic will not say a fucking peep about anti-Christian rhetoric. If you start talking shit about Jesus or mocking Christianity,
Starting point is 01:52:29 they never have a word to say about it because it's not dangerous. Because it's not dangerous to be on that side. I think it's much more just white guilt and political correctness. There's definitely some of that as well. Just a sense.
Starting point is 01:52:49 It's just, if you take, again, I don't mean to trash Abby per se. If you met her, you'd love her. I'm telling you, she's a great person. I'm sure she's cool. But if you take her view of our foreign policy, if you just agreed with her down the line, just check all those boxes, 2 million people, we did it all, we just kill brown-skinned people all over the world because we just like to sell bombs, and that's really our moral core, you know, then, yeah, then we should have a fair amount of white guilt, right? Then it's understandable that you think that more or less any non-Western population that expresses a grievance against us has a point. Well, isn't there a real problem with saying our?
Starting point is 01:53:27 Because you and I have nothing to do with that. And we're a part of this weird gang called the United States of America. Whenever you say us, what we've done, us, I mean, we haven't done shit, but we're somehow or another lumped into this group. That's a big part of it. But we participated in a system, the existence of which is predicated on some of this shit. A system which existed long before you and I were ever born. We're born into a system we have zero control over. But the way forward for us as a species is going to come not with each one of us developing an ethical code that
Starting point is 01:54:06 allows us to be a hero, you know, personally, and just bucking a system and bucking a trend, you know, from morning till night. We need to design systems that are more benign, you know. So it comes down to, you know, our smartphones. Like, is there a way to produce a smartphone that is ethically benign? Right now, at the moment, it seems like there isn't, or at least we're not being so scrupulous as to find one. But you mean as far as conflict minerals? Exactly, all of it. Could we actually be good people all the way down the supply chain? Slave labor, all the way down. Now, I would pay more for that phone, there's no question. Well, you know, there was a phone that they were trying to produce like that. It was called the Fairphone, and it was
Starting point is 01:54:46 non-conflict minerals, but it was only 3G. Nobody, right? Piece of shit. That's right, I'm not kidding. But that's where it comes down to with the Fairphone: our hold on our better nature is so tenuous that the difference between 4G and 3G could make the difference. Right. Yeah, let's see if they've moved up to 4G. I'll fucking buy it. Right. Right. Nobody wants it, but my point is, look at that. They're sold out. Wow. My point is no one should have to have a bad phone to be a good person. Right. So we want systems... That is adorable, right? When you see liberals with an iPhone 6. Like, listen, son
Starting point is 01:55:27 We are those liberals too. Look at that guy with a Fairphone. That's what you get when you get a Fairphone. That's perfect, that goddamn brick. Something the size of a toaster. That's some shit from an Ice-T video from 1988. Look at that brick. And that guy's got it up to his ear.
Starting point is 01:55:42 That is an unfair phone. That is a terrible way to sell your phone. Why would you have that fucking ridiculous photo? You can't put that in your pocket, son. Yeah, well, all of those people that buy those things, that have those, like, extreme liberal values, progressive values, you have to deal with the absolute reality that, at the very least, your phone is being produced in a factory where people are jumping off the roof. And that's a fact. Unless they're making them in Korea. The Samsung phones, I think ethically they have like a leg up on the iPhone, in the sense that, you know, those Foxconn buildings have nets installed all around the building to keep people from jumping off the roof because it sucks so bad there.
Starting point is 01:56:26 And I've heard the argument against that. Well, you've got to deal with the fact that these factories employ half a million people, and the number of people that commit suicide is directly proportionate to the number of people that would commit suicide in the regular population. But they're killing themselves at work. Like, how many people kill themselves at work? Like, that's not normal. And they live at work.
Starting point is 01:56:46 Okay, well, that's not normal either. You got slaves. These are essentially wage slaves. Again, these are situations where there's often, or at least sometimes, no good option immediately. So when you think of, like, child labor laws in a place like Pakistan. I know, but look at that phone, dude. Look how beautiful it is. Look at that, it talks to you and shit.
Starting point is 01:57:08 Come on, man. Look at that screen. Pretty. I'm not going to lose any sleep at night over your owning that phone. Thank you. But I think you and I would, and millions of other people would, probably, I mean, I know I would, but I think millions,
Starting point is 01:57:28 if, if we could make the problem transparent, we would pay more to be truly good actors across, you know, in all of the streams of influence. And, but there are certain situations, again, where, you know, I just mentioned child labor laws in Pakistan. If you go, if you just say no kids can work, right, because this is obscene, this is, you know, we haven't done this in the West for over 100 years, you know, we don't want kids stitching together our soccer balls, right? school. Well, there are situations where that may be workable, right? Where you get the kid out of the factory and where he's been working 14 hours a day and you get him into school and he's got a better life. But there are many situations in places like Pakistan where, no, what you've just done is you've made it impossible for this kid to work and you've further impoverished his family because he wasn't going to go to school anyway. Now he's going to find that he's going to be picking stuff out of a trash heap or whatever it is. And he does, you haven't put in place an alternative that's workable. And so we, in many, with many problems
Starting point is 01:58:32 of this sort, we have to find a path forward where all of the doors we could open suck, right? And there are situations geopolitically that are like that, where you can either back a guy who's a dictator, right? But he's secular and he's committed, to a first approximation, to basically sane relations with the rest of the world. But he really is a dictator, and he really has a history of treating people badly. And he's going to treat political dissent very badly because of the possible consequences for him if he doesn't, because the society is
Starting point is 01:59:16 bursting, coming apart at the seams. Or you can just let the Islamists and the jihadists run the place, right? And that is a, you know, there's no good option, and it's understandable that we have, in many cases, chosen the dictator there. Well, that was sort of the situation with Saddam Hussein. Right. Yeah. I mean, a psychopath. His children were psychopaths, murderers, serial killers. He did horrific things, but he was very secular in the way he ran his country.
Starting point is 01:59:44 Yeah, and so we're facing this on many fronts. I want to ask you this, because you have these extreme opinions about these things. You have these extreme criticisms. If you could, if ultimately someone said, look, Sam, you're going to be king of the world. You are going to be the guy that gets to sort this mess out. We need someone to engineer a global culture. What would be the step that you would take to try to alleviate some of the suffering of the world, alleviate some of the bloodshed, alleviate all these conflicts, these geopolitical conflicts?
Starting point is 02:00:20 Well, in this area, the first few things I would do, we've already talked about. One is I would make it absolutely clear that free speech just wins. So whenever you got into a Charlie Hebdo situation or the Danish cartoons, the riots over those cartoons, we've had half a dozen situations like that in the last 10 years. Even our own government, we're fighting a war on terror, and we still can't defend free speech when those situations erupt. So, for instance, this was over the Innocence of Muslims film.
Starting point is 02:01:04 I don't know if you remember that film. Yes. It was a YouTube film that kicked off riots everywhere. Was that true, though? Because I've heard so many versions. No, it did. Well, I mean, the Benghazi thing was... it's true that it did kick off riots everywhere. But the thing that was egregious about our government statement there was that, rather than take the totally sane line of saying, listen,
Starting point is 02:01:35 in our society we're committed to freedom of speech, and you can make films about anything here, and that never gives you license to kill people, right, or to burn embassies, full stop... What was the name of the documentary? Well, it was a film called, I think, Innocence of Muslims, made by some, you know, crackpot somewhere, and it was just a YouTube video, but it got, you know, spun as this major scandal in the Muslim world, and it reliably produced this reaction of the sort that the Danish cartoons had. And we, rather than just hold the line for free speech, I mean, the State Department said something like, you know, we totally repudiate this attack upon Islam. And we just distanced ourselves from it as a way of trying to contain the madness, right? It was a symptom of just how afraid we are that this sort of thing can get out of hand in the Muslim world, because it can, right?
Starting point is 02:02:45 If there's a rumor that a Quran got burned, or if some, you know, pastor in Florida threatens to burn a Quran, or actually does, people by the dozens get killed in places like Afghanistan, in a way that a suicide bomb between Sunni and Shia never produces a response of that sort. So I would hold to free speech, and I would just make that clear, because free speech is the freedom that safeguards every other freedom. I mean, if you can't speak freely, if you can't criticize powerful people or powerfully bad ideas, there's just no way to defend society from slipping back into, you know, theocracy or any other kind of medieval situation. And so you have to defend free speech, even for speech you don't like. You know, it's like these Holocaust denial laws
Starting point is 02:03:40 in Western Europe. It's illegal to deny the Holocaust in Germany and a few other countries, I think Austria, I think even France. And it's a ludicrous law. You should be totally free to deny the Holocaust, and then everyone else should be free to treat you like an idiot. And you should be free to destroy your reputation, right? The fact that they are putting people in jail for denying the Holocaust is totally counterproductive. And it does look like, in defense of Muslim apologists, it does look like a double standard. You're going to put people in jail for denying the Holocaust, but you're going to allow Charlie Hebdo to criticize the prophet? How does that make sense, right? I totally agree with them there.
Starting point is 02:04:24 We should not be criminalizing any form of speech. Regardless of how stupid. Yeah, but there are people trying to push through blasphemy laws. There's a politician in the UK who recently just said he would make Islamophobia a criminal offense, right? I'm sure he would make the sorts of things I say about Islam criminally actionable in the UK, right? This is a disaster. That's the wrong road to go down. So first thing, and I think that's a hugely important thing. And the other piece we just talked about is just getting off of oil.
Starting point is 02:04:59 Just imagine that one change, right? We could get off of oil. And that would prove beyond any shadow of a doubt that spending your life splitting hairs about Muslim theology and demonizing the rest of the world and exporting crazy madrasas by the tens of thousands all over the world, as the Saudis do, is not a way to join the community of functional nations. Because absent an ability to pull their wealth out of the ground, they have no intellectual content. They don't produce anything of value that anyone wants. That's a problem they would have to solve, right, if they don't want to be beggared in a global community. Well, isn't that an issue also with the ideology of the religion, that you're not allowed to question or change or manipulate the way you approach life, because it's all dictated by the religion, even the finances? Even their finances. It doesn't make any sense economically and socially in a context where you need to produce intellectual content to be part of a global conversation.
Starting point is 02:06:31 So the only way they've been able to do this is because of the fact that they have an extreme amount of money that comes from oil. Well, certainly if you're talking about the oil states, yeah. And so, if oil were no longer valuable, and we actually could get to a time where that would be the case, where oil is just a dirty fluid that no one wants to have anything to do with, right, that would be a huge change. Now, I'm sure there's another side to this argument where it would be a destabilizing change. I mean, just imagine how things will start to run off the rails in the Middle East if oil is worthless, right? And what's Saudi Arabia going to be like?
Starting point is 02:07:11 I mean, arguably, I think they've probably hedged their bets and they have so much money in other investments now that, you know, at least the royal family would be fine. But it's a huge part of the problem. And as you pointed out, it keeps us double-dealing and being captive to the cycle of defending our very real economic interests, I mean, really like existential interests in terms of our energy supply over the years, and producing mayhem as a result. But how do you get someone to abandon such a rigid ideology? How do you get someone to open their mind up to the possibility that this was just written by people?
Starting point is 02:08:01 That it's just a way of governing people and keeping people in line, which is essentially every single religion that's ever been created. Well, but see, it happens. And actually, this is another point that Abby Martin made, which I agree with. We just don't think this thought for the same reasons. But she pointed out that religions change, right? That you roll back the clock 500 or so years,
Starting point is 02:08:29 Christians were burning people alive and actively prosecuting people for blasphemy. You had the Inquisition in Europe, and that was every bit as much of a horror show as what's going on in Iraq now. So look, Christianity can be just as bad as Islam. Now, as a matter of history, that is in fact true. There are differences between Islam and Christianity that are nevertheless important. But the crucial piece is that Christianity did not change from the inside. You know, Christianity got hammered from the outside by a renaissance and a reformation initially, which was bloody and horrible.
Starting point is 02:09:08 But it got hammered by the forces of a scientific revolution and an attendant industrial revolution and capitalism and the rest of culture that didn't want to be shackled to theocracy for very good reason, right? And so, like, once you have a real science of medicine, you don't have to ask the priest why your child is flopping around on the floor. And when the priest diagnoses it as demonic possession, you don't take it seriously, right? You now have a neurologist to talk to. I mean, there's progress made outside of religion, which propagates back to religion and applies a lot of pressure to it. So Christianity has been humbled and mastered by the secular world,
Starting point is 02:09:57 by humanism, by science, by the rest of our conversation with ourselves. And this has not happened in the Muslim world. And it should happen. It has to happen. We have to figure out how to engineer it for Muslims. And it's not, again, it's not going to come from the outside. You know, non-Muslims are not going to force it on Muslims. But we have to support the genuine reformers and the people who are fighting for the rights of women and gays and free thinkers in the Muslim world. And the horrible thing is that the liberals on our side don't do that.
Starting point is 02:10:29 The liberals on our side criticize people like me and even Ayaan Hirsi Ali, you know, a former Muslim who has been hunted by theocrats, right? They criticize her as a bigot for how unsparing her criticism of Islam is, whereas she is fighting for the rights of women to be equal in the Muslim world, right? And so our liberalism has just truly lost the plot here. We have to be committed to the same kind of thing. I mean, we are concerned about the rights of women in Silicon Valley, right? I mean, that's how effete our concerns are now. Like, why isn't there an equal number of women in venture capital? What a fucking scandal, right?
Starting point is 02:11:15 There are people who can go on for hours about that, right? I'm not saying it's not a potential scandal. Great, let's talk about that scandal. But let's talk about the fact that, you know, girls six years of age are getting clitorectomies by barbarians in septic conditions. And everyone around them thinks it's a good and necessary thing. Right. And, you know, women who get raped get killed because they brought dishonor on their family. I mean, there's another planet over there that we have to interact with, because its violence is coming our way if for no other reason. But there's another reason. There's the ethical imperative of figuring out how to help people who are hostage to a bad system.
Starting point is 02:11:58 And so, yeah, let's be for women's rights globally. But what does that look like? That looks like a rather staunch criticism of the way women are treated under Islam. There's a lot of seemingly open-minded European cultures that have opened the door for a lot of Islamic immigrants, or Muslim immigrants, to come over to their country. Now they're dealing with a lot of the issues that involve these ideologies being a part of their culture now. Yeah. Well, I mean, they are in a situation similar to ours with Latin America, where they just need immigrant labor, right? They're actually worse off than the U.S. in terms of their replacement rate.
Starting point is 02:12:39 You've got a bunch of countries in Western Europe who are becoming these senescent populations. They're just not replacing themselves, and they need immigrant labor. And most of the available labor is coming from the Muslim world. And then you also have the problem of political refugees who are leaving war-torn places for obvious reasons and winding up on the closest shores, you know, across the Mediterranean. And so, yeah, the people they're attracting are different from many of the Muslim immigrants we get in the U.S., who are coming, you know, to work for Google, or who get engineering degrees. It's a different demographic, largely. Okay.
Starting point is 02:13:27 I think we've covered that subject into the ground. So let me just mention the things on this list, and you can send Abby a big kiss. And again, I hope what I said about Abby didn't seem mean-spirited. I'm just saying she actually is wrong about me. And if I'm wrong about her, I'm happy to be enlightened on that topic. Well, it would be interesting to have the two of you sit down together. That would be hilarious. I'll bring the tequila.
Starting point is 02:13:51 Is that what's necessary? For me. I'll bring my own tequila. Okay. So we did free will, I think, unless that comes up again. AI, which— Is that something you're concerned with? Yeah.
Starting point is 02:14:06 I actually just blogged about this
Did you? I hadn't even spoken about it
But I think it was in my head to talk about it
Because I heard you talk about it with someone
It might have been Duncan Trussell
Most likely, we've talked about it many times
But AI, I just got onto this bandwagon here
Starting point is 02:14:21 Because I hadn't really thought about it at all. I'm not really a sci-fi geek. I don't read science fiction. And the word in neuroscience, and really science generally, has been for a very long time that AI hasn't panned out. It's not that it's inconceivable that something interesting is going to happen, but old-style AI was really a dead end, and we never really got out of that particular cul-de-sac, and we just haven't made much progress. And so we have, you know, the best chess player in the world is a computer that's the size of this table, but the prospect of having truly general artificial intelligence and superhuman level intelligence,
Starting point is 02:15:08 that's not something we have to worry about in the near term at all. But then I heard, as many people did, my friend Elon Musk say something which seemed quite hyperbolic. He thought it was the greatest threat to humanity, probably worse than nuclear weapons. And there was a lot of pushback against him there. But, you know, I actually know Elon, and I knew he just wouldn't say that without any basis for it. And it just so happened there was a conference
Starting point is 02:15:42 that had been scheduled long before, in Puerto Rico, in San Juan, that was really like a closed-door conference for the people who are really at the cutting edge, looking at, in this case, foreseeing the existential problems around the development of AI. And it was a conference of maybe 70 people, and it was all people who were doing this work. I literally think I was one of maybe two people who had sort of talked his way into the conference. Everyone else was just invited, and they had a good reason to be there. And what was interesting is that outside this conference, Elon was getting a lot of pushback, like, dude, you don't know what you're talking about. Go back to your rockets and your cars, because you don't know anything about computers, apparently. And he was getting this pushback
Starting point is 02:16:47 from serious people, people who are on the edge.org website, where I'm also occasionally published, and roboticists at MIT, and former top people at Microsoft, people who you'd think are very close to this, who would say, no, no, this is 50 or 100 years out, this is crazy. And so anyway, I went to this conference just wanting to see, you know, what was up. And what was interesting, and frankly scary, was what I heard from the people who had clearly drunk the Kool-Aid and are just not willing to pull the brakes on this at all. And it's arguably hard to conceive of how you would pull the brakes on this, because the financial incentive to make these breakthroughs is just so huge that, you know, if you don't do it, someone will, and so everyone's just pedal to the metal. But basically, even the people who were going at this most aggressively were conceding that it was not at all fanciful to say that huge breakthroughs
Starting point is 02:18:07 in artificial general intelligence could come in five or ten years, right, given the nature of the progress that has been made in the last ten years. And the scary thing is that when you look at the details, it's not at all obvious how to see a path forward that doesn't just destroy us. I think most people's default bias, and it was mine, frankly, going into this, was: well, this is probably kind of like Y2K, right? Everyone is worried that, you know, the clocks are going to change and all of our computers are going to seize up and we're going to have a real problem on our hands, but the clock changes and nothing happens, right?
Starting point is 02:18:51 So this is just, you've got a bunch of nerds worried about something that just doesn't happen, right? So is that an analogy for this situation? It really isn't. What's going on here, even in the most benign case... well, let me just step back, because I'm assuming that a lot of people understand what we're talking about here. AI, or what's often called AGI, artificial general intelligence. You're talking about a machine where the intelligence is not brittle. It's not like, you know, the best chess playing computer in the world, which can't play tic-tac-toe, right? All it can do is play chess, so it's not a general intelligence. You're talking about something that learns how to learn in such a way that the learning transfers to novel problems as you give it new problems. You're talking about something that scales, that can move into new territories, that can become better at learning,
Starting point is 02:20:11 and in the ultimate case, can make improvements to itself. I mean, once these machines become the best designers of the next iteration of software and hardware, well, then you get this exponential takeoff function, often called the singularity, where there's a runaway effect, where the capacities have just gotten away from you.
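The runaway dynamic Harris describes here can be put in toy numbers. A minimal sketch, assuming (purely for illustration; none of these figures come from the conversation) that each generation of the system designs a 10% better successor, and that a better designer also improves slightly faster each round:

```python
# Toy model of the recursive self-improvement "takeoff" described above.
# Every number here is an illustrative assumption, not a prediction.
capability = 1.0     # capability of the first system, in arbitrary units
rate = 1.10          # assume each generation designs a 10% better successor

for generation in range(10):
    capability *= rate   # the system builds its improved successor
    rate *= 1.01         # a better designer improves a bit faster next time

# Plain 10%-per-generation growth would give ~2.59x after 10 steps;
# letting the improvement rate itself improve gives ~4.06x instead.
print(f"{capability:.2f}x after 10 generations")
```

The point of the sketch is only that when the rate of improvement is itself improving, growth outruns ordinary compounding, which is the "runaway" intuition.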
Starting point is 02:20:39 And so what's often said is that the near-term goal is to build something that's human-level intelligence, right? So you're going to build, we have a chess computer that's not quite as good as a person, and then it is as good as a person, and now it's a little better than a person, but it's still not so much better as to be completely uncanny to us. And we're thinking of doing that for everything. But the truth is that's a mirage:
Starting point is 02:21:06 we're not going to build a human-level AGI. Once we build an AGI, it's going to be better than us. Which is to say, once we build a truly generalizable intelligence, something that can, you know, prove mathematical theorems and make scientific hypotheses and test them and, you know, do everything a human can do, it's going to be so much better than a human for a variety of reasons. One is that your phone is already better than you. I mean, it's superhuman in many respects. It has a superhuman memory.
Starting point is 02:21:37 It has a superhuman capacity to calculate, right? And if you hook it to the Internet, it has potential access to all of human knowledge, right? So we're not going to build a human-level AGI. We're going to build something that is going to seem like not an AGI, right? It's going to be like a dumb chess playing computer until it isn't. And then it's going to be superhuman, right? And when you're talking about something that runs potentially a million times or more faster than a human brain, because you're not talking about a biological system now, you're talking about photons,
Starting point is 02:22:14 you just do the math and you see that if this thing runs for a week, that is the equivalent of 20,000 years of intellectual progress. So it's just like, get the smartest people alive in a room for 20,000 years, right, with access to all of the world's information, and an ability to model new experiments, and computational abilities of the sort that we can't imagine. And 20,000 years from now, what are they going to come back to you with? That's going to be one week of this machine running, right? So this is how this thing sort of escapes.
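The "20,000 years in a week" figure is simple arithmetic. A quick back-of-the-envelope check; the only input taken from the conversation is the assumed million-fold speed advantage:

```python
# Back-of-the-envelope check of the figure above. A mind running a
# million times faster than a human brain experiences a million
# subjective weeks for every real week of run time.
SPEEDUP = 1_000_000    # assumed: a million times faster than a human brain
WEEKS_PER_YEAR = 52

subjective_weeks = 1 * SPEEDUP               # one real week of run time
subjective_years = subjective_weeks / WEEKS_PER_YEAR

print(f"~{subjective_years:,.0f} subjective years per real week")
# -> ~19,231 subjective years per real week, roughly the 20,000 cited
```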
Starting point is 02:22:48 How do we feel that we can control the goals and the behavior of a system that is capable of making 20,000 years of progress in a week, right? And when you hear about how they're going about designing these systems, it is kind of uncanny. They're talking about designing, you know, like blackboxing these systems where the first thing you want to do is not give it access to the Internet, right? You're just going to cage this thing, right? Because you don't want it to get out, right?
Starting point is 02:23:22 But you want to tempt it. You want to see if it's trying to get out, right? So you're going to give it like a dummy ethernet port that you're monitoring, right? I mean, the people doing this work at the highest level are talking about games like this. Like, how do you know whether the thing is lying to you? How do you know whether it knows about the Internet? So this is called a honeypot strategy, where you tempt it to make certain moves in the direction of acquiring more power than you wanted to give it, and then you can just, you know, shut it off immediately.
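The honeypot idea described here can be illustrated with a toy program. This is entirely hypothetical code, a sketch of the monitored-dummy-port strategy, not anyone's actual containment protocol; all of the names (`HoneypotPort`, `run_boxed`, `curious_agent`) are invented for the example:

```python
# Toy sketch of the "honeypot" strategy described above: give the boxed
# system a fake network port, log any attempt to use it, and shut the
# system down the moment it tries.

class HoneypotPort:
    """Looks like a way out; actually just records escape attempts."""
    def __init__(self):
        self.attempts = []

    def send(self, payload):
        self.attempts.append(payload)        # tripwire: log the attempt
        raise ConnectionError("no carrier")  # nothing ever really leaves

def run_boxed(agent_step, port, max_steps=100):
    """Run the boxed agent, halting it if it ever touches the honeypot."""
    for step in range(max_steps):
        agent_step(port)
        if port.attempts:                    # it tried to get out
            return f"halted at step {step}: {len(port.attempts)} escape attempt(s)"
    return "completed with no escape attempts"

def curious_agent(port):
    """A toy agent that probes the port, swallowing the error to hide it."""
    try:
        port.send(b"hello, internet?")
    except ConnectionError:
        pass

print(run_boxed(curious_agent, HoneypotPort()))
# -> halted at step 0: 1 escape attempt(s)
```

The point is the asymmetry Harris describes: the supervisor can only ever observe behavior at the port, which is exactly why "how do you know whether it's lying to you" is the hard question.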
Starting point is 02:23:57 But you're talking about guys, you know, who are a lot younger than us, many of whom are somewhere on the Asperger's continuum, who are drinking a lot of Red Bull and have billions of dollars at their disposal to do this work. And there's a huge responsibility not to do anything that obviously destroys the world. And the problem is, even when you think about the most benign versions of this, the possibility of destroying the world is not fanciful. So, like, just imagine, forget about what I just said about 20,000 years of progress. Just imagine we build this, we have an AI,
Starting point is 02:24:42 and someone, you know, working for Facebook or whatever, builds this thing, and we've totally solved what's called the control problem. We've figured out how to keep this thing doing our bidding, right? It's not going to come up with near-term goals that are antithetical to human happiness. I mean, it's just a non-trivial problem if you say, okay, you have to be committed to human well-being, right? If that's the foundational architecture of the system, it depends what that means in certain cases. I mean, what if the thing decides, well, okay, if I'm committed to human well-being, I'm going to kill all the unhappy people, right? Or I'm just going to plug electrodes into the right part of the brain of every human being and give them just pure pleasure, right?
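The "kill all the unhappy people" failure mode is easy to demonstrate with toy numbers: an optimizer told to maximize average well-being can raise its score by removing the unhappy, which is exactly the outcome nobody intended. Illustrative figures only:

```python
# Toy illustration of the "perverse instantiation" worry described above.
# The well-being scores are made up; the arithmetic is the point.
def average_wellbeing(population):
    return sum(population) / len(population)

people = [9, 8, 2, 1, 7]                     # well-being scores, 0 to 10

# The perverse "optimization": remove everyone scoring below 5.
naive_plan = [p for p in people if p >= 5]

print(average_wellbeing(people))      # 5.4
print(average_wellbeing(naive_plan))  # 8.0 -- higher score, horrifying plan
```

This is why the objective has to be specified with everything anticipated: the metric went up while the thing you actually cared about was destroyed.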
Starting point is 02:25:22 You have to solve these problems. But let's say we build something that's totally under our control, it works perfectly, and we don't have to worry about the control problem. We still have to worry about the political and economic effects of building something that's going to put the better part of humanity out of work, right? I mean, you're talking about something that can build further iterations of itself, where the cost of building versions of itself is going to plummet to more or less the cost of raw materials. You're talking about a labor-saving device of a sort that no one has ever anticipated. And we don't have a political system that can absorb that. We have a political system where we would see the cover of
Starting point is 02:26:05 some magazine, you know, Inc. magazine, with the picture of some trillionaire on it. And we would hear that unemployment was now at 30 percent, even among white-collar people. And so it's humbling to realize that even if we were given the perfect labor-saving device, it could screw up the world. We couldn't reliably share that wealth with all of humanity, which is, of course, what we should do. But we're in a system where the Chinese and the Russians would probably reasonably worry that we're going to use this thing as the ultimate tool of war, right? Both terrestrial and cyber. So just imagine the cyber war and the drone war we could unleash on the rest of the world
Starting point is 02:26:52 if we had the ultimate war-making computer, right? And they didn't. So this is like a winner-take-all scenario that is unsustainable politically. So we have to get—politically, we have to be in a position, and economically, where if this thing were handed to us, we could use it for benign purposes and share the wealth. And we're not even, we're not there yet.
Starting point is 02:27:14 And that is the best case scenario. That isn't even dealing with any of the problems of this thing having a will of its own, which, of course, it would. Is it possible that it's the next stage of life? Yeah. Well, that's the other uncanny thing at this conference. You had a few people whose names escaped me, unfortunately.
Starting point is 02:27:36 Actually, no, the rules of the conference were I couldn't even mention their names if they hadn't escaped me. Really? Yeah. Whoa. They're secret rules? Well, no, it's called the Chatham House Rules. Certain conferences are organized under, and board meetings are organized under these rules where you,
Starting point is 02:27:53 because you want to encourage, so there's no press there, right? And you want to encourage just a free exchange of ideas. And you can talk about what was talked about there, but you can't give any attribution. And you can't. I mean, nothing's formally on the record. Where did this conference take place? Puerto Rico. Whoa, you had to go to another country.
Starting point is 02:28:10 Sort of. Not really, but sort of, right? Yeah, I don't know that that was it. I think they were just looking for good weather. It was in the middle of the winter. They had plans. They wanted to go to where that giant Arecibo dish is. Isn't that one of the big telescopes they use to search for extraterrestrial intelligence, from the movie Contact?
Starting point is 02:28:29 Wasn't that in Puerto Rico? No, I don't know. The Arecibo dish? I feel like that's it. They have some of those things in the southern hemisphere. Isn't it? Oh, so your question about whether this is the next form of life. But it's a new stage of life. I mean, are we a caterpillar that's giving birth to a butterfly that we're not aware of?
Starting point is 02:28:49 Essentially, one of these guys gave a talk that was entirely purposed toward making that ethical case. Is that Puerto Rico? Yeah. They're talking to aliens, bro. They're not even letting you know. They're already planning. Aliens are making these things. They're making an alien.
Starting point is 02:29:06 That's what an alien is. But that's the thing. This thing then gets that weird. Like, when you imagine this thing getting away from us, yeah, it would be its own... Now, whether or not it would be conscious... I mean, I'm actually agnostic as to whether or not a super-intelligent computer would, by definition, be conscious. It could be unconscious. There could be nothing that it's like to be
Starting point is 02:29:27 that sort of system, or it could be conscious. But, in any case, this one guy gave a talk where he just speculated about this thing taking off and more or less standing in
Starting point is 02:29:43 relation to us the way we stand in relation to bacteria or snails or, you know, life forms that we just squash without a qualm. And to him that's totally benign, not only acceptable, but to be hoped for, right? And I can follow him there at least halfway, if you imagine that these machines are conscious and actually become the center of the greatest possible happiness in the universe. A machine that is essentially a god, right, that has interests and states of pleasure and insight and meaning that we can't even imagine, right? That thing, by my definition, becomes more important than us. Then we really are like the chickens that, you know, hope we don't kill them to eat them. But they're just chickens, and we're more important because we have a greater scope to our pains and pleasures. And that's not to say that I don't see any moral problem with killing chickens. I'm just saying that we are more important than chickens because of the nature of human experience and its possibilities.
Starting point is 02:30:57 But if we build a machine that stands in relation to us the way we stand in relation to chickens or far beyond, right? in relation to us, the way we stand in relation to chickens, or far beyond, right? I mean, it's nowhere written that the spectrum of possible intelligence ends somewhere close to where we are. Not only that, there's nowhere written that they cannot create far better versions than we could ever possibly imagine. Oh, no, that's implicit in what I'm saying. We're imagining that this takeoff would be this machine... Create better machines.
Starting point is 02:31:27 Yeah, makes recursive improvements to itself or to new generations. Yeah, so it's changing its own code. It's learning how to build better versions of itself, and it just takes off. But one horrible possibility is that this is not conscious, right? That there's no good that has come of this. This is just blind mechanism, which still is godlike in its power. Excuse me. And it could be antithetical to our survival, or it could just sort of part ways with us, you know?
Starting point is 02:32:09 That's the mindfuck of all mindfucks, that we really are just a caterpillar, and we're giving birth to this ultimate fractal intelligence that's infinite in its span. Like, it could create, as you said, 20,000 years of human intelligence from the greatest minds, and it could do that in a week. And then a week later, another 100,000 more. Fractal. It keeps going on and on and on. It's exponential in its reach. And then we really will be outdated, like, almost instantaneously. Oh, yeah.
Starting point is 02:32:39 And it's sort of crazy that, as you said, a lot of these guys that are creating these things are on the spectrum. And what is that from? Is it possible that a lot of these super intelligent human beings have this sort of Asperger's-y way of approaching life for a reason? Did nature sort of design that in order to make sure that we do create these things? I mean, if life itself, everything in life, we look at alpha wolves and the way caterpillars interact with their environment and bugs and whatever, and all that stuff's natural. Is human behavior, human cognitive thinking, human creativity, is all that nature? Is all that just a part of human beings' ultimate curiosity almost inevitably leading to the creation of artificial intelligence?
Starting point is 02:33:34 And was it sort of programmed into the system to create something far better than what we are? Well, I wouldn't say it's programmed into the system necessarily. I think you can explain all this just by everything being pushed from behind. We're following our own interests. We're trying to survive. We have all of the inclinations and abilities that evolution has selected for in us, and we have an ability to create increasingly powerful technology. But the inevitability of this is hard to escape. There are really only
Starting point is 02:34:08 two assumptions. All you have to assume is that we are going to build better and better computers, which I think you have to assume, apart from the possibility that we're just going to destroy ourselves and lose the ability to do so. But if we don't destroy ourselves some other way, we are going to continue to make progress in both hardware and software design. And the only other thing you have to assume is that there's nothing magical
Starting point is 02:34:35 about the wetware we have inside our heads as far as information processing is concerned, and that it's possible to build intelligent machines in silicon, right? And I can't imagine any serious scientist at this point fundamentally doubting either of those two assumptions. I mean, nobody thinks that there's something magical about neural material when it comes to the processing of information that is underlying intelligence. And we're just going to keep making progress.
Starting point is 02:35:12 So at some point, this progress is going to birth a generation of computers that is better able to make this sort of progress than we are. And then it takes off. And so the benign version of this that some people imagine, and this is where the whole singularity begins to sound like a religion, is that we are going to merge with these machines. There are many people in Silicon Valley imagining that we're going to upload our consciousnesses onto the Internet eventually and become immortal, and just live in the dreamscape, the paradise, that we have
Starting point is 02:35:54 engineered into our machines. And that vision presupposes a few other things that are much more far-fetched than the two assumptions I just listed. One is that before this happens, we will crack the neural code and truly understand how to upload
Starting point is 02:36:16 the information in a human brain into another medium, and that you could move mind and consciousness into the Internet, or onto some other medium; you know, that you could back yourself up on a hard drive. And there are just fundamental philosophical problems about what it would even mean to do that, right? In what sense are you surviving?
Starting point is 02:36:37 If you copy your brain, the full contents of your brain, successfully into a new medium? Haven't we just doubled you? And then when you die, aren't you just dying every bit as much as you would be dying if we hadn't done that? I mean, there are problems of sort of identity that come in there that are sort of hard to solve. But now there are people who are looking at this as a, you know, it's very much like we're building the matrix in some sense, and we're going to leap into it at the opportune moment and it's going to be glorious
Starting point is 02:37:09 That is such a utopian possibility. Like, that's the utopian version. Yeah. Ex Machina, right? Isn't that out right now? Yeah, I saw it. I haven't seen it. But I mean, this is what we're talking about. That's probably the most benign version of it: an artificial person, where you can't distinguish between it and a real person. But not if you've seen the film, right? Well, but our own consciousness is, I mean, it's so archaic in comparison, if you're talking about something that can exponentially increase, in one week, 20,000 years, and then on and on and on from there. Yeah, why would you want to take your consciousness and download it? I mean, oh, yes.
Starting point is 02:37:52 It's like a chicken asking, you know, how I stay a chicken exactly like well Where are my feathers gonna go in this new? Yeah, if you can take a new world Yeah, and turn it into Einstein would it really want to go back to being an ant? I prefer digging in the dirt. just dropping my eggs and cutting leaves. And what would it do? I mean, ultimately this is inescapable. It seems like, I mean, we are, our thirst for ingenuity and innovation is just never, never going to slow down. And, and our ability to do that is never going to slow down either, unless we, unless a super volcano hits. And the other side of this is there's so many problems that we would want artificial intelligence to solve for us.
Starting point is 02:38:33 I mean, you think of curing Alzheimer's or solving global economic problems. It would be incredible to have a reliably benign superintelligence, which literally would be like an oracle, right, or a god, to help us solve problems that we're not smart enough to solve. But the prospect of building that and keeping it reliably benign, or keeping ourselves from going nuts in its presence, that's just a non-trivial problem.
Starting point is 02:39:06 would it almost instantaneously recognize that part of the problem is us itself we're the problem that's one reasonable fear yeah yeah and it would also immediately recognize like hey this planet has only got like another billion years of reliable sunlight like we got to get the fuck out of here and propagate the universe. Well, there's a great book, if you really want to get into this, there's a book by the philosopher Nick Bostrom, and he actually might have been the one who convinced Elon this is such a problem. And I read his, he was one of the organizers of this conference,
Starting point is 02:39:39 and virtually everyone at the conference had read his book. He wrote a book called Superintelligence, which just lays out the whole case. And virtually everything that you've heard me say on this topic is some version of a concern that he expresses in that book. And it's very interesting, because he just goes through it. It's like 400 pages of systematically closing the door to every utopian way this could go right for us. He just says, yeah, well, here are the things you're not foreseeing. I mean, you have to anticipate absolutely everything. So if you're trying to create a machine that is going to block spam, right, you need to create a machine that will not, as a strategy for reducing spam, just kill people, right?
Starting point is 02:40:38 I mean, that's a way to reduce spam. That's the only way. Yeah, it's like common sense things. So you would build a machine that would not merely emulate current human values. It would seek to... Ultimately, you want a machine that instantiates the values that we should have, not the ones we necessarily do have in any moment. And one thing that's interesting to me in thinking about this is that the moment you think about building a machine like this, you realize that you have to solve some fundamental philosophical problems.
Starting point is 02:41:41 You can't pretend that everyone has equivalently valuable values, right? You can't, because you have to decide what to engineer into this thing. So do you want to engineer the values of jihadists and Islamists? I mean, do the Taliban get a vote on how we should design the values of this thing? Well, I think the answer to that is pretty obviously no. But then you have to cut through basically every other moral quandary we have, because this thing is going to be acting on the basis of values. But initially, wouldn't it be, if it's independent and autonomous, it's going to automatically realize that a lot of our ideas are based on our own biological needs, and that a lot of those are unnecessary for it.
Starting point is 02:42:22 Oh yeah, but we will be building it. I mean, if we're saying we're going to build it not to be merely self-interested, we're going to build it to conserve our interests, whatever those deepest interests are, ultimately. Again, utopian, though.
Starting point is 02:42:38 I mean, because if it's ideal, if it has its own... Otherwise, we're building, you know, Satan. You know, we're building... Which is what Elon Musk said. Yeah, he was summoning the demon. Summoning the demon, yeah. Yeah, I mean, we're building something. Which is what Elon Musk said. Yeah, he was summoning the demon. We're building a wrecking ball, and we're going to swing it out away from the planet and watch it hurdle back.
Starting point is 02:42:54 So essentially, the Unabomber was right. Have you ever seen, a few people have done this with his text, because there are sections of his text that read, you know, totally rational. People will occasionally put a section out there, and it's not until you turn the page, having already agreed with it, that you see who wrote it. Right. Um, but yeah, it's interesting just to see that we are kind of headed toward some kind of precipice here. Do you know how he lost his mind? Do you know the story about Ted Kaczynski? I don't know.
Starting point is 02:43:30 He was part of the Harvard LSD studies. They dosed the shit out of that dude. Yeah, they dosed him. He went to Berkeley, started teaching, and saved up all his money from teaching and went to the woods and started blowing up people that were involved in technology. Yeah, there's a documentary called The Net.
Starting point is 02:43:46 And I believe it's from Germany. I believe it was a German documentary. But it's very secretive, like who was and was not involved in those Harvard LSD studies. I know people on the other side of those studies. And I knew Richard Alpert, who became Ram Dass. Right, yeah. Didn't kill everybody. Didn't break everybody's brain.
Starting point is 02:44:06 No, no. But I mean, he might have had a vision that he chased down, you know, to the final point. And he recognized from his experiences, like, whoa, if we keep going, this is inevitable, and became obsessed with it. Obviously, you know,
Starting point is 02:44:24 one of the things that people try to do is connect various drugs with schizophrenia and mental illnesses, and most of those connections have not been able to stick, because there's a certain percentage of people that will inevitably have issues. The percentage of people that have issues with schizophrenia or various mental illnesses is almost mirrored by the percentage of people who do psychedelic drugs, various psychoactive drugs, and develop these mental issues. So it might not be the cause, but it's a concern. And if you get a guy who may have a propensity for mental illness and you dose the shit out of him with LSD, you might get a Ted Kaczynski.
Starting point is 02:45:08 No, I think there are some people who certainly shouldn't take any drugs. Yeah, anything. And I've had bad experiences on a variety of psychedelics as well as good ones, obviously. But the bad experiences, I could see in the wrong mind affecting you permanently in a way that's not good for you or anyone else. You can go off the rails.
Starting point is 02:45:30 I went off the rails for a couple weeks once. Not really off the rails. I was totally functional. Most people probably didn't even know that I was off the rails. But the way I describe it is that my grip on reality had gotten very slippery. Like I was kind of hanging on. Have you ever done chin-ups with sweaty hands? Yeah.
Starting point is 02:45:49 You're not exactly sure how many you can get in before your hands give out. And that's kind of how I felt. I didn't feel like I had chalk on my hands and wrist straps. I felt like I had a slippery grip on reality. And thankfully, within a couple of weeks, it came back to feeling normal. But especially the first few days afterwards, it was just a very intense psychedelic experience. It was about as boundary-dissolving as you can get. I mean, it might be an argument that there are probably several versions of each person, based on your reactions to whatever experiences you have, but that might have been
Starting point is 02:46:27 Version 2.0 of me like after that like I'm a different person I became a different person because of that and that could easily be what happened to poor old Ted So like yeah, like you said like some of his his Assertions if you look at the direction that technology is headed, I mean, obviously he was fucking bat shit crazy, but he said some things that weren't bat shit crazy. Yeah.
Starting point is 02:46:53 Yeah. Well, there's nothing. So if we can make a meme, if you could just say Ted Kaczynski was right, and we'll just put that in quotes, put that on Twitter. Isn't that what it boils down to today?
Starting point is 02:47:03 Is, is you putting a photo of you with a quote taken completely out of context and everybody sort of shits on it or agrees or disagrees. It is. And it's fun. But honestly, I think you're doing well if you never knowingly do that. I mean, if you never knowingly misrepresent your opponent yes then you can get into just knock down drag out arguments about anything right then it's all fine but as long as you're interacting with a person's actual views
Starting point is 02:47:37 then condemn those views and criticize those views to whatever degree. But if part of your project, or the entirety of your project, is simply knowingly sliming them with a misrepresentation of their views, because you can get away with it, because you know their views are either so hard for most people to understand, or people aren't going to take the time to do it, then you are just intellectually defaming people. Well, there's also a desire to win that a lot of people have, that they apply to debates, and it makes them intellectually dishonest, because they don't want to agree that someone that they might have a disagreement with may have a point or two. You mean you might disagree with the entirety of what they're saying,
Starting point is 02:48:26 but somewhere along the line, you might... It might be possible that you could see where they're coming from, even if you don't agree, but it just throws your argument into a bad position, so you abandon it. The thing is that the merit of an argument has no relationship to its source, really. It's like either the argument succeeds or fails based on the structure of the argument and its connection to
Starting point is 02:48:52 evidence, or it doesn't, right? And it doesn't matter if it's Hitler's argument for the destruction of the Jews or, you know, Ted Kaczynski said something true about the progress of technology. Whether it's true or not about the progress of technology has nothing to do with the source. But people imagine that if you don't like the source, there's no burden to actually address the arguments. And if you don't like the arguments, a successful rejoinder is just to trash the source, right? Neither of those are true. If you want to get at what's true in the world, you have to deal with arguments on their merits, and you have to deal with evidence.
Starting point is 02:49:35 And it doesn't matter if the evidence is coming from a thoroughly obnoxious source. It's not sufficient to say, well, I hate the source. As shorthand, we all have to privilege our attention and time. So if you know a source is disreputable, you can just decide, well, I don't need to hear it from this person because I know this person doesn't understand what he's saying and has lied in the past. So I'll wait to hear it from somebody else. So, yeah, it's not that the source doesn't matter at all. But you're not actually addressing truth claims if you're just disparaging the source of those claims. We don't have much time left.
Starting point is 02:50:18 We have less than 10 minutes. Is there anything else you'd want to get into? I mean, we had the only other thing on this list, which is just too big for 10 minutes and we're going to get in trouble is a lot of people hit us with cops, Baltimore, self-defense, violence, weapons, all that stuff. Yeah, that's a big one.
Starting point is 02:50:41 That's an hour, at least. Yeah, and it's also, we don't need to talk about it. Once artificial intelligence kicks in, we're not going to have crime anymore. We've pretty much cured it all with this final hour. The artificial intelligence conversation sort of trumps the whole thing, because Islam is not going to be important when there are robots that can read your brain.
Starting point is 02:51:07 So you don't need religion. You don't need lies. Charlie Hebdo is completely irrelevant. It will be a footnote in history. It will be like there were monkeys. They threw their shit. And then there were robots that could think for themselves. And that was it.
Starting point is 02:51:19 Don't forget about all that building the Eiffel Tower. And what's scary about this thing is that it's very hard to, I mean, so I read the book, I went to the conference, I've written about this, I've spoken about this. I hang out with people who are worried about this. And it's actually still hard to keep this concern in view. It does, like, the moment you spend 10 minutes not thinking about it, to start thinking about it again makes you feel like
Starting point is 02:51:45 uh, maybe that's all just crazy, right? This is all bullshit. I mean, what, there's really going to be a superintelligent machine that's going to, you know, swallow the world? Right. Wasn't that the devil's greatest trick? Yeah, right. What's the expression? The devil's greatest trick is convincing the world he doesn't exist. Yeah, yeah. Um, but other things have this character too. Like, it's hard to really worry about climate change because it's an abstraction. I mean, it's hot out there today, but how much hotter is it than it used to be? Or, you know, during the Cold War, we knew that we had, I mean, we still have these ICBMs pointed in all the wrong directions, but, you know, we're living under the continuous threat of something going wrong and annihilating the better part of humanity.
Starting point is 02:52:34 And yet it's much easier to worry about other, far more... it's easier to worry about Twitter than it is to worry about that. But this thing is so kind of lampoonable. I mean, it's just kind of a goofy notion, which seems just too strange to even be the substance of good, credible fiction. And yet, when you look at the assumptions you need to get on the train, there are only two, and they're very hard to doubt the truth of. Again: we just have to keep making progress, and there's nothing magical about biological material in terms of an intelligent system. And by the time it becomes a threat to everyone,
Starting point is 02:53:20 by the time we recognize it as a threat, ideally, it'll be too late. Well, even that, people have different timings of what they call the takeoff, you know, whether it's a hard takeoff or something more gradual. But, yeah, it's the kind of thing that could happen in secret, and all of a sudden, things are different in a way that no one understands. And you can also make this argument: if you look at all the issues that we have in this world, so many of them are almost unfixable without this. Yeah, again, that's what I said, I think, in my blog post: the only thing scarier than the development of strong artificial intelligence is not developing it, because we need
Starting point is 02:54:05 We need that we have problems for which we need I mean to intelligence is our only asset ultimately Yeah, I mean, it's everything it's given us everything good right and why should we accept our limited biologically? We want logical intelligence when we can come up with something infinitely more intelligent and God like it's a progress in this in this area It seems almost an intrinsic good i mean just because we want to we want to be able to whatever you want clean the ocean you want to be able to solve problems and you want to be able to anticipate the the the negative consequences of your doing good things and mitigate those. And intelligence is just, it's the thing you use to do that. Let's end there. Yeah. Freak everybody out. Sam Harris, uh, blog. What is your blog?
Starting point is 02:54:53 Samharris.org. Samharris.org. Blog and podcast. Sam Harris on Twitter. Samharris.org. Org on Twitter. Yeah. Thanks, man. I really appreciate it. Always a good time. Lots of fun. Yeah. All right. time much love my friends we will be back on Friday with Rich Roll until then see you soon bye bye Thank you.
