The Joe Rogan Experience - #804 - Sam Harris

Episode Date: June 1, 2016

Sam Harris is a neuroscientist and author of the New York Times bestsellers, The End of Faith, Letter to a Christian Nation, and The Moral Landscape. His podcast is called "Waking Up." ...

Transcript
Discussion (0)
Starting point is 00:00:00 The infamous Jamie double finger gun. And we're live. How are you, sir? Good to see you, man. I'm good. Good to be back. You're looking healthy. Looking fresh.
Starting point is 00:00:10 I'm glad to be here. Well, I stopped eating meat since I last saw you. I heard about that. I want to talk to you about that. Yeah. And I don't know that it's correlating with health. Every time I worry about this out loud, I get hate mail from vegans and vegetarians who say, stop.
Starting point is 00:00:25 Stop putting your bullshit on us. Well, they don't like when you associate any negative consequences whatsoever with only eating vegetables. Because it's essentially a cult. It's a wonderful cult of people that want to take care of animals and be nice to animals. But they're very tribal. Very tribal, very cult-like. And if you say anything that's negative against vegans, they gang up.
Starting point is 00:00:48 I go to forums and I read the things they say. They organize little troll attacks and they make YouTube videos. It's kind of hilarious from a psychological standpoint. Yeah, I mean, obviously I don't want it to become a new religion or my one religion, but there is a moral high ground to the position that I find very attractive because I felt like a hypocrite as a meat eater. Now, and I don't think this necessarily extends to someone like you who hunts and feels okay about hunting. And I don't have an argument against hunting the way I do against factory farming or more or less any of the way we get meat,
Starting point is 00:01:28 you know, the environmental implications of it. But it's very captivating as a position, and you feel like an asshole. Once you go far enough into the inquiry, you feel like an asshole not being sensitive to these concerns and just ignoring how you're getting your food three times a day. But I'm not, you know, for me, I mean, I'm sure there's individual variation, and I'm not the smartest vegetarian in the world in terms of how I prepare my food and how attentive I am to it. So the onus is somewhat on me, but I'm not totally sure it's the healthiest thing for me yet.
Starting point is 00:02:11 So you said you look healthy. I feel like my health is somewhat withering under this. It's been about nine months. Withering. Are you getting your blood checked? Are you doing B12 supplementation? Yeah. That's essential.
Starting point is 00:02:25 B12 is essential. D3 is essential as well. Most people are just not going to get enough from the sun. Right. And are you monitoring your intake of fatty acids and things along those lines? Not really. Beyond the supplementation, I'm just trying to get the food in. Yeah.
Starting point is 00:02:42 Well, it's all good stuff, but it's really important. There's one of the things that people rage against, unfortunately. It's the stigma and it's fats. It's dietary cholesterol. Yeah. Dietary cholesterol and saturated fats, which are critical for hormone production. And it's one of the reasons why people, when they get on an all-vegetable diet, if they're not really careful with coconut oil, you got to eat a lot of coconut oil.
Starting point is 00:03:05 I'm a big fan of avocados. I eat a lot of avocados, a lot of avocado oil. A lot of coconut oil as well, though. Almond butter, nuts, things along those lines. You really need to get those essential fats. Well, actually, I'm not vegan. I would like to be vegan, but I'm still eating dairy. Well, how could you say you would like to be?
Starting point is 00:03:22 No one's holding you down. It's like, God damn it, Sam, you need to have some milk. No, I think I would screw it up. I mean, again, I'm going to reap the whirlwind from the vegans. I should have brought you some eggs. I eat eggs. I eat eggs and I eat dairy. But you can only eat so many eggs and so much dairy.
Starting point is 00:03:42 I eat a lot of eggs. Eggs are very good for you too. There's another thing, dietary cholesterol. People are always concerned with dietary cholesterol. Well, that was a big myth for the longest time. As a matter of fact, dietary cholesterol barely moves the needle on blood lipids. A lot of when people have cholesterol issues, it's sedentary lifestyle, there's genetics. There's all sorts of other variables. But people with healthy lifestyles, it doesn't seem to be that dietary cholesterol is bad for you. And also, it's essential for testosterone production.
Starting point is 00:04:15 Yeah, yeah. So, no, I get a ton of, or not a ton, but I'm not trying to avoid saturated fat. So, I get a fair amount of cholesterol. Do you have a yard? Yeah. Could you raise some chickens? No. No?
Starting point is 00:04:31 Not enough of a yard. Come over to my house after this. I'll take you to my house. I'll show you the chicken setup. We just got five new ones, and we have 18 now. One of them died. They just fucking die sometimes. Right, and you don't know why.
Starting point is 00:04:45 No, they just go in the chicken coop. One of them's. They just fucking die sometimes. Right, and you don't know why. No, they just go in the chicken coop. One of them's dead. You're like, all right. Father Time got in there with his scythe. They're getting kind of old now, some of them. I don't know how old a chicken lasts, honestly. But they're like pets that give you food. Like, there's no negativity.
Starting point is 00:04:58 I open the door. They come up to me. They run around. I feed them. Like, there's no, they're not trapped. As a matter of fact fact they go into their their pen at night like you leave it open and they wander around my i have a big yard they wander around my yard they eat a bunch of bugs and stuff and then they go back inside when they
Starting point is 00:05:14 want to right so there's no like animal captivity there's no cruelty there's nothing weird going on and so those eggs they're pretty much karma free oh yeah yeah no i don't doubt that it's just when you read the details of how our dairy and and eggs are gotten i mean it's it's near arguably as bad if not worse than much of the meat production so you know moving from eating eating meat to being a vegetarian in some ways is a symbolic move ethically if you if you really wanted to to not participate in the machinery but there's an issue with vegetarianism there is an issue with how they gather food i mean there's there's a giant issue that people don't want to take into consideration is like how are they growing all this food how are they growing all these
Starting point is 00:06:02 plants well one of the things they're doing is they're displacing wildlife. They're chewing up this ground and these combines, if you eat grain in particular, combines indiscriminately just chew up all that stuff and they get deer fawns, mice, rabbits, rats, rodents, untold amount of bugs if you want to get really deep. I mean, there's no, like being a vegan and being a vegetarian is most certainly less cruel and less harmful overall, but it's not karma free. It can't be unless you're growing your own stuff. If you can grow all your own vegetables and you essentially live on a small farm, yeah, you could, you could do it and really feel good. But it's, if you're buying it in a store, you're participating in factory farming whether you like it or not. You're just participating in vegetable farming. But there's still issues. Oh, yeah, yeah, yeah. Just fewer.
Starting point is 00:06:57 I had this guy on my podcast, Uma Valetti, who's running this company called Memphis Meats, which is cultured meat. It's a startup in Silicon Valley. Oh, okay. Did you try that? I haven't tried it. No, I want to try it. But it was a fascinating conversation because he basically – what's fascinating to me on two levels is, one, it's fascinating that we're on the cusp of being able to produce actual, you know, biologically identical meat that is totally cruelty. I mean, there's no implication of cruelty at all in it, right?
Starting point is 00:07:32 And you would just grow this in a vat the way you, you know, brew beer, essentially. And so that's, you know, that seems like the future. But what's interesting psychologically is that people have this creepy feeling around it, which is very strange. Because so I'm telling you, I can take the misery and death out of the process, right? I can take the suffering animal out of it. I can take the chaos of the slaughterhouse out of it. There's no, you know, cow that has been mistreated for its whole life, stumbling in blood and feces on the way to the, you know, the killing floor. And somehow removing all of that makes it
Starting point is 00:08:13 creepy for people, right? They want that. I mean, that's the natural way to get meat. And if I told you this is grown in a vat by a guy in a white lab coat and has no xenoviruses and no bacteria, nothing, no antibiotics were used to plump this thing up. And it's just the cells you want. People start to, there's kind of an ick feeling that I think we're going to get over, but it's interesting psychologically that it's there in the first place. Did you get that ick feeling or are you just talking to the people that cultivate it? No, I understand it. I mean, I'm past it, but I understand it. Who do you know that got it, that feeling? I just see the reaction. I actually polled this on Twitter, and 25% of people, like 15,000 people
Starting point is 00:08:56 answered the poll. So it's not a scientifically valid poll of the general population. It's just whoever got it on my feed. But it was interesting to see. I asked, you know, of the people who wouldn't switch, as I asked, would you switch? And something like 80% said they would switch if this was affordable and available and safe. But of the people who wouldn't switch, it was 25% wouldn't switch because it's just creepy. And 25% assumed that it was not healthy, I think. I forgot the breakdown. Another 25% were already vegan or vegetarian and didn't want to eat meat. But there was an ick factor for at least 25% of the people who wouldn't do it. I understand that. I wonder how many people would go back to eating meat if they could raise it this way like how many people who had gone vegan would go back to eating this scientifically created lab created beef i think enough for a
Starting point is 00:09:58 serious market yeah oh for sure well there'd be a serious market for it for sure if they could get the it cost effective because last time i saw it was like a quarter million bucks for a cheeseburger. I think it's down to $18,000 for a meatball. I mean, it'll eventually be like the Apollo computers now fit in your pocket. Much stronger, in fact, than the Apollo computers. Well, sequencing the genome, this I just noticed, which is fascinating. Sequencing the genome 15 years ago cost three billion dollars it's now three thousand so it's a million fold reduction in cost in 15 years that's insane so something like things tend to scale that way so yeah that's's a giant scale, though. When you talk about human history,
Starting point is 00:10:46 I'm like, good Lord, imagine if you're a guy who spent $3 billion of it 15 years ago. You're like, God, if I just fucking waited. Yeah, yeah. That would have saved so much money. And we don't anticipate that when we think of how difficult it is to solve certain problems.
Starting point is 00:11:01 We don't. And this is Ray Kurzweil's point. Ray Kurzweil, who I think is a bit of a cult leader and a carnival barker on many topics. This point he makes again and again, I think is quite valid, which is when you're factoring how difficult it will be to get into the end zone, whatever that end zone is, you're not tending to factor all of the improvements and the compounding improvements in technology along the way. I would love to get back to ray carr as well but i wanted to bring up this uh point about did you one of the things about going vegan especially when you proclaim and you go public with going vegan if you back out of that yeah yeah well that's where that's
Starting point is 00:11:38 where i am as a vegetarian yeah they get fucking mad at you yeah They get mad. Did you see that family, a couple that runs a bunch of vegan restaurants, and they decided to start eating meat again, even though they run vegan restaurants? They decided to start, they have their own farm. They raise their own cattle, and they started eating their own cattle. It was real weird, too, because they brought, there was a lot of Jesus in their message. There was a lot of, like, Jesus said that, you know, we're supposed to take care of the animal. Like biblical quotes, you know, like really obscure biblical quotes about food.
Starting point is 00:12:11 Like, oh, okay. Like, what are you doing here? But they have, here it is. Vegans revolt against owners of famous LA vegan restaurants after meat eating outed. Well, I think, I don't think you could say they outed because I'm pretty sure they put it on their Facebook page. I think I was like first cheeseburger in 15 years. Is this, this is Cafe Gratitude? Yes. Oh yeah. So the other thing that's hilarious about that restaurant, which I like, the food is good, but have you been there? No. So everyone,
Starting point is 00:12:43 now forgive me if this is no longer true, but at one point every employee there did the landmark forum, you know, the successor to Est. Oh, that's right. They got sued for that. Who got sued? This restaurant. It's one of the reasons why they closed one of their restaurants. Yeah. Explain that, though, because the landmark.
Starting point is 00:13:01 So Est, which I've never done this so again i'm speaking outside the the cult walls um but um so verner erhardt was a a 60s um you know human potential figure who started est um never met him um but uh you know obviously impressive enough as a person to get a lot of people to to spend a lot of time doing whatever he said. And he had this kind of growth course called Est, which has been – I mean, you've seen it in many movies. No name comes to mind now. But some version of this where you can keep everyone in the room,
Starting point is 00:13:44 and people have to ask permission to go to the bathroom. And there's kind of like a pressure cooker situation socially where you have everyone sort of torn down. Actually, the classic case of this, I think this wasn't Est. This was the Forum, which is now the successor to Est. They were at one point hired to do coaching of various companies. And I think they were hired by the FAA. I wrote about this in one of my books in a footnote. It was the FAA hired the forum to coach their administrators.
Starting point is 00:14:21 And one of the exercises they forced these guys to do is, and they almost certainly were mostly guys, they chained the boss to his secretary and had for like the whole day, and they had to go to the bathroom together and like, you know, this sort of, you know, ego annihilating experience. Anyway, this is the recipe that – one of the recipes that Est has pioneered. This is not to say that people don't go to the forum and get a lot out of it. I've actually met those people. But every employee of this restaurant apparently has gone or used to do the forum. So it's a very – you walk into the restaurant and your interaction with people in the restaurant is unlike most restaurants.
Starting point is 00:15:07 People are just very – lots of eye contact and it's just an intense restaurant. And also the stuff on the menu, this is just so lacerating I can never comply. But the name of everything on the menu is like I am humble, I am magical, I am self-assured. So you have to, you're meant to order it that way, like I am humble. Oh God, I'll have an I am humble? Yeah, so when it's given to you, it's you are humble. So that becomes, a little of that goes a long way.
Starting point is 00:15:43 Okay, well they're retarded. That's what's going on. And if you go to- The food is good, though. Don't get me the wrong way. The food is good. Well, it makes sense then that they would go all Jesus-y, religious-y when they were trying to justify their meat consumption. Jamie, see if you can pull up the quotes for them justifying their meat consumption because it was real weird.
Starting point is 00:16:02 justifying their meat consumption because it was real weird. I was like, how weird is it that these guys are using Jesus and religion to justify eating cows when they've been a vegan for all those years? Did you not listen to Jesus all those other years? You're like, fuck you, Jesus. I'm not eating meat. What was the turnaround there? It doesn't make any sense. Well, I don't think you can pull veganism out of the Bible unless, well, I guess if you go back to the garden, there's nothing about them eating meat, right? It just was all provided from the trees.
Starting point is 00:16:32 There's no slaughterhouse in the garden. Well, vegetarianism is fairly old. I mean, vegetarianism has been around in Hindu cultures and so many different cultures forever. But not veganism, right? I mean, veganism is really fairly recent. It's pretty impractical in deep history. Okay. Herding remnants of our best tool to restore fertility to the earth, keep the earth covered, and reverse desertification and climate change, he wrote. We need cows to keep the earth alive.
Starting point is 00:16:59 Cows make an extreme sacrifice for humanity, but that's their position in God's plan as food for the predators. Whoa. Huh. That's a strange quote, but I think there was more of them, but that's good enough. But there was, there was more of that kind of stuff. Like all of a sudden he's a predator. Like predators go after animals, they chase them down and they kill them. I guess we're kind of predators in a way, but we're some new thing. We're some completely new thing. We use weapons or we corral them. And if you've got them corralled and you just stick in that no country for old men thing in their head and killing them with it.
Starting point is 00:17:38 I mean, that's what they're doing, right? If you're doing that, I don't know if you're allowed to call yourself a predator. Well, historically, we're predators. Yes, historically. And chimps are predators. Not entirely, but they are that too. We're certainly a kind of a predator, but we're so much different now. And this is what we're talking about.
Starting point is 00:17:55 We're talking about factory farming and these weird businesses where they slam all these animals to these entirely too small places. And they live in their own feces and urine and i'm sure you've seen that drone footage from the pig farm i've seen a lot of pig farm footage but i don't know if i've seen drone footage i don't think i've seen lakes of urine and feces it's disgusting and it's unbelievable they this guy flew this thing i mean they have these ag gag laws which are evil yeah. And if you don't know what that means, ag-gag laws are laws that they make it a federal crime to show all of the abuse of these animals, to show factory farming, because it'll affect the business so drastically and so radically when people are exposed to the truth that they've made it illegal. Those laws should be illegal. Those laws are scary scary that's a scary aspect of human beings i've never seen this whether it's some toxic lake that's a lake of pee and poo yeah and all those things are stuffed to the gills with pigs yeah there's a book um uh eating animals jonathan saffron
Starting point is 00:19:03 four's book which is um worth reading if you think you're immune to the details. I mean, there's two aspects to it. There's the cruelty aspect, which is—actually, three aspects. There's the cruelty aspect, which is horrific. There's the environmental energy use issue, which is also just totally untenable. environmental energy use issue, which is also just totally untenable. And then there's just the, if you have any concern about your own health and the contamination, I mean, just getting all these antibiotics that, you know, weren't prescribed to you,
Starting point is 00:19:40 that you don't want, that are still getting into you through this food chain. And, I mean, just the stuff that they, like the chickens, I mean, the details about chicken farming is almost the most horrible, but I mean, they're covered in just the most, the foulest, no pun intended, just the most disgusting material. they go into like a broth that's just like pure bacteria. And, I mean, they have to be washed with, you know, just ammonia or whatever they put on them. I mean, it's just— They really wash them with ammonia? I forget the details, but it's just—read Jonathan Safran Foer's book on this. It's just the details of what goes on from you the chickens actually i think largely
Starting point is 00:20:28 because they're so small and more of the process of killing them is automated that they almost get the worst of it because they're i mean they're just they're like you know they're getting singed before they're stunned before they're i mean it's just it's just not i mean at least with a cow you've got a single person interacting with a single cow, however briefly, and there's less chaos in the machinery. ethically comes around just the egg industry because fully half the chickens, the male chicks, just immediately get thrown into literally like a meat grinder because they're not the same chicken that is a broiler chicken. I mean, genetically, they're not the same. They don't grow into the same kind of chicken that would be useful. So they're useless. And so they don't lay eggs and you don't eat them. And so they just get literally just fed into a, like a wood chipper alive.
Starting point is 00:21:33 I mean, there's no, and again, this is in some ways an artifact of them being so small that it would be so much of a, just too much of a hassle to stun them appropriately, right? Imagine if they had to they made a law where you had to bury them all and put little crosses in the ground be labor intensive jesus i mean i was like billions and billions you know i was on the highway and there was a chicken truck that was passing me one of those trucks that's containing live chickens and they're just stacked just stacked in cages on top of each other. Cage on top of cage, just shitting on each other. And I'm watching this and I'm like,
Starting point is 00:22:08 it's so weird that we're allowed to do that with some animals. Like if you were doing that with horses, people would lose their fucking minds. If you had dogs in boxes like that, stacked in the open air on the highway and you're driving down the road with them, people would freak out. But no one bats an eye at these chickens. This chicken truck, chickens are a weird thing. We have like a hierarchy of animals that we love
Starting point is 00:22:29 and we're not really big into reptiles. No. Not really big into birds. It has something to do, I think, with facial expressiveness a little bit. And I mean, so fish are also way down. Like the fact that fish have these cold, expressionless faces matter what happens that's um that cuts down on empathy uh but i think it's its size its facial display and it's also the sounds they make right so you know if a lobster could scream every time it went to the pot you know um it would be it would be a different encounter with, you know, just how much do I want to eat this thing?
Starting point is 00:23:08 And so something obviously like an ape or something that's cute. I mean, it's amazing what a what a fluffy tail will get you. I mean, there's a squirrel and a rat. Right. Yeah. It's not. It's crazy. It is crazy. Squirrels could just hang out with everybody. We're cool with them. They look so close to rats they're like god i'm so close it's like a hot girl sister it's like what the fuck like so so much is so close there was an article today about some woman who had rescued a lobster from a restaurant and dropped it off in the ocean and the journey of this all and how you should think of this lobster as something with a
Starting point is 00:23:44 cute face and if you did then you would appreciate her you should think of this lobster as something with a cute face and if you did then you would appreciate her efforts and understand that this lobster even though they're not even capable of feeling pain in lobsters they don't have enough nervous system their nervous system is not strong enough for them to feel pain they don't have the same sort of sensors that we have allegedly yeah i don't i i i've threatened to do this actually when i when i decided to become a vegetarian i said at some point, maybe I will just do a taxonomy of the kind of a comparative neuroanatomy across species just to see where we could plausibly say, you know, the suffering really begins to matter. Like clams? Do clams feel anything?
Starting point is 00:24:20 No, actually, there are what are called bivalve vegans who eat clams and oysters and mussels. Oh, I could get into that. Because they think that there's no—I mean, you can make an argument that there's no basis for suffering there. Yeah. Well, if there's no feeling, there's no suffering. My argument against that, though, is lobsters are clearly not happy when you throw them in boiling water. Yeah. So what's that reaction?
Starting point is 00:24:41 I think anything that can behave, that can move, right, and move away from a stimulus, the evolutionary rationale for it to experience pain. The question of consciousness is difficult, you know, where consciousness emerges. And I think there is clearly unconscious pain mechanisms. I mean, the same mechanisms that give us pain at a certain level, we can be unconscious and yet they can be just as effective. I mean, all of our reflexes are like that. So, you know, if you touch a hot stove, you're pulling your hand away actually before you consciously register it. And that's as it should be because you're faster that way. So it's possible to have unconscious pain, but anything that can move very quickly is going to evolve an ability to move away from noxious stimuli. And there's every reason to make that, in evolutionary terms, as salient and as urgent as possible.
Starting point is 00:25:43 And our pain response is that for us. And there's no reason to withhold that from any other animal that's clearly avoiding stuff with all of its power. Right. Yeah, it seems to me that there's got to be all the way down to mushrooms, because everybody eats mushrooms, even vegans. But mushrooms breathe in oxygen and they breathe out carbon dioxide. They're way closer to humans than they really are to plants.
Starting point is 00:26:10 They're weird. They're a strange sort of an organism. I didn't know that about mushrooms. Yeah, they're not necessarily like a plant. I mean, we think of them as a plant because they grow. But they're a fungus. It's a type of life form. And they interlink through the mycelium.
Starting point is 00:26:25 Yeah. Yeah. Well, Terence McKenna, if you got him talking on mushrooms long enough, he would, some very spooky stuff that he thought about them. But there's no nervous system there. I think the crucial variable is the complexity of a nervous system. So suffering and pain and emotions. Yeah. I mean, just when you're, when you're, I mean, so there's suffering is one component of it, but then there's just the question of what sort of experience can this creature be deprived
Starting point is 00:26:57 of? Right. Right. So when you ask like, like, why is it a tragedy or why would it be a tragedy to, to kill someone painlessly in their sleep, right? So there's no suffering there, right? If anything, you've just ended whatever suffering they were going to have in their future. Why would that be a tragedy if it happened to all of us tonight, right? Just, you know, there's some neurotoxin comes down from space and kills us all in our sleep and no suffering is associated with it. The only, with it, ethically speaking, the only problem there, and it's a huge one, is that it forecloses all of the possible happy futures most of us or all of us or at least some of us were going to have. So all of the good things that we could
Starting point is 00:27:41 have done over the next million years aren't going to get done. All of the beauty, all of the creativity, all of the joy, all of that just gets canceled. And so leaving the painfulness of pain aside, you know, why is it wrong to deprive any given animal of life? Well, insofar as that life has any intrinsic value, insofar as being that animal is better than being nothing, right, then you're also just canceling all of that good stuff. And for that, you know, for any good stuff, you need a nervous system. And for any pain, you need a nervous system, insofar as we understand this.
Starting point is 00:28:24 Though there are people who say— When you say any good stuff, what do you mean? Like experience, like any richness of experience. Isn't that just movement, though, right? Like, are we being prejudiced about the experiences that plants are having by being immobile? You know, have you ever thought, ever paid attention to some of the more recent research on plant intelligence and the weird stuff that's going on with them? Calculations that they're exuding, expressing some scents that causes, when they're being eaten, causes other plants downwind to change their flavor to avoid predation. There was a New Yorker article on this.
Starting point is 00:28:58 I forget when this came out. It's weird. There's weird stuff that plants do. And I remember the details of that article aren't so clear to me. um we it's weird there's weird there's weird stuff that plants do yeah which and i remember it's the details of that article aren't so clear to me i remember not knowing what to think about some of it but some of it clearly can be explained in evolutionary terms that doesn't imply any experience it just it could all be dark right like it's all blind mechanism but so the experience consciousness the consciousness as we know it is what's most valuable to you.
Starting point is 00:29:27 Well, I think it's the only thing that's valuable to anything. It's like when you're going to talk about value. Right. So it's like if this cup has no experience, right? So if my trading places with it, insofar as you can make sense of that concept, is synonymous with just canceling my experience. Well, then this cup isn't conscious. There's nothing that it's like to be the cup. When I break it, I haven't created suffering. I haven't done anything unethical to the cup. I have no ethical responsibilities toward the cup. But the moment you give me something that
Starting point is 00:30:05 can be made happy or be made miserable, depending on how I behave around it or toward it, well, then I'm ethically entangled with it. And that begins to scale in, I think, in a fairly linear way with just how complex the thing is. So this is maybe something we even talked about on a previous podcast. You know, if I'm driving home today and a bug hits my windshield, that has whatever ethical implication it has, but given what I believe about bugs and given how small they are and given how little they do and given how primitive their nervous systems are, I'm not going to lose sleep over killing a bug.
Starting point is 00:30:51 If I hit a squirrel, I'm going to feel worse. If I hit a dog, I'm going to feel worse than if I hit someone's kid, obviously I may never get over it, even if I live to be a thousand. So the scaling... and granted there are cultural accretions there, so you're like, can I justify the way I feel about a dog as opposed to a deer? You know, there's a difference. But the difference is one of richness of experience insofar as we understand what other species experience. You could make that argument to justify eating animals as opposed to being a cannibal, right? You could say, well, what kind of experience does a deer have?
Starting point is 00:31:30 They're just running around the woods, trying not to get eaten. They eat grass. They mate. It's very simple. It's a very simple experience in comparison to your average person that lives in Los Angeles that reads books. I mean, someone who goes on a lot of trips, someone has a lot of loved ones, someone who has a great career, someone
Starting point is 00:31:49 who's deeply invested in their work. Yeah. There's much more complexity. Yeah. There's much more complexity and the deer behave just as deer all over the place. And it's a very primitive sort of a life form. Well, if you go further and further back, like what it seems like you can keep going with that and uh one of the things that i that concerns me the most about plants not concerns me but puzzles me the most about plants is whether or not the way i look at them me personally my prejudice is about them i just not thinking at all of them as being conscious what if we think about things in terms of the complexity of their experiences just because we're prejudiced about things that move?
Starting point is 00:32:28 I mean, it's entirely possible that, like, it's going to sound really stupid, but I've said a lot of stupid shit. I went into a grow room once, like a pot grow room. Right. This guy had this Mac Daddy grow room. And I walked in. I was like, this is like being in a room full of people. This feels weird as fuck. It felt very weird.
Starting point is 00:32:48 It felt like there was a tangible vibe of life in there. And there's no other way to describe that without sounding like a complete moron. Yeah. But it was my experience. I know what it's like to be that moron. But you know what I mean? You take enough acid, you know what you're talking about. I'm not saying that we shouldn't eat plants.
Starting point is 00:33:08 I'm not, don't, people are ready up in arms with their Twitter fingers ready to get off. But what I am saying is it's entirely possible that all things that are alive have some sort of a way of being conscious. May not be mobile, may not be as expressive. being conscious may not be mobile, may not be as expressive, but there might be the stillness of you without language when you're in a place of complete peace, when you're in a Zen meditative state. What about that stillness is really truly associated with being a human or really truly associated with being an English speaking person in North America? Almost nothing.
Starting point is 00:33:44 It's just an existence, right? Yeah. And then everything else sort of branches out from that. And then humans, we all make the agreement that of course it branches out far further and wider than any other animal. But how do we know that these plants aren't branching out like that too? How do we know that then if they're having some communication with each other, if they're responding to predation, if they're literally changing their flavor, they're doing all these calculations and all these strange things that they're finding out that plants are capable of doing, like what is going on there? We don't know necessarily. Yeah. Well, I'm agnostic on the question of how far down consciousness goes. And I
Starting point is 00:34:20 agree that there's very likely a condition of something like pure consciousness that really is separable from the details of any given species. I mean, this is something that I've experienced myself. It feels like you certainly have this experience. What its implications are, I don't know. But you can have the experience of just consciousness, and it doesn't have any personal or even human reference point. So it's not, you're not even, it doesn't even have a reference point in one of the channels, the human sense channels. So you're not seeing, you're not hearing, you're not smelling, and you're not thinking,
Starting point is 00:35:01 and yet you are. So there is still just open conscious experience. And whether that is what it's like to be a plant, I don't know, because I don't know what the relationship between consciousness and information processing in the brain actually is. Though it's totally plausible, in fact, I think it's probably the most plausible thesis that there is some direct connection between information processing and integrated information processing and consciousness. And that there is nothing that's like to be this cup and, you know, the atoms are not conscious.
Starting point is 00:35:41 atoms are not conscious. But yet, the thesis that consciousness goes all the way down into the most basic constituents of matter, that thesis is called panpsychism in philosophy, and that's pan, everywhere, psychism, mind. So that mind is in some sense everywhere in nature. That's not, you can't rule that out. I mean, there's nothing we know about the world to rule that out. What I think you can rule out is the richness of the contents of consciousness in these species.
Starting point is 00:36:18 So plants are not having conversations like this, right? So plants don't understand what we're doing. There's no way they would. There's just no, they don't have nervous systems. They're not, they can't be processing information in a way that would give them what we know as a rich experience. But your point about the time scale and movement is totally valid. If every time you walked into a room, your, you know, your fern just turned and looked at you, just oriented toward you, and followed you around the room with its leading branch, you would feel very different about the possibility that it's conscious, right?
Starting point is 00:36:55 You'd be a good person to ask this. Is that an urban myth that if you sing to your plants, you play your plants music, that they grow better? I haven't looked into it. We need to find out. I would bet that it is, yes. It sounds like one of those things the chicks who like crystals tell you. Yeah. I would bet that it is.
Starting point is 00:37:14 I sing to my flowers and they grow so beautiful. Yeah, it seems like one of those things that people say. I'm definitely not insinuating that plants would have as rich an experience as human beings, but I don't think a deer has as rich an experience no no as a human being either and it's just to me my um my what my curiosity lies in the future of understanding plant intelligence like wouldn't it be fascinating if we found out that they like one of the reasons why psychedelic drugs puzzle me so much is that they exist in like there's a lot of plants you could just eat them and you have your brain already has this place it'll go if you eat these plants like if you
Starting point is 00:37:50 eat peyote if you uh if you try the san pedro cactus out this spring uh you you can you can have like these really powerful psychedelic experiences just from a plant like why does the human mind interact with these plants like that like especially fungus like when you have like major league mushroom trips it's very strange sort of feeling like you're in communication with another you know and mckenna described it best in that way that there's someone there that it's not just you and hallucinations It has this distinct feeling that you're there's someone there. Yeah, and that someone According to the the hippiest of hippies is Plan intelligence. It's my it's mother Gaia. It's the earth itself. It's all life. It's love. It's God
Starting point is 00:38:40 It's all it all exists inside the the intelligence that's It's God. It's all it all exists inside the the intelligence that's intertwined in nature It's one of the most things that one of the things that's most puzzling about the most potent of all psychedelics Just dimethyltryptamine that it's in so many different plants like to make Dimethyltryptamine containing plants illegal would be hilarious because there'd be like hundreds and hundreds of plants They'd have to make it legal, including like phalaris grass. It's just really rich in 5-methoxydimethyltryptamine, which is the most potent form of it. Also our own brains. Yeah, our own brains make it.
Starting point is 00:39:15 Every brain. Every brain makes it. The little weird little lizards that have retinas and lenses where their pineal gland is, they're making it in their little screwy little lizard brains We don't even know what the hell it's for Yeah, like the the questions about it just was so much there's so many more questions and there are answers They're like you've paying attention to Rick Strassman stuff the general that book. Yeah, did you know that the Cottonwood Research Foundation actually found? mice brains
Starting point is 00:39:44 Producing dimethyltryptamine so it's been proven that's actually growing in the pineal gland the gland that they had always yeah yeah i i i had assumed that i yeah if i got that from the book or not but that's crazy um as a neuroscientist like doesn't that kind of freak you out that those Egyptians had those third eyes and like all the Eastern mysticism had that pineal gland highlighted? It was like the, on the end of shafts or staffs, they would put those pine cones, like how the hell they know all that? Well, there is a, I mean, the third eye metaphor, or, I mean, it's not, it's more than a metaphor, you know, anatomically, but it's, it's a, itically, but it correlates with the kind of experience you can have. I don't actually know if the experience people have, this chakra, if you want to talk in yogic terms, I don't know if that has anything to do with the pineal gland. It may be, because I don't know, I don't
Starting point is 00:40:45 think anyone's done this neuroimaging experiment where you can get people who can reliably produce a third eye opening sort of experience and scan their brains while they do it. In fact, I'm almost positive that hasn't been done. But there is a phenomenology here of people having a kind of inner opening of, it's almost certainly largely a matter of visual cortex getting stimulated. But you can meditate in such a way as to produce this experience. And it's also an experience that you have more or less on different psychedelics. Some psychedelics are much more visual than others at certain doses, in particular like mushrooms and DMT, which I've never taken, which you can say better than I, but it's reported to be quite visual. And the experience, so most people when they close their eyes,
Starting point is 00:41:42 unless they're having hypnagogic images before sleep or they just happen to be, you know, super good visualizers of imagery, you close your eyes and you just basically have darkness there, right? Now, if you close your eyes and if you're listening to this and you close your eyes and you look into the darkness of your closed eyes, that actually is, that is as much your visual field as it is when your eyes are open, right? I mean, it's not like your visual field hasn't gone away when you close your eyes. It's just, it's now, there's not much detail for you to notice, again, unless you are in some unusual state. But that, you know, based on different techniques of meditation, and this happens spontaneously, again, with hypnagogic images or with psychedelics, that space can open up into just a massive world of visual display, right? So you can just see a full-blown, you know, 3D movie in there. And it's a, but most of us just take it for granted that when you close your eyes,
Starting point is 00:42:49 you're functionally blind, you can't see anything, and we're not interested in that space. But you can actually train yourself to look deeply into that space as a technique of meditation. I don't want to interrupt you, but does that have implications in people having eyewitness testimony and eyewitness experiences that turned out to not be true at all? Because if you think about the human mind and the imagination being able to create imagery once the eyes are closed, like you can in sensory deprivation tanks. Sensory deprivation tanks, a lot of people's experiences are very visual, even though it's in complete darkness.
Starting point is 00:43:26 Now, you know how people see things and they thought they saw something and it turns out to not be true at all, whether it's Bigfoot or whether it's a robbery or a suspect. And they get the details completely all wrong. and when your pulse is jacked up and your adrenaline's running and you're worried about all these possibilities and your imagination starts formulating predetermined possibilities you should be looking out for. Like what if it's Bigfoot? What if it's a robber? What if it's an alien? What if it's a this?
Starting point is 00:43:57 And then these people that swear they saw these things that everybody knows they didn't, like maybe there was video footage of it or whatever it was. Is it possible that your brain can do that to you and can literally show you things that aren't real? Oh, yeah, it can do that. Although I think the unreliability of witness testimony, and it's shockingly unreliable, is more a matter of memory, the corruption of memory and the way memories are recalled. They're especially vulnerable when you're recalling them. They can be revised in the act of recall. And it's very easy to tamper with people's memory, you know,
Starting point is 00:44:39 albeit inadvertently. I mean, you can do this on purpose too, but people just do it with bad interrogation techniques. So, you know, the cop will the cop will ask you, so when did you see the woman at the crosswalk? And he's just put a woman at the crosswalk into your memory, right? Because you can't help but visualize a woman at the crosswalk. And it's just, memory is very fragile. And so whenever you're giving an account of an experience, even if it's an experience that happened half a second ago, now we're in the domain of memory. Now we're in the domain of just what you can report. It's not a matter of what you're consciously experiencing. consciously experiencing. Now, I know there was a case in India where I believe it was a woman was convicted of murder through the use of an fMRI or a functioning magnetic resonance imagery machine. And through this fMRI, they determined in some strange way that she had functional knowledge of the crime scene. And the argument against that, I believe, was that she had functional knowledge of the crime scene. And the argument against that, I believe,
Starting point is 00:45:52 was that she could have developed functional memory of the crime scene by being told you're being prosecuted for a crime. You might go to jail for murder. Here's the crime scene. Or just being unlucky enough to be familiar. I mean, normally when this gets done, and there are people who do it in the States, but they don't use fMRI as their modality, but they do interrogate people's familiarity,
Starting point is 00:46:16 and they use EEG as a way of monitoring their familiarity. as a way of monitoring their familiarity with it. So they'll show them, you know, if you are shown evidence from the crime scene that only the perpetrator could have seen, you know, hopefully it's really something that only the perpetrator could have seen. But if, you know, if they show you the picture and, you know, you see the, oh, yeah, you know, I have that Ikea end table and, you know and I have that dress from Banana Republic or whatever, just by dint of bad luck, you're familiar with something that you're being shown from the crime scene.
Starting point is 00:46:52 And especially if it's a murder, you're talking about someone who she was probably intimate with. She probably knew them at least. So she's probably been to their house. Yeah, that would be obviously a case where you really couldn't do it at all. How does it work?
Starting point is 00:47:16 Well, no one in the States, as far as I know, unless this has changed in the last year or so since I've paid attention to this, none of this is admissible in court. Right. Only in India. Yes. And only the one case that I ever heard of. Yeah, I mean, it's a crazy premature use of the technology. But how does the technology work? Like, what is it actually seeing? Well, you have, with EEG, I think this has worked out more. You can have a kind of a canonical familiarity response to a stimuli. If you've seen something again, if you've seen something for the first time, you know, it would be a novelty response. Seeing something for the third, fourth, fifth time would be a different response. And
Starting point is 00:47:50 you can, I mean, this has been, again, it's been a long time since I've looked at this particular research, and I don't know how, you know, I don't know what they're calling these waveforms now. I mean, there was a P300 waveform at one point, and there are waveforms that come from certain areas of the brain at certain timing intervals based on, you know, from the moment of receiving a stimulus, you know, let's say a photograph. And they are the hallmark of either being familiar or not with this thing. of either being familiar or not with this thing. And, yeah, so, I mean, it's not, there's no question that at a certain point we will have reliable mind-reading machines. I think it's really just a matter of time.
Starting point is 00:48:38 I think there's also no question that we don't have them now, at least not in a way that we can send someone to prison on the basis of what their brain did in an experiment. But, I mean, just as you, I mean, anything, well, a lot of the most interesting stuff is unconscious, but anything you're consciously aware of having seen before, right? So if you were to show me this cup, right, and then five seconds later say, is this the cup I showed you? You know, I have a very clear sense of, yeah, that's the cup, right? And if you show me a completely different cup, I'm going to have a very clear internal sense of, no, that's not the cup. If you're having that experience, you know, that is absolutely something
Starting point is 00:49:28 about the state of your brain that can be discriminated by a person outside your brain running the appropriate experiment. It's just our tools are still sufficiently coarse that, you know, it's not like, you know, sequencing your DNA where we can say, yeah, that was you and it was your blood at the crime scene. But eventually it will be, it will be just, there'll be no basis to doubt it because I'll be able to put you through a paradigm where it'll just be clear, I know your thoughts, right? So think of how much faith you would have in this technology if you could open your computer and read any file, a file of your choosing that I have never seen, right? So the contents of which are completely, I'm completely
Starting point is 00:50:22 blind to. And I'm scanning your brain while you're reading this journal entry or a newspaper article or whatever. And at the end of that, I can say, well, you know, based on this report, you clearly read a story about Donald Trump. And you actually don't like Donald Trump. And I could tell you, you know, in detail about, you know, what you were consciously thinking about. You know, if you did that, if you could do that 100% of the time, right, at a certain point, the basis to doubt the validity of the mind reading machine would just go away. You would just, it would be like, you know, it's like, are you really hearing my voice right now? Yeah, you certainly seem to, you know, it's, it's just, it would become a, just a, a background assumption that the technology works at a certain point.
Starting point is 00:51:10 Just as the, the meat is scary, the fake meat, that idea is terrifying. The idea of losing the, the losing the privacy of your thoughts. Yeah. The ownership, like losing a thought, becoming non-autonomous, like the idea of everybody sharing thoughts, it seems almost inevitable. Well, this, you know, I don't know that we would decide to do this. Certainly, I don't think we would do this all the time, right? I mean, it might be fun to do it sometime. But someday, it might be no option. We might be compelled to do it.
Starting point is 00:51:41 I mean, if you're at a murder trial, right, if you're someone who may have killed somebody, you, I mean, the Fifth Amendment becomes interesting in that case. But we have worked that out for DNA evidence, right? So you have a right to take the Fifth, but you don't have a right to withhold your blood from the proceeding, right? So I get to look at your blood or your saliva. So you don't have to talk, but I can read your mind. Yeah, yeah. And I think that's a—well, and this is like the iPhone case. You know, it's like we've thus far decided that we can't compel Apple to unlock an iPhone. This is just the ultimate case of that. So the argument in favor of Apple is this is far too intimate and far too comprehensive a look at a person's mind. You know, once you have cracked their iPhone,
Starting point is 00:52:32 you're basically walking around the corridors of their brain. That's a slippery slope we don't want to go down. but ultimately we will be able to do it with brains. And I think, obviously, you're going to want all of the safeguards that you can easily imagine, and probably some we have yet to imagine on this process. But safeguards in place, I think this would be the best thing that could ever happen to us in terms of, you look at a criminal justice system. When you look at the price we pay for not being able to tell whether someone is innocent or guilty or whether they're lying, it is the most intolerable price that I see in society. It's just the amount of human misery born of not being able to demonstrate that someone is lying reliably. It's just, you know, it's off. It's the biggest lever that I
Starting point is 00:53:34 think we could pull. Certainly when you have prisons that are filled with a lot of people that are probably innocent. Oh, yeah. Oh, yeah. People have gone to the gallows, no doubt. I mean, we know innocent people have been killed. And, you know, one way to deal with that is just, you know, be against the death penalty. Right. So there's always a chance to find their innocence. But imagine being in prison on death row for 30 years and you're, you know, for rape, a rape you didn't commit. You know, it's it's so horrible. It's horrifying. So, yeah, that would be the major argument.
Starting point is 00:54:04 It would be such a simple solution, really. Now, there are corner cases that wouldn't be solved. There are people who are delusional or easily self-deceived who either would be lying but so convinced of the truth of their lives that they would pass this lie detector, presumably. the truth of their lives, that they would pass this lie detector, presumably. And then there are people who are so suggestible that they can be led to believe something that's not true. I mean, there are people who can be convinced, you get these false confessions of, in response to a crime. I mean, it's one of the strangest things in the world.
Starting point is 00:54:41 But, you know, when someone gets murdered, this doesn't get publicized very much, but it's a very common experience of police officers or police departments to hear from people in the community who are confessing to the crime, and they didn't commit it. They just come in and they say, I did it. And they give all this, you know, this bogus account of what happened. And this is a sign of mental illness, or these are people seeking attention in some morbid way. But there are people clearly who are so suggestible that they can either lead themselves to believe or be led by others to believe that they've done things they didn't do in just shocking detail. There was another New Yorker article on, I think it was written by William Langewisch, this was years ago,
Starting point is 00:55:35 but on a satanic panic case where a guy got accused of running a satanic cult by, I think, his daughter, who was in hypnosis recovery therapy, right? So she had been led down the primrose path by her therapist, and so she obviously was fairly suggestible. and she recalled the most lurid, just insane Rosemary's Baby-style cult imaginable going on in her town where the friends of Dad were coming over and raping everyone, and there was a human sacrifice of infants, and the infants were buried by the barn. She recalled all this. The Dad was so suggestible that he just confessed.
Starting point is 00:56:29 And he was giving more details, and he went to prison. I don't know. This is now an old article. It's probably like 10 or 15 years old. But this guy just fully confessed, right? And the journalist, who I believe was Langewisch, at the end, they actually did a test. This guy's now in prison, right? The process is completed.
Starting point is 00:56:51 Justice has been done. The daughter is convinced that her dad is a satanic monster, as is he, and he's now in prison. And they went in and interviewed him, asking follow-up questions with details that they made up. Right. Because they began to suspect that he was just this kind of suggestibility machine. Right. Who just would cop to anything. And so they went in and they just made up stuff like, oh, you know, there's a few more details we want to iron out. Your daughter said that there was a time where you brought in a horse, and then you were riding on the horse, and then you killed the horse. I'm making these details up because I don't remember, but it was something that they just concocted, right? And he said, oh yeah, yeah, yeah. And he just copped to it. It became like a Twilight Zone. It's
Starting point is 00:57:39 like a perfect Twilight Zone episode, where now you realize that this guy who has been put away is just saying yes to everything. Right. And how they resolved it, I don't know. Again, I don't know if he ever wrote any follow-up on this, because, as I recall, this is like a 15-year-old story. It just ended with, you know, the Twilight Zone moment where now you realize this guy is innocent and just saying yes to everything. And his daughter's crazy. Right.
Starting point is 00:58:08 Because she shares his genes, probably. I don't remember. I don't recall what the daughter did with that. But the story, and perhaps there's more to the story, but the story on its face was totally exculpatory. Like the reader experience was, you've got to let this guy out of prison tomorrow. Wow. But he's fucked. I mean, even if he gets out,
Starting point is 00:58:30 his mind is probably so screwed up by this whole experience. And if he really does believe that he ran these satanic rituals, just the guilt and shame of it all. I mean, we're assuming that his mind works. I mean, this is the thing that I wonder, how many people are, like,
Starting point is 00:58:49 functionally deeply, deeply damaged, but they're functional. Like they're going to the same schools that you go to. They work where you work, but they're barely a person. Like all their connections are all fucked up. Like if you went to the back wiring of their head, if you were like an appliance repair person: so, your TV's working, huh? Let me go back here. What the fuck is all this? I mean, how many people are like that, that are just sort of, kind of functional? My question is, if we do get to a point where you could read minds, what if you go into their minds and you find out, well, this is what they really think? Like, this is not a liar. This is a person who's seeing things that aren't there. Like a person who's completely
Starting point is 00:59:27 delusional. Like people that have these hallucinogenic visions. Like some people have really deeply troubling visual images that they see. Imagine if these poor fucking people really are seeing that. And if you could read their mind, you would literally be inside the mind of a person whose mind isn't functioning, and we can get sort of an understanding about what that would be like. Yeah, well, I mean, this has been done in a very simple way with schizophrenics, who mostly have auditory hallucinations. You can now detect auditory cortex activity. So mishaps, misinterpretations you can detect? We just know that their auditory cortices are active in the same way that when you're hearing my voice, it's going to be active.
Starting point is 01:00:12 When they're hearing internal voices, it's active. Whoa, that's crazy. Which is what you would expect. I mean, the surprise would be the other way. If nothing was going on in the auditory cortex and they were hearing voices, then that would be more surprising to me. That's fascinating. You can watch the hallucinations take place inside the mind of the person. Whoa. And you can stimulate hallucinations in people. You can stimulate out-of-body experiences in people with transcranial magnetic stimulation.
Starting point is 01:00:45 Yeah. How do they do that? They've been able to do that recently, to try to give people the same sort of experience that they claim to have had on the operating table. Yeah. That's what people report a lot, right? When they're almost dead. Yeah. What's going on? What is that? Well, there's an area, I believe this is at the temporoparietal junction, which is sort of here. He's pointing to above his ear.
Starting point is 01:01:12 Yeah, yeah, yeah, sorry. A lot of people listening. I'm talking to you, not talking to the millions. But it's, you know, where the temporal lobe and the parietal lobes intersect. And I think it was first discovered in surgery on an epileptic, or in any kind of resection of the brain where people are awake, because there are no pain receptors in the brain. So you can stay awake while you're getting brain surgery, and they tend to keep you awake if they're going to be removing areas of the brain, you know, let's say a tumor or the focus of an epileptic seizure, and they don't want to remove working parts, especially, you know, language parts.
Starting point is 01:01:57 So they're keeping people awake, and they're probing those areas of the cortex to see what it's correlated with in the person's experience. So they're having them talk, they're having them answer questions, and they're putting a little bit of current in that area, which would be disruptive of normal function. And, you know, they're mapping, almost entirely mapping language cortex when they do this. But there have been experiences where a neurosurgeon will put a little current in that area, and people report that they're up in the corner of the room looking down on their bodies: the classic astral projection experience, or the near-death experience where people have risen out of their body, or seem to have risen out of their body. And consciousness now seems to be located elsewhere.
Starting point is 01:03:08 That region of the brain, I mean, virtually every region of the cortex does many, many things. There's no one region of the brain that does one thing. There are a couple of exceptions to this. So the whole brain is participating in much of what we do, and it's just greater or lesser degrees of activity. But in terms of your mapping your body
Starting point is 01:03:36 in space, the parietal lobe has got a lot to do with that. And when that gets disturbed, you can have weird experiences. You can have the experience of not recognizing your body or parts of your body, like alien hand syndrome, where this left arm seems like another person's arm, and people try to disown half their body.
Starting point is 01:04:14 And you can trick people with, you know, visual changes of display. Like you can wear headgear where you can make me feel, it's called the body-swapping illusion, like my consciousness is located in your body, looking back at me. There's a clever experiment that they did that is the ultimate extension of what has long been called the rubber hand illusion. Like, my two hands are on the table now. You can set up an experiment where, while I assume I have my two hands here, you put a rubber hand in one hand's place and touch this rubber hand with a brush, right? So I'm seeing the rubber hand get touched with a brush,
Starting point is 01:05:09 and I can feel like my hand is being touched, because my actual hand, which is elsewhere under the table, is being touched with a brush at the same time. I can feel like my hand is now the rubber hand, right? So I can feel like my hand is in place of the rubber hand, based on visual and tactile simultaneity: my seeing the rubber hand get touched with a brush while feeling my hand, which is now under the table, being touched with a brush.
Starting point is 01:05:38 I'm not explaining that setup great, but people can look it up. But you can do the same thing to the ultimate degree with this video goggle display, where I'm getting visual input from where you're standing. So like if you come up to shake my hand, right, I'm seeing you come up to me and shake my hand, but I'm seeing it from your point of view, right? So I now feel like I'm walking up to me, shaking my hand.
Starting point is 01:06:08 Right. And you can just kind of feel like your consciousness is over there, you know, outside your body. And this is just to say that our sense of self, our sense of being located where we are in our heads, is largely, and in some cases almost entirely, a matter of vision. The fact that you feel you're located where you are is because, I mean, that's where your eyes are.
Starting point is 01:06:40 You're behind your eyes. You feel like you're behind your eyes, and tricks of vision can seem to dislodge that sense of being located there. So stimulating one area of the brain electrically has been shown, like even just transdermally, just on the, I guess, what do they do? They put little, what are those things called? Those little glue-on things? Electrodes? Is that what it is? You're talking about EEG there. That's reading from the brain rather than... No, I'm actually talking about some new experiments that they've been doing that they were talking about on Radiolab. And apparently there's a bunch of do-it-yourselfers. Have you ever listened to Radiolab? Yeah, it's been a while.
Starting point is 01:07:23 Amazing podcast. One of the best ever. But it had this one episode that dealt with people learning certain skills while the outside of their brain was being stimulated with a little electrode. And this woman, who was one of the reporters, went to a sniper training thing. They set up this scenario and they give you a fake gun. You point at the screen and you try to hit the target as all these things are happening. She did it once. Yeah, Nine Volt Nirvana is the name of the episode. Right, right. It is an outstanding episode. It's so fascinating. Anyway, she goes through it once, and she's terrible at it. She just sucks terribly. Then they hook her up to
Starting point is 01:07:59 this machine. They attach this electrode to a certain area of her brain and stimulate it. And she goes through it like a fucking sniper. Time slows down. She gets 20 out of 20. So she goes from being a complete failure to being awesome at it in some weird flow state that she described. And they're talking about how the U.S. government's using it. They're trying to train soldiers and snipers and people to try to understand this mindset and try to
Starting point is 01:08:25 achieve this mindset, and that they're trying to do it. And there are certain companies that are experimenting with it, at least, by stimulating the outside of your head. So I don't know this particular story. I remember hearing that title, though. But the transcranial magnetic stimulation is magnetic energy, which is the flip side of electrical energy. So if you apply a big magnet to the side of your head, you are changing the electrical properties of your cortex. Is that what they're doing?
Starting point is 01:08:57 Well, no. I'm wondering if they've just made a transcranial magnetic device so small that it's... I don't think it's magnetic. So it's direct current. Yeah. Well... Yeah. Okay. Direct current stimulation. Transcranial direct current stimulation. Yeah. That's what it's called. tDCS. Yeah. Well, so yeah, I'm unaware of the specifics of this, but it's been true for a long time that with a much bigger device in a lab, you can change a person's performance, for good or for ill, on various tasks. And it's just an electromagnet, which is focusing its energy on various areas of the cortex.
Starting point is 01:09:38 So both are. Yeah. Because what you're doing, I mean, you are disrupting neural firing, but you can disrupt areas that are inhibiting other things you want to release. So it's not always synonymous with the degradation of performance. I mean, you could increase performance on a certain task by taking one region of the brain offline, or more or less offline. But I'm not aware of how far they've taken it in terms of doing anything that seems useful in terms of performance. I mean, the research I'm more aware of is just using this to figure out what various regions of the brain are doing. I mean, they're kind of mapping function, because you want to see, if I disrupt an area here, how does that show up in an experiment? And that gives you some clue as to what that region is
Starting point is 01:10:37 doing, at least in that task. As much as we know about the mind and being able to do things like this, like overall, if you had to really try to map out the exact functions of the mind and how everything works, how far do you think we are along to understanding that? Are we halfway? Do you think we understand half of how the brain works? No. Well, I mean, I wouldn't even know how to quantify it at this point. We know a ton, right? Like we know a lot about where language is and where facial recognition is. So your visual cortex has been really well mapped. And we know a lot. And for the last 150 years, based on just neurological injury, and then in the last decades, based on imaging technology,
Starting point is 01:11:29 we know regions of the brain that absolutely govern language and regions of the brain that have basically nothing to do with language, to take one example. And we know a lot about memory, and we know a lot about the different kinds of memory. But I think there's much more we don't know. And what's even more, kind of the greatest friction in the system, is that there's not often a lot to do with what we know, right? So knowing is not enough for certain things, because to intervene is another part of the
Starting point is 01:12:13 process, where there are no guarantees. I mean, the way we can intervene in the functioning of a brain is incredibly crude, you know, pharmacologically or with surgery or with, you know, a device like that. So to get from a place of really refined knowledge to a place of being able to do something we want to do with that knowledge, that's another step. And that's a step that, you know, there's no reason to think we're not going to take at some point, but it's an additional complexity to get inside the head safely and help people, or improve function, even if you know a lot about what those areas of the brain do. But we haven't cracked the neural code.
Starting point is 01:13:00 We don't know how consciousness is arising in the brain. We wouldn't know how to build a computer that does what we do, to say nothing of experiencing the world as we experience it, yet. And we may be going down paths where we will build it more by happenstance, where we might build it and not quite know how it's doing what it's doing, but it's seeming to do more or less what we do. We'll be doing it very differently. I mean, there are two paths, or at least two distinct paths, in artificial intelligence. One path could try to emulate what the brain is doing, and that obviously requires a real detailed understanding of what the brain is doing. Another path would be to just ignore the brain, right?
Starting point is 01:13:55 So there's no reason why artificially intelligent machines, even machines that are superhuman in their capacities, need to do anything that is similar to what we do with our brains, you know, with neurochemical circuits. Because they could be organized quite differently, and obviously made of totally different stuff. So, whether you want to go down the path of emulating the brain on the basis of a detailed understanding of it,
Starting point is 01:14:31 or you just want to go down the path of maximizing intelligent behavior in machines, or some combination of the two, they're distinct, and one doesn't entail really knowing much about the brain, necessarily. So there's really two different ways they can go about it. Either they could try to reproduce a brain. Yeah. And people, they can make fake meat. Yeah. I mean, if they can make fake meat, why can't they make fake brain tissue? It seems like they could. I mean, I know a woman who got her bladder replaced with stem cells. Did you hear about that? They took stem cells and recreated her own bladder; she had bladder cancer. And so they built her a new bladder in a laboratory and put her
Starting point is 01:15:16 own bladder back in her body. That is insanely fascinating. Now, if they can figure out how to extract a little bit of brain tissue. The thing is that the brain is famously the most complicated object in the universe. A bladder is essentially a bag. Yeah. So it's a... Big leap. It's a big leap, but it's not... I think it's a leap we will take.
Starting point is 01:15:41 And whether we take... It may be a leap we take in this stepwise way, where we build machines down a path that is not at all analogous to recreating brains, which then allow us to understand the brain, you know, totally, in the Ray Kurzweil sense, where we can upload ourselves, if that makes any sense. But I think information processing is at bottom what intelligence is. I think that is not really up for dispute at this point, that any intelligent
Starting point is 01:16:28 system is processing information, and our brains are doing that. And any machine that is going to exhibit the kind of general intelligence that we exhibit, and surpass us, will be doing, by dint of its hardware and software, something deeply analogous to what our brains are doing. But again, we may not get there based on directly emulating what our brains are doing. And we may get there before we actually understand our brains in a way that would allow us to emulate them.
Starting point is 01:17:04 It's always very interesting to me how there are always pushes and pulls in life. And when you have things that are as horrific as factory farming and people are exposed to it, then there's this rebound, where people are trying to find a solution. And I always wonder, like, will that be the first artificial life that we create, like zombie cows? Like, maybe if we figured out that meat in the lab is not good because it has to actually be moving around for it to be good for you, maybe they'll come up with some idea to just, look, we're going to make zombies.
Starting point is 01:17:44 We're going to make livestock that essentially can just move forward and consume food. There's no thought whatsoever. These are zombies. You can go right up to them. You wave your hand in front of them. They don't even move. Is it okay to kill those? And then go from that to making artificial people. Because it seems to me that artificial people, it's going to happen. I mean, it's just a matter of how much time. If they're making bladders, then they're going to start making all sorts of different tissues with stem cells to try to replace body parts and organs, and they're going to work their way through an actual human body. It's going to happen. Do you mean a brainless person that would
Starting point is 01:18:19 be like spare parts for you? That, for sure, that's an option. But I think also an artificial human. I mean, it might take a thousand years, but I think if human beings continue to evolve technologically, within the next thousand years we're going to have artificial people that are completely... So you mean they're people,
Starting point is 01:18:39 so we're going to build a biological person. 100%. So it's a synthesized person, right? So you're not talking about the perfect robot. You're talking about an actual person built up cell by cell. You know, there's no reason why that's not possible. Whether or not we would do it is another question. But I had this idea.
Starting point is 01:19:03 Somebody would. China. Yeah, well, presumably there are limits even in China. Aren't they doing experiments already with human embryos? I don't know what they're doing in China.
Starting point is 01:19:14 I believe they are. I believe they got the go-ahead. I just know they have that dog festival that every year I see people tweet about. China's a big place, though. Can't lump them all in there. But, you know, what is that gene-splicing software?
Starting point is 01:19:26 CRISPR. CRISPR, yeah. They're using CRISPR on human fetuses or human embryos. Right. So good luck. Good luck, world. China's creating super athletes. There are many issues there.
Starting point is 01:19:38 So when you're talking about changing the genome, and especially when you're talking about changing the germline, you know, then it gets passed on to future generations. That has big implications. But, you know, I don't see why. I mean, this goes to, it's like the artificial meat conversation. So to grow meat in a vat is ethically the same thing as, at least in my view, it'd be the same thing as producing a brainless cow, right? So you have the whole cow that you could slaughter, but it has no
Starting point is 01:20:14 brain. So presumably there's no experience in this animal, but it is the fully functioning animal, right? So let's say you could produce that, and you would produce healthy meat. It's just messier. Presumably you have to feed this thing, right? I don't know how you get it to eat, but let's say you feed it intravenously. It all begins to look weirder and weirder, but there's no suffering there, because there's no brain. I think we would, and I think we have, decided to bypass that vision and just go straight to the vat and just build it up cell by cell
Starting point is 01:20:54 and build up only what we need, which is the meatball or the steak. So why have the fur and the organs that you don't want and the mess and the energy-intensive aspects of producing a whole animal. And I think with, like, spare parts for humans, rather than create a clone of yourself that has no brain that you just keep in a vat somewhere in your garage where you can get spare kidneys when you need them,
Starting point is 01:21:22 we would just be able to print the kidneys. Because that gets around a lot of the weirdness, right? It would be weird to have a copy of yourself that's just spare parts. Whereas it wouldn't be weird, or at least in my view it wouldn't be weird, it would be fantastic, to be able to go into a hospital when your kidneys are failing and they just take a cell and print you a new kidney. Yeah. I think that can be expected. Yeah. I mean, if it's possible, it seems like it's going to be possible. The bladder example you just gave is what's happening there. It's amazing. But I always want to extrapolate
Starting point is 01:22:02 things to some bizarre place a thousand years from now, for some reason. Because, you know, since I got into Dan Carlin's Hardcore History, it really fucked my mind up about how I think about the past, in this way that I look at a thousand years ago in comparison to today. And, you know, I try to think, well, how much different will people be a thousand years from now? And probably way more different. Yeah. I mean, the fascination that we have with ancient history is that we,
Starting point is 01:22:33 one of the things, obviously, is we want to know where we came from, but also we can kind of see people today doing similar shit if they were allowed to, like if everything went horribly wrong. People at their base level are kind of similar today as they were a thousand years ago. Yeah, one of them might be running for president. We can talk about that. And when I think about the future a thousand years from now, with the way technology is accelerating, and just the capacity that we have and ability to change things, to change the world, to change physical structures, to change bodies, to dig into the ground and extract resources.
Starting point is 01:23:14 Like we're getting better and better at changing things and manipulating things, and extracting power from the sun, and extracting salt from the water. There's all this bizarre change technology that's consistently and constantly going on with people. And it continues to get better. When I think about a thousand years from now, and artificial people, and this concept of being able to read each other's minds, and being able to map out imagery and pass it back and forth from mind to mind in like a clear spreadsheet form. Like, what is that movie? The Tom Cruise movie? Minority Report. Thank you. I mean, that's on the way. It's on the way. It's going to be incredibly strange to be a person. Yeah. Whether we will be people in a thousand years...
Starting point is 01:23:58 I think that, unless we have done something terrible and knocked ourselves back a thousand years, we will decide to change ourselves in that time in ways that will make us... there may be many different species. It's like tattoos. You have a bunch of tattoos. I have none. You could take that a lot further if you can just begin really tinkering with everything. Oh, yeah, if you want to get nutty and put bolts in your head and shit. Or just give yourself fundamentally different genetic abilities. I mean, you could become a different species if you took it far enough.
Starting point is 01:24:40 Maybe that's what the aliens are. Maybe that's why they all look the same. Maybe they figured it out. Like, look, you got to look alike. Otherwise, you can't appreciate each other. One guy's weird looking. One guy's short. One guy's got a big nose.
Starting point is 01:24:54 Everybody's so confusing. Too many people I could hate. Everybody looked exactly the same. So the government gets together with all the people on their planet. And they go, look, we have a problem with this good-looking thing. It's bullshit. It's holding us back. A lot of people, they're stupid, they're getting ahead. We've got to all look the same: blank, emotionless, and big giant eyes. Yeah, that's it. They all just went
Starting point is 01:25:12 in. Yeah, with a passion for molesting cattle. Allegedly. I think people are blaming that on the aliens. I had this guy on my podcast, David Deutsch. Have you heard of him? He's a physicist at Oxford. Yes, I have. Why have I heard of him? He wrote a book. He gave at least one TED Talk, and he's written two very good books. The first came out about 10 years ago, The Fabric of Reality. And the more recent one is The Beginning of Infinity. An extremely smart guy and very nice guy. And he has this thesis, which he and I don't totally agree about the implications going forward for AI, but he's convinced me of his basic thesis, which is fascinating, which is the role that knowledge plays in our universe,
Starting point is 01:26:07 or the potential role that it plays. And his argument is that in any corner of the universe, anything that is compatible with the laws of physics can be done with the requisite knowledge. So he has this argument about how deep knowledge goes and therefore how valuable it is in the end. So I'm cuing off your notion of building an artificial person, literally cell by cell or atom by atom. There's every reason to believe that's compatible with the laws of physics.
Starting point is 01:26:44 I mean, we exist, right? So we got built by the happenstance of biology. If we had what he calls a universal constructor, you know, the smallest machine that could assemble any other machine atom by atom, we could build anything atom by atom, right? And so he has this vision of it, like, you could literally go into an area of deep space that is as close to a vacuum as possible, and, you know, begin sweeping up stray hydrogen atoms and fuse them together and generate heavier elements.
Starting point is 01:27:27 I mean, literally, so you could start with nothing but hydrogen, right? And with the requisite knowledge, build your own little fusion reactor, create heavier elements, and based on those elements, create the smallest machine that can then assemble anything else atom by atom, including more of itself, right? And you could start this process of building anything from a person to something far more advanced than a person. To a planet? Just anything that's made of atoms, right?
Starting point is 01:28:01 Anything that is organized. And so the limiting factor in that case is always the knowledge, right? So the limiting factor is either the laws of physics, either this can't be done because it's physically impossible, or the knowledge is what you're lacking. And given that human beings are physically possible, there should be some knowledge path whereby you could assemble one atom by atom, right? There's no deep physical reason why that wouldn't be the case. The reason is we don't know how to do it, right?
Starting point is 01:28:38 But presumably it'd be possible for us to acquire that knowledge. And so the horizon of knowledge just extends functionally without limit, right? We're nowhere near the place where we know everything that's knowable, as witnessed by the fact that we don't yet know how to build a human atom by atom. But when you imagine just the changes that could occur in our world with the frontiers of knowledge explored 10,000 years beyond where we are now, we would be unrecognizable to ourselves.
Starting point is 01:29:22 Everything would be equivalent to magic if we could see it now. And most of human history is not like that. Most of human history, if you dropped into any period of human history, it was, for all intents and purposes, identical to the way it was 500 years before and 500 years before that. It's only very recently where you would drop in and be surprised by the technology and by the culture and by what is being done with language and the consequences of cooperation among apes like ourselves. And so I think that, I mean, this is one place where someone like Kurzweil makes a lot of sense. This is clearly accelerating, right? And if we don't do something catastrophic to set us back, this acceleration is, you know, the implication is that the future is going to be far less recognizable than it has been in any other period of human history.
Starting point is 01:30:29 The idea of creating a little machine that you could shoot out into the universe and it'll build you a planet. Yeah, yeah. But a planet's obviously big, and the basis for our biosphere and everything we care about, but the planet is not what's complicated. It's the ecosystem. It's the life on the planet, and our own brains being the ultimate example of that complexity. But presumably, intelligent systems can become much more complex than that, right? There's no reason to think that we are near the summit of possible intelligence, biological or otherwise.
Starting point is 01:31:12 And, yeah, once you begin thinking about building things atom by atom, then the future begins to look very weird. And automating that process, right? This is the promise of nanotechnology, where you have tiny machines that can both build more of themselves and more of anything else that would be made of tiny machines, or assemble anything atom by atom, or treat your own body like the machine that it is and deal with it atom by atom. I mean, the possibilities of intervention in the human body
Starting point is 01:31:59 are then virtually limitless. That's where the physical world begins to look just totally fungible. When you're not talking about surgery, where you're cutting into someone's head and hoping, in very coarse ways, that you're not taking out areas of brain that they need,
Starting point is 01:32:25 but you're talking about actually repairing. I mean, if you can tinker with atoms in a way that you understand, then you're talking about repairing anything. And creating anything. Yeah. Like literally Dr. Manhattan style, build ice condos on Mars. I mean, you could create art with it. You could do anything with it. It would just somehow or another have to be programmed with whatever pattern you
Starting point is 01:32:50 were trying to create. Yeah, you could essentially make an Earth somewhere else, with all the biological diversity, water, even intelligent life. It all could be done through some sort of a machine. Well, ultimately, one day. It is, at bottom, again, just the information. It's the knowledge of how it's organized and how to implement it. So it's analogous to what has happened in films now, where the animation in film has gotten good suddenly, right? Where they can animate, you know, waving hair and flowing water, and it looks pretty damn good. It looked terrible 30 years ago, right? And we just acclimated to it looking terrible.
Starting point is 01:33:35 But now you can really trick the eye. You can build up scenes of nature where they're not actually using any photography of nature to build it up. It's all a confection of ones and zeros, right? They've just built it in software. Like the Revenant. Perfect example. That giant bear.
Starting point is 01:33:59 The attack on Leonardo DiCaprio. I don't know how that was done. I mean, I had heard that that was all just pure CGI. Right. Yeah, there was a dude in a costume that acted it out with him, but essentially it was just all CGI. Yeah. So the fact that that's beginning to look good, I mean, obviously that's just all surface. Building a rendering of a bear on film is not the same thing as building a bear. But the fact that we can move so far into modeling that kind of complexity visually, you just imagine what a, you know,
Starting point is 01:34:40 a superintelligent mind could do with a thousand years to work at it, right? And that's what we're on the cusp of. And when I say cusp, you know, I don't mean five years, but let's say a century. We're on the cusp of producing the kind of technology that would allow for that. And if we put it into perspective, photography, I don't believe, was even invented until the early 1800s. Right? Is that when it was first? Is there photographs of Lincoln? There are, though.
Starting point is 01:35:11 Yeah. Oh, yeah. So was that the early 1800s? When was Lincoln killed? 1865. 1865? So somewhere after that, or somewhere before that, rather, they invented photography. Yeah. So essentially, give or take 10 years, 200 years ago. That's really
Starting point is 01:35:33 amazing. Oh yeah, if you stop and think about The Revenant from the vantage of 200 years ago. We couldn't figure out how to get sound in our movies until, what, 100 years ago? That thought right there just actually freaked me out. What thought? The 200-year thought. Like how little 200 years is, in the Mongol days. And now, 200 years ago, they didn't even have cameras. And now they do.
Starting point is 01:35:55 And now they have this insane ability to recreate things like Game of Thrones wolves and dragons. And she's riding around on a dragon. I'm like, yeah, it looks like she's on a dragon. It doesn't look like old-school King Kong movies. You know, you ever try to watch those? Yeah, they're awesome. There's pleasure in that. They are awesome. They're so awesome to just get into for fun. But as far as, like, visual effects, what they can do now, and the idea that it's all been done over 200 years, is just spectacular. Not just capturing the image, but then recreating an artificial version and projecting it, which is, you know, a thousand times more difficult. But there's another feature here of the compounding power of knowledge and technology.
Starting point is 01:36:51 There are certain gains that aren't truly incremental, where you go from just fundamentally not being able to do anything in a domain, and then all of a sudden the domain opens up totally. So, you know, like flight is an example, right? So for the longest time, people couldn't fly. And it was obvious that, you know, you can't fly: you're heavier than air, you don't have feathers, and there's no way to flap your arms fast enough. We're never going to fly, right? And then at a certain point, flight is possible, and it opens this whole domain of innovation.
Starting point is 01:37:36 But the difference between not being able to fly and flying: there's no progress you can make on the ground, that doesn't avail itself of the principles of flight as we now know them, that's going to get you closer. You can't jump a little bit higher. It doesn't matter what you do with your
Starting point is 01:37:57 shoes. So there are kind of fundamental gains that open up whole domains. DNA sequencing is a more recent example, where understanding and having access to the genome takes you from basically making a good choice in a wife or husband to being able to just create a new species in a test tube if you wanted to. And that's that kind of compounding power of understanding the way things work. I think we're at the beginning of a process that could look very, very strange very, very quickly.
Starting point is 01:38:51 And, you know, I think, obviously both in good and bad ways, but I don't think there's any brake to pull on this train. Knowledge and intelligence are the most valuable things we have, right? So we're going to grab more insofar as we possibly can, as quickly as we can. And the moments of us deciding not to know things and not to learn how to do things, I mean, those are so few and far between as to be almost impossible to reference, right? I mean, there are moments where people try to pull the brakes and say, they hold a conference
Starting point is 01:39:31 and they say, you know, should we be doing any of this? But then, you know, China does it or threatens to do it. And we wind up finding some way to do it that we consider ethical. So there are things like germline tinkering that we, as far as I know, don't do, and have decided for good reason we're not doing. But is that going to stop people from doing this? I don't think there's any way. They're more worried about actual real diseases than they
Starting point is 01:40:06 are man-made diseases. When we went to the CDC. Is that Disease Control? CDC. In Galveston. I guess that's what it is. I guess that's the name of the organization. But it's a building where they house some of the most horrendous viruses and diseases known to man. Like they had anthrax in there. Yeah, well, the CDC has a lot of that, yeah. Yeah, and they have these crazy walls, like thick, thick walls, and vacuums in the ceilings, and everyone's wearing suits. And they wanted us to get in the suits, and I went, fuck you. There's no way I'm going in that room. You did this for one of your shows?
Starting point is 01:40:40 Yeah, it was for a TV show with Duncan Trussell, the Questions Everything show. Right. Because we were talking about weaponized diseases. And the CDC was like, forget all that. Like, the real diseases that are constantly morphing, we have to stay on top of. Like, that's what we should be terrified of. Actual real diseases. Like, no one's shown any ability to create this stuff that's more fucked up than what we already have. But weaponized anthrax and things along those lines, like these Russian guys we talked to, they were
Starting point is 01:41:11 talking about how they had vats of this stuff. They had all kinds of crazy diseases that they had created, just in case we had gotten into some insane mutually assured destruction, you know, disease-spreading thing. Like they were down for that. They were like, well, we have to be prepared in case the United States does that. Whoa. But what they're concerned with, the Center for Disease Control guys, they were concerned with things like Ebola, things morphing, things becoming airborne, natural things, new strains
Starting point is 01:41:42 of the flu that become impossible to treat. MRSA. MRSA. MRSA is a terrifying one. MRSA is one that has a lot of people scared, a lot of doctors scared. It's a medication-resistant staph infection that kills people. I mean, it can absolutely kill people if you don't jump on it quick and take the most potent antibiotics we have. And even then, it takes a long time. Yeah.
Starting point is 01:42:04 Yeah. Well, I actually just tweeted this recently. I think I said, would some billionaire, would some, you know, 0.1 percenter, develop some new antibiotics? Because clearly the government and the market can't figure out how to do it. And it really is falling through the cracks in the government-market paradigm, where it's like either the government
Starting point is 01:42:29 will do it or the market will do it, but neither are doing it. Because the market can't generate the rationale for developing antibiotics, because it's so costly. And you take them, you know, with any luck, you take them once every 10 years for 10 days, and that's it. I mean, that's not like Viagra or an antidepressant or any drug that you're going to take regularly for the rest of your life. And so there's no real market incentive to do it, or at least not enough of one, to spend a billion dollars developing an antibiotic. And the government apparently is not doing it. And we're running out of antibiotics. This has been in the news a lot recently, but we're close to being in a world where it's as though we don't have antibiotics. We're a superbug away from being in that world. And you're freaking me out, Sam Harris. I don't like to get freaked out.
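The incentive gap being described is easy to see with a back-of-the-envelope comparison. Every figure in this minimal sketch is a hypothetical assumption, chosen only to show the shape of the problem: a drug taken for 10 days once a decade versus one taken daily.

```python
# Back-of-the-envelope sketch of the antibiotic incentive gap.
# Every figure is a hypothetical assumption chosen for illustration,
# not real pricing or sales data.

PRICE_PER_DAY = 5.0  # assumed revenue per patient per treatment day

# An antibiotic: with luck, one 10-day course every 10 years.
antibiotic_days_per_decade = 10

# A chronic drug (an antidepressant, say): taken daily, indefinitely.
chronic_days_per_decade = 365 * 10

antibiotic_revenue = antibiotic_days_per_decade * PRICE_PER_DAY
chronic_revenue = chronic_days_per_decade * PRICE_PER_DAY

print(f"antibiotic revenue per patient per decade:   ${antibiotic_revenue:,.0f}")
print(f"chronic-drug revenue per patient per decade: ${chronic_revenue:,.0f}")
print(f"gap: {chronic_revenue / antibiotic_revenue:.0f}x per patient")

# Set against a development cost on the order of $1B, that ~365x
# per-patient gap is why the market on its own underfunds antibiotics.
```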
Starting point is 01:43:18 Superbugs are the big concern, right? Do you think about that in jujitsu? Yes, I've gotten staph. My friend Ari had it, and we were playing pool, and he was limping. And I go, what's going on with your leg, man? He goes, I got a spider bite. I go, let me see.
Starting point is 01:43:43 He pulls up his pants. That's staph. You've got to get to a doctor. And he hasn't even done jujitsu in years. And I think he got staph again. And I think it's one of those things where, I guess, everyone has it on their body, and when you get an infection, then it spreads and grows. And apparently it can be a recurring thing. So people who get it, particularly MRSA,
Starting point is 01:44:09 apparently they can get it again, and it can get pretty bad. There was that one fighter who just died. I don't think related to that, but he had these... Oh, Kevin Randleman. Kevin Randleman, yeah, yeah, yeah. That was MRSA, right? A hundred percent, yeah. He was in the hospital. Well, it was staph.
Starting point is 01:44:25 I don't know if it was MRSA. It could have been just staph that he ignored for long periods of time, but it was a hundred percent staph. Yeah. And he died. I don't know what exactly was the cause of his death, but I can't think that that helped him any. I mean, it had gotten so bad that it had eaten through his body. If you're interested in this,
Starting point is 01:44:51 for people online that are listening, Google Kevin Randleman staph infection. And these are horrific photos that look like something took a bite out of him. Like under his armpit. I mean, like a fist-sized chunk of meat was missing. Yeah, it's really, really scary stuff. So yeah, skin infection. There it is right there. You could look deep in and see his muscle tissue. Yeah, that is hardcore.
Starting point is 01:45:13 No exaggeration. It's like a baseball sized hole. And he's got two of them. He's got another one lower down on his back where it's just eating its way through his body. But that was years before he died, right? Or some years. Yeah, it was years before he died.
Starting point is 01:45:26 But who knows, like, how devastating that might have been. Like, that's a really, really bad infection. And I think that once your body's that compromised, I mean, you're, like, really close to death. I mean, he was a really tough, strong, healthy guy. He was a super athlete. Oh, yeah. So I'm assuming his body probably fought it off pretty well.
Starting point is 01:45:48 So I'm assuming his body probably fought it off pretty well. So it's probably one of the reasons why he let it go so long. Maybe he didn't even understand how dangerous it was. Or who knows? Maybe it just jumped on him really quick. My dad's girlfriend just got it on her face, and she was in the hospital for two weeks, and they were afraid it was going to spread to her brain, and it almost did.
Starting point is 01:46:04 And she's not 100% out of the woods yet, but she's back home now. Jesus. She's got a little scratch on her face, and it spread into her cheek, and then from her cheek, she's got a little red swelling, and then she couldn't see, and then she had to go in the hospital. Like, you got staph. Oh, boy. Chill out.
Starting point is 01:46:20 Like, heavy antibiotics right away. Yeah. This is one area that worries me. I mean, there are the bad things we do, and obviously there's a lot to be worried about there. You know, the stupid wars and the things where it's just obvious that, you know, if we could stop creating needless pain for ourselves, or needless conflict, that would lead to a much nicer life. But then there are the good things we neglect to do, based on the fact that we don't have a system where the incentives are aligned to just make it easy or truly compelling to do them.
Starting point is 01:47:05 And the idea that we are not producing the next generation of antibiotics as quickly as possible is just unthinkable to me. And yet we're simply hamstrung by the fact that we have a political and economic system where there's no incentive. We don't want to raise taxes. We're so overcommitted in the ways we spend money, the public money, and we're so short-sighted that even if we suddenly saved money in one area, it's not like we would immediately direct it to this life-saving, necessary work, right? And the market can't get on top of this. So it really would be like something that the Bill and Melinda Gates Foundation could do.
Starting point is 01:47:59 And maybe they're actually doing this and I just don't know about it. They're doing a lot of medical work, obviously. But, yeah, we're talking about some billions of dollars to just get this kind of a laser focus on this problem. But it's such an obvious problem. And that's really the only thing that's holding it back? That's what's going on? It's not a research issue? It's just a financial issue? Well, I'm sure the research has to be done because, you know,
Starting point is 01:48:24 if it was totally obvious how to build the next generation of antibiotics, ones that would not be vulnerable to having their efficacy canceled in three years by just the natural selection among the microbes, you know, someone would do it very, very cheaply. So I'll admit that it's probably not easy to do, but it's got to be doable, and it's super important to do. I mean, when you look at just what cesspools hospitals have become: something like 200,000 people a year die in the U.S. based on essentially getting killed by the machinery of the hospital, right? They're getting killed by their doctors and nurses. Some of this is drug— Overdoses?
Starting point is 01:49:23 Or kind of incompetence in dosing, or giving someone the wrong medication, or whatever. 200,000 people? Yeah, 200,000 people a year, right? But a lot of it is just doctors not washing their hands, right? But some of this is also, you know, superbugs, where the burden on hand washing in a hospital is higher and higher because hospitals are just covered in super germs.
Starting point is 01:49:49 Right. So it's like a haunted house. It's like we're trying to fix people, and around the house are a bunch of demons that are trying to kill the people you're trying to fix. I mean, look, obviously it's not, but if you were a person who was inclined to believe things, like back in the day before they figured out microscopes, I mean, what else is that other than a demon? You've got a hospital that's filled with superbugs. Where else are they? There's nowhere else.
Starting point is 01:50:15 They're just in the hospitals. Yeah, and gyms. And some MMA gyms where they're getting them probably from the hospitals. But, I mean, it's like a haunted house. It's like a haunted house that's trying to kill the people that live in the house. It is. Well, and it's also just ironic. I mean, you go to the hospital to save your life, right?
Starting point is 01:50:35 You go when you are, by definition, most vulnerable. And you are laying your body open to the intrusions of the place, because they have to get into your body to help you, right? And yet that is the very mechanism whereby you're getting all this misery and death imposed on you. And it is as simple as hand-washing, though, in many of these cases, right? It's just doctors and nurses not washing their hands. That is insane. It's so insane to think that that is a gigantic issue,
Starting point is 01:51:14 that we have these bugs that try to get into your body and kill you. So there's one area, I don't know if you ever had the bad luck to be associated with a NICU, a neonatal ICU. But our first daughter, who's totally fine, was born five weeks early and had to be in the NICU for a week. And there are people who are in the NICU for months, their baby born at 23 weeks or so. And it's just totally harrowing. But they're also just these incredibly compassionate, amazing places where doctors and nurses are total heroes. But that's a space where you see the hand-washing protocol exactly as it needs to be. People there have finally understood that if you're going to go into the NICU and see your baby, or be around anyone else's baby,
Starting point is 01:52:12 you are going to wash your hands in as complete a way as 21st-century science understands how to do that. And it's, you know, it's almost like the decontamination zone of Silkwood, or a nuclear reactor. Wow. Where they get hosed down. Yeah. But it's hand washing. The fact that we can't even do that perfectly is pretty impressive. Now, is it a fact that MRSA was created by medications? Or is that a belief? Or has that been proven, that it was created by a resistance to medications that got stronger?
Starting point is 01:53:09 Well, it's natural selection. Bacteria are mutating all the time, and as a matter of happenstance, they are producing changes in their genome that leave them no longer vulnerable to antibiotic X, right? Whether it's methicillin or any of its related antibiotics. And so this is what antibiotic resistance is. These bacteria, their genomes mutate, and unfortunately with bacteria, they also can swap genes laterally across bacterial species. So it's not like only their descendants in that line can inherit these genetic changes. They can transfer genetic changes across bacteria. So it just optimizes the process. And again, this is all blind. It's not like bacteria want to become drug-resistant,
Starting point is 01:53:55 but some percentage of them, in some generation, will become drug-resistant. And then, in the presence of the drug, they will be selected for. If you keep bombarding people with penicillin, you will be selecting for the bacteria that isn't sensitive to penicillin in those people. And so the overuse of antibiotics, and the overuse of antibiotics in our food chain, is also part of this picture, right? I don't know what the percentage is, but it's certainly more antibiotic use in cattle and pigs than in people. And the same evolutionary principles are happening there too.
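The dynamic being described, blind mutation plus selection under drug pressure, is simple enough to sketch in a few lines of Python. Every rate below is an invented, illustrative assumption, and this toy deliberately ignores the lateral gene transfer mentioned above; it's a sketch of the selection logic, not real microbiology.

```python
# Toy model of selection for antibiotic resistance. All rates are
# assumed values for illustration; the population is treated
# deterministically and horizontal gene transfer is ignored.

MUTATION_RATE = 1e-7  # chance a dividing sensitive cell yields a resistant one (assumed)
DRUG_KILL = 0.999     # fraction of sensitive cells the drug kills per generation (assumed)
GROWTH = 2.0          # fold growth per generation for surviving cells

sensitive, resistant = 1e9, 0.0

for gen in range(1, 9):
    # Blind mutation during reproduction: a sliver of sensitive offspring
    # happen to carry a resistance mutation. Nothing "wants" anything here.
    births = sensitive * (GROWTH - 1)
    new_resistant = births * MUTATION_RATE
    sensitive += births - new_resistant
    resistant = resistant * GROWTH + new_resistant

    # The antibiotic is the selective filter: it culls sensitive cells,
    # leaving the resistant lineage to inherit the niche.
    sensitive *= (1 - DRUG_KILL)

    frac = resistant / (sensitive + resistant)
    print(f"generation {gen}: resistant fraction = {frac:.4f}")
```

Under these made-up numbers, the resistant fraction goes from effectively zero to nearly 100% within a handful of generations of drug exposure, which is the "selecting for the bacteria that isn't sensitive" point in miniature.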
Starting point is 01:54:45 So you don't know what we're doing to ourselves. I mean, it can't be good to be using antibiotics everywhere in agriculture and then kind of waiting to see what happens. Well, we know for a fact that we get diseases. I mean, swine flu, avian flu. We have xenoviruses, yeah. Yeah, and a lot of these are coming out of these facilities where they're processing cattle, processing pigs.
Starting point is 01:55:12 I mean, that's another element to it. So it's just that most infectious disease over the ages has been born of proximity to animals, and that's the result of agriculture. So the fact that you have people in bird markets in China who, you know, are dealing with chickens and ducks endlessly, in confinement, and then you've got wild birds flying overhead, dropping their droppings into that space.
Starting point is 01:55:46 And you have viruses that jump species, from birds to pigs and back again. And some of this stuff only stays in those animals and doesn't become active in people. But again, evolution is just this endless lottery wheel, where you've just got change and change and change upon change. And something makes the jump, in this case between species, and thrives or not based on the opportunities to thrive. So you have something that becomes airborne, right? You have a virus that is absolutely deadly in people, but isn't airborne and is difficult to contract, right? Well, then it's fairly well-behaved. It could be scary, but it's not going to become a global pandemic. But then suddenly you get a mutation on that virus or that bacteria that allows it to be aspirated, to become airborne in a cough, and inhaled, and spread that way. Well, then you have
Starting point is 01:56:58 this, you know, the possibility of a pandemic. And also the time course of an illness is relevant. So if you have something which kills you very quickly and horribly, well, then that's the kind of thing that is going to be harder to spread, because people become suddenly so sick. They're not getting on airplanes. They're not going to conferences. They're not starting new relationships. They're in bed, dying, right? But if you had something with a time course where you felt great for a month, and yet you're infectious, and then it kills you, well, then its spread is only limited by the damage you can do in that month of gallivanting around, right? And again, these mutations are just happening spontaneously. So it really is a matter of good and bad luck.
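That trade-off, how the time course of an illness caps its own spread, can be made concrete with a toy calculation. The contact rate and transmission probability below are assumed numbers, purely for illustration, not epidemiological data.

```python
# Crude illustration of how an illness's time course limits its spread.
# Both parameters are invented for illustration.

CONTACTS_PER_DAY = 10  # people a healthy-feeling carrier encounters (assumed)
P_TRANSMIT = 0.02      # chance each encounter passes the infection on (assumed)

def expected_new_cases(days_infectious_while_active: float) -> float:
    """Expected infections from one carrier while they still feel well
    enough to be out and about; once bedridden, contacts drop to zero
    in this toy model."""
    return days_infectious_while_active * CONTACTS_PER_DAY * P_TRANSMIT

# Kills quickly and horribly: the carrier is in bed within ~2 days.
print(f"fast, incapacitating illness: {expected_new_cases(2):.1f} new cases per carrier")

# Feels great for a month while infectious, then kills.
print(f"month of silent spread:       {expected_new_cases(30):.1f} new cases per carrier")

# Sustained above 1.0 new case per carrier, an outbreak can snowball;
# below 1.0, it burns itself out.
```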
Starting point is 01:57:58 And they all have one function. They kill things. They devour. They overcome. They overcome bodies with their life form. They spread. There are other functions. Obviously, this is all blind
Starting point is 01:58:13 and there's no intention or purpose behind it, but there are viruses and other infectious diseases that produce functions which are behaviorally relevant. So there's a spread. I forgot.
Starting point is 01:58:33 Is it toxoplasmosis that makes mice less fearful in the presence of cats? Yeah. So there are behavioral changes, right? It makes them attracted to urine. Right. They get erect around cat urine. Yeah. So it leads them to their demise, so it spreads to cats.
Starting point is 01:58:52 Yeah. And there are theories that various infectious diseases change human behavior. That depression is the result of an infectious illness that we're not aware of. There's a lot that could be going wrong with us that we haven't attributed to viruses and bacteria, but which in fact is, at bottom, a matter of viruses. Actually, Alzheimer's, there was recently a report that suggested that Alzheimer's is the result of the brain's immune response to infection, infectious illness.
Starting point is 01:59:38 I think it was bacterial. This was just in the last week or so, that the plaques associated with Alzheimer's that you see throughout the brain might in fact be the remnants of an immune response to something having invaded across the blood-brain barrier. So if Alzheimer's is the result of infectious disease, that is... That's insane. Score that as a major problem that would be nice to solve
Starting point is 02:00:15 with the right antibiotic regime. Could you imagine? I mean, that would have been fascinating if that existed during Reagan's time. They could just clear him up. Because, you remember, when someone publicly starts to go like that, and it's a guy like Ronald Reagan, who was an actor and president, and you see him starting to lose his grip on his memory, you hear all the
Starting point is 02:00:38 reports about it, it's particularly disturbing because the guy exhibiting it is the head guy, you know. To think that that was just a disease. That's crazy. Yeah. Yeah. I don't remember when that became at all obvious. I know people were talking about the possibility that he was not all there. And I don't remember it happening actually during his presidency, but we were both young at that point.
Starting point is 02:01:16 I do remember comedians doing jokes about it. Yeah. That's like one of my main points of reference. They'd have this like weird sort of out-of-it grandpa type character that they would do. Right. And that was towards his second run, you know. Right. I don't know if that was years before they were referencing.
Starting point is 02:01:37 Like when was it all going bad for him? Yeah. I mean, neurologists can spot neurological illness really well. I mean, they're walking around seeing neurological illness all over the place. I'm sure there were neurologists who were talking about him long before anyone else was in those terms. Well, there was this old joke that Jimmy Tingle did. Jimmy Tingle is a hilarious political comedian from Boston. He had this joke about Ronald Reagan's trial where he couldn't remember if he sold arms to Iran.
Starting point is 02:02:11 He goes, Mr. President, next time you sell arms to Iran, jot it down, make a note, put it on the refrigerator. But it was just, that was his excuse when he was on trial. And everybody thought that this is bullshit. This is his defense. He doesn't remember. And then it started coming out that he, and then the conspiracy theory was he was always fine. He was like, what's that guy's name? Vincent the Chin Gigante, who would walk around in a bathrobe and pretend to be crazy.
Starting point is 02:02:43 Remember that guy, the mob guy? No. There's a famous mob guy who was running the mob but pretended to be a crazy old man. So he would walk around with people. I forget how they busted him, but he had it nailed. He'd walk around in the bathrobe and talk to himself, and he would put on an act, like, go out on the street and act like a crazy person, and then he would go on walks with, like, these capos and tell them, kill this fucking guy, and get me a million bucks, and all that kind of crazy shit. But all the while, somehow or another, they caught him. I don't remember exactly how they caught him, but everyone knew, like, they kind of knew he was doing that. And so some people were thinking that's
Starting point is 02:03:18 what Reagan was doing. He was getting to, like, I don't remember what I did with Iran. Matter of fact, I don't remember shit. Okay? I think I got a disease. I can't remember anything. And just started pretending he just was out of it. Both could be true. I mean, he clearly did have Alzheimer's in the end, but he could have also been lying. What if he didn't? What if it was a big conspiracy? That would be a great movie.
Starting point is 02:03:38 Like, how good of an actor was Ronald Reagan? He was such a good actor. Through the latter days of his years, he avoided interviews by pretending to have Alzheimer's. He felt like that was the only way out. So just him and Nancy, they prepped their lines. And when, you know, you go out there to the public, he just started acting like he couldn't remember anything. Alzheimer's, unfortunately, is going to be a bigger and bigger story.
Starting point is 02:03:59 It's really the baby boomer moment that's coming. And this is something we need a full-court press on too. Yeah. And if you can find that that's actually a disease and you can cure that disease, that's insane. An infectious disease. An infectious disease. Well, have you ever seen the Sapolsky stuff on the toxoplasma? Robert Sapolsky, the guy from Stanford.
Starting point is 02:04:22 He's the guy that's, like, one of the forefront researchers, and one of the guys who's really vocal about it. Stress. Yeah, they were also talking about a direct relationship between motorcycle crashes and people testing positive for toxoplasmosis. Oh, I didn't know that. And they felt that it might have either hindered reaction time or loosened inhibitions, the same way it sort of triggers these mice to go near cats. Yeah, yeah, yeah. That's what I was referencing before. So I think it's a lot of speculation, but there's a strong correlation apparently to motorcycle crashes. And that is, I guess, one of his professors had told him that when he was younger, and he had remembered it while they were, uh, dealing with some
Starting point is 02:05:10 guy who came into the ER, victim of a motorcycle crash. But it kind of makes sense. Yeah. Well, you know, the underlying biology of, you know, risk avoidance and risk-seeking, that's fairly well conserved in mammals. There's a reason why we do most of our research in things like mice. I mean, it's not a totally analogous brain, but mice are similar enough to us that doing research on dopamine receptors in mice allows us to extrapolate to humans. And yeah, so it wouldn't be a surprise that it's having an effect
Starting point is 02:06:05 on people. Was it Steve? I remember I was supposed to bring this up to you before, um, when you were talking about, um, plants, and plants having some sort of a consciousness. Was it Steven Pinker? See if you could find this, who, uh, gave a speech where he talked about how some plants, you can actually use sedatives on them, and that some of them actually produce certain neurochemicals like dopamine, if that makes any sense. I think it's Steven Pinker. No, it didn't make any sense, right? Well, it would surprise me if Pinker said anything about this, but I haven't heard anything about it. I think it was a speech he was giving about something, but I think it's controversial.
Starting point is 02:06:38 That's one of the reasons why I wanted to bring it up to you. Because I had never heard that before, that a plant could produce, it was either dopamine or serotonin, and that somehow or another sedatives would be effective on a plant. That doesn't even make any sense. I don't know what effective means. I don't know either. That's why I waited for you. The plant is pretty well sedated as far as I'm concerned. I remember as soon as I saw this, I'm like, must get this to Sam Harris.
Starting point is 02:07:06 Please decipher. But actually, almost everything I remember about, or that I learned about, plant biology, I've forgotten. So I don't know. But I'm pretty sure that the plant does not have a brain. I see stuff with Steven Pinker and plants and consciousness, but nothing with sedatives specifically coming up with that too. Maybe it was two, maybe I combined two different articles.
Starting point is 02:07:32 Maybe I did. The Pinker one I know. What did Pinker say about plants and consciousness? Oh, he was just talking about the surprising amount of calculations. I think that was one of the, you have to read the entire piece, but I think it was just, they were just highlighting what we know so far. I've reached the limits
Starting point is 02:07:48 of a human bladder. Oh, how dare you? This is early. You're too healthy, man. Coffee and water. So what have you found on it, Jamie? Nothing really. To be honest with you, nothing specific about it but I did have something earlier that was kind of interesting. I'll show
Starting point is 02:08:04 it to you here. When you guys were talking about AI, I pulled up something on Minority Report, and it pulled me to this article, which, uh, Microsoft has an app that can... it's actually developed by Hitachi, it's called predictive crime analytics, um, they can predict crimes with up to 91% accuracy. And then it's also already being enacted in Maryland and Pennsylvania. As of 2013, they have crime prediction software that can find out if an inmate that's going to be released is going to commit another crime. And so they're using that to follow them. And there's some civil rights people that are saying, like, you can't do that. Obviously that's not good. Whoa, hold on, scroll down just a little bit. What is it? Professor Burke says his algorithm could be used to help set bail amounts and also decide sentences in the future. And then
Starting point is 02:08:56 I got down to this part. In Chicago, they're doing something, and they have, it's called a heat list. In Chicago, they have 400 residents that are listed as potential victims and subjects with the greatest propensity for violence. And they go and knock on their door and tell them that they're being watched. And I've clicked on this thing, and it's an actual Chicago directive from the police.org. It's a pilot program about going and telling people that they're being watched, or someone might be after you, or some shit like that. It's really crazy. I didn't want to interrupt you guys to tell you about this. Custom notification under the Violence Reduction Initiative in partnership with the John Jay College of Criminal Justice Community Team.
Starting point is 02:09:37 We will serve as outreach partners within the social service and community partners. Show him this. This is crazy. While we were doing this, Microsoft has these programs. Scroll up to the top,
Starting point is 02:09:52 Jamie. They revealed an application they think that can predict crimes in the future and decide if inmates get parole. It uses all sorts of data, public data from Twitter, closed circuit camera feeds, public Wi-Fi signals.
Starting point is 02:10:09 There's like in L.A., there's all sorts of microphones all over the place listening for gunshots and whatnot. There's those new light poles I told you about that are adding 4G connectivity to the city. Yeah. That obviously can be added to this probably if they needed it. I'm sure. Yeah, that doesn't surprise me at all. I think that is a, all of that's coming.
Starting point is 02:10:29 I mean, just look at just consumer behavior. I mean, just look at how much someone can understand about you based on your zip code and the last three Netflix movies you watched to the end, and just a few other data points, right? And then we basically know, we can predict, you know, with some horrendous accuracy, what you're going to like, given, you know, the menu of options. I mean, we can advertise to you with immense precision.
Starting point is 02:11:18 Facebook, obviously, is at the forefront of this. But when you add everything else that's coming, the more intrusive technology of the sort we've been talking about, it's, you know, none of that's surprising, right? Nothing's surprising anymore. If you had read that 30 years ago, it would have looked like an article in The Onion, right? You'd be like, what, they're gonna have... this is Judge Dredd, this is crazy. Well, here's the Onion version, which did just happen, right? I think this was Microsoft, where they put out what they were calling an AI bot on Twitter. There was an AI-run Twitter account that just became a Hitler-loving sexbot, because it was being tuned up by its interaction with people trolling it on Twitter. Did you see the guy
Starting point is 02:11:50 who got arrested for falling asleep inside his Tesla when it was on auto drive? No, no, no. He got busted. I don't know if he got arrested. He got busted, though. They have camera photos of him rolling through an intersection completely unconscious on his way to work.
Starting point is 02:12:05 Look at this guy. Look. Look at this. The car's driving him, and he's asleep while it's on autopilot. This is insanity, man. This is going to be a great moment to see, and I'm sure this is coming, where once self-driving cars become just obviously the only truly safe alternative, then you'll be arrested for the opposite violation. You'll be arrested if your hands are on the wheel, if you are driving a car that is ape-piloted as opposed to robot-driven. And that's, I mean, we've got 30,000 people dying every year based on ape driving.
Starting point is 02:12:50 So the moment we crack that, which we're very close to doing, it's just going to seem, I mean, you've got your old muscle cars or whatever you're into. It's going to be like, that's going to be the equivalent of like, you know, celebratory gunfire. You're going to reserve the right to like at your wedding, you know, shoot your AR-15 into the air and not care where the bullet lands. Yeah. I'm going to have to get a license to operate them. You're going to have to move out of a big city.
Starting point is 02:13:18 Take them somewhere. You're going to take them to the hills and unload them out of the back of a truck. Right. Driving for a very short distance. That's right. Probably monitor how much. Put a governor on them that they can go five miles an hour. Yeah.
Starting point is 02:13:36 Yeah, I mean, if you look at it in terms of safety, for sure. And it seems to be the thing that's the recurring theme, right? You give up your privacy for safety. You give up your ability to drive a car wherever you want, whenever you want, however you want it. You give that up too. You give that up for safety. And people are really reluctant to give up fun shit, like lying and driving their car fast. Like those two things. People are going to have a hard time with, you know, like, actually getting into their mind, seeing their actual mind, and being able to do that so we can know without a doubt whether or not someone's guilty or innocent.
Starting point is 02:14:09 But my question to you is, if you could get inside someone's mind, and it was like that really super suggestible guy that you were talking about earlier, that just confessed all the horrific demonic possession stuff and eating babies, what if it's like that, getting into that guy's mind? What if you can't tell? Well, the case that, uh, worries me, and this is perhaps an inept segue to, uh, politics, but, I mean, we're in people's minds. You get someone talking long enough, you know their minds. I mean, they can only conceal what they're about so well, right? Maybe you can, because you're a super smart wizard type dude. But Jamie, I don't know about, I don't know if he's a mind reader. But the question is, will people care?
Starting point is 02:15:04 You know, it's like if you have, we don't even need a lie detector if you have someone who's openly lying. Right. Who just gets caught in lies again and again. You can see it too. You feel it, right? But people don't seem to care, right? At least in a political process. I mean, I'm thinking in this case of Trump, where you have someone who is in some cases lying or just changing his mind in such an incoherent way that it's the functional equivalent of lying. I mean, it's someone who becomes totally unpredictable.
Starting point is 02:15:38 He has a stance that is A on Tuesday and it's B on Wednesday. And when the discrepancy is pointed out, he tells you to go fuck yourself, right? So there is no accountability to his own states of consciousness that he's going to be held to. And the people who love him, I don't know so many of these people personally, but based on social media and seeing the few articles where someone has explained why they love Trump, people view this sort of both dishonesty and theatrical hucksterism, a person who's pretending to be many things that he probably isn't, they see it as a new kind of authenticity, right? Like this guy, he's just letting it rip. He doesn't care what is true. He doesn't care what your expectations for coherence are. He's just going to tell you to fuck yourself every which way. And this is the new way of being honest, right? This is the new form of integrity. It's amazing to watch. I'm someone who actually, I remember on my own podcast, I think I was talking to Paul Bloom, this Yale psychologist who's great. And we got into
Starting point is 02:17:05 politics, and this is at least a year ago. But at that point, I said, there's no way we're going to be talking about Trump in a year. I mean, this is going to completely flame out. And I was, this is a, I don't tend to make predictions. But this was a clear moment that I remember of making a prediction, which is now obviously false. I just couldn't imagine that people were going to find this compelling enough for him to be on the cusp of getting elected. It is terrifying. Have you talked this issue to death on your podcast? I guess we kind of have. Well, I think this is how everybody feels. Everybody feels like you're supposed to be, like, with their person, whether it's Bernie, or whether
Starting point is 02:17:45 it's for Hillary, or whether you're a Trump supporter, whatever it is, like, you have to be all in. But, like, if you look at the choices that we're given, none of these could really be described as ideal. No, no. Like, Hillary Clinton, you could want a woman in the White House, and you want to show everyone that a woman can do that job just as well as a man, and she's got the most experience. She certainly has the most experience dealing with foreign governments. And she certainly has the most experience in politics. You can make an easy...
Starting point is 02:18:12 But she's also involved in two criminal investigations. She had a server in her bathroom. There's all this squirrely stuff going on. She's terrible in many ways. She was anti-gay marriage until, like, 2013, and then wouldn't admit the change of mind either. Yeah, she's a politician. She's probably a brilliant woman, but she's also set in her ways, and a politician, a politician to the end. And part of being a politician is being a fucking huckster. You got to be able to get those people to see your side.
Starting point is 02:18:46 And the way you do that is to talk like this. You can't talk like a normal person. She needs a speech coach. They all do. He's terrible too. Trump's not even good at it. And he kicks their asses. Well, no, but her voice, she has a kind of, to use the sexist trope, she has a shrill voice. How dare you? When you get her in front of a mic and there's a crowd and she thinks she's talking over the crowd which she doesn't have to do because she's in front of a mic
Starting point is 02:19:11 the sound you get is just, it's she's yelling when she doesn't need to yell and someone has to teach her how to dial that back. What you just did is called mansplaining and it's a terrible thing. I'm explaining to the men in her crew who should talk some sense into her. But she's a bad candidate, right? I have no doubt that she's
Starting point is 02:19:35 very smart and she's well-informed and she's qualified and she is absolutely who I will vote for given the choices. But, you know, I totally understand people's reservations with her. She's a liar. She's an opportunist. She is just almost preternaturally inauthentic. I mean, she's just like, she will just focus-group every third sentence and you feel that from her, right? And this is all true. And yet I also believe the people who say, I've never met her, but people who know her and have met her say that behind closed doors, one-on-one, she's incredibly impressive and great. But that doesn't translate into her candidacy. She probably thinks she has to do it old school, you know?
Starting point is 02:20:18 I mean, the way she's doing it. But she, I mean, the thing is, she's, when you look at the, what worries me is, I went out on Facebook the other day, and I've said very little about this, but I've made enough noises of the sort that I just made that people understand that I'm for Clinton, despite all my reservations about her. And what I got on my own Facebook page, which you have to assume is filtered by the people who are following me on Facebook and already like me in some sense, just like a thousand comments of pure pain. I mean, no one loves Hillary.
Starting point is 02:20:55 No one said, oh, thank God someone smart is for Hillary. It was all just Bernie people and Trump people flaming me for the most tepid possible endorsement of Clinton. All I said was, listen, I understand Clinton's a liar and she's an opportunist and I completely get your reservations about her, but at least she's a grown-up, right? And she's going to be the candidate. It's not going to be Sanders. So now is the moment to put your political idealism behind you if you're a Sanders person
Starting point is 02:21:32 and recognize that there is a vast difference between Clinton and Trump. And no, she's not going to change the system, but she's also not going to run civilization off a cliff, right? And I forget how I said it on Facebook, but it really was a lesser-of-two-evils argument. And it's amazing to see how energized and passionate people are in defense of Trump and Sanders. And there's almost
Starting point is 02:21:59 none of that for Clinton. It's like people are just sheepishly saying, they're just divulging, that they will vote for Clinton. Maybe somewhere that I haven't noticed, someone absolutely loves Clinton. But she does not have her defenders the way these guys do. Have you seen the man-enough-to-vote-for-her campaign? No, no. It's with, like, hipster dudes with tattoos and beards that are going to vote for Hillary. No. I hope it's fake.
Starting point is 02:22:25 Because it's so brilliant. I hope it's not real. Is it fake? Thank God. It's so good, though. Because it's not that fake.
Starting point is 02:22:37 It's pretty good. Like, you could almost see... So, wait a minute. I mean, this is not bad for her, right? No, no, no, it's not bad for her. It's just funny that someone would, like, make a joke political ad, right? You know, that you have to be man enough to vote for Hillary. Like, there's guys out there that would buy that. They'd be like, am I not man enough, bro? They'd do it. It's a scary time, because it doesn't seem like anybody that you would want to be president wants to be president. And so we're left with, all right, what do you pick?
Starting point is 02:23:13 It's like as if we're going to play the Super Bowl with three of the shittiest teams we could find. We're just going to go get some drunk high school kids. We're going to get some inmates with club feet. We're just going to throw whoever. We're going to have the worst game ever. And that's what this game is. This is not a good game. This is not a game where you've got, like, uh, a John F. Kennedy versus a Lyndon Johnson. It's not like powerful characters. Trump, I guess, is a really powerful character, but in more ways like a showman character. Yeah.
Starting point is 02:23:46 Like, what he's doing is, like, he's putting on a great show, and he's going to win, probably, because he's putting on such a great show, and people like a great show. I do think I'm now among the people who think we're witnessing something new with Trump. It's not just the same old thing, where the process is so onerous that it's selecting for the kind of narcissist or, you know, thick-skinned person who is willing to submit to the process, and most of the good people just aren't going to put up with it. I mean, yes, there's that too, but there's something. It's a moment among the electorate where there's enough of an anti-establishment mood and vote now. This is happening with Sanders too, where people just want to jam a stick in the wheel of the system just to see what happens. The main gripe against Hillary really is that she's politics as usual. She's not going to change the system, right? People want to
Starting point is 02:25:01 change the system, but they're not really thinking about the implications of radically changing the system. And in the case of Trump, I mean, here is someone who is advertising his lack of qualifications for the office in every way that he can. It's like, I mean, I'm not even bothered by his racism or his misogyny or his demagoguery or his bullying. I mean, all of that, I'm willing to guess, is an act, right? That he's decided that that's somehow pandering to his base. And actually, in truth, he doesn't have a racist bone in his body. So I'm willing to believe that. I don't know why I would think that's plausible, but I have a hunch that he's far more liberal than he
Starting point is 02:25:52 seems and is just pandering. But the thing that can't be true is, there's no way he's actually brilliant and well-informed about all the issues and is saying the things he's saying. He's not pretending to be as uninformed and as incoherent and as irresponsible as he's seeming. Um, 'cause you wouldn't withhold information that would make you a better leader. Well, it's just, yeah. It's just the vacuousness of his speech. He'll say the same thing three times in a row, and it was meaningless the first time, right? He'll say, it's going to be amazing. It's going to be very, very amazing. Trust me, it's going to be so amazing. And he does this with
Starting point is 02:26:39 everything. If you look at the transcripts of his speeches, and the fact that he can't... he has never, I mean, so far as I've seen, he has never once strung together a string of sentences that was even interesting, right? There's never a moment where I say, oh, this guy is smarter and better informed than I realize.
Starting point is 02:27:07 That moment never comes. I keep expecting to see that happen. And it's a little bit like, I have this image of, like, imagine you have an urn, right? And you just keep pulling things out of it. And all you pull out of it is junk, right? You pull chicken bones and broken marbles and gum, and it's still possible that if you root around in that urn long enough, you're going to find the Hope Diamond. I mean, each round that you pull something out really has no logical implication for the next thing you might pull out of the urn.
Starting point is 02:27:40 But minds aren't like that. When I see what this guy says, he does not say anything that a well-informed, intelligent person would say. And ideas are connected, right? It's like you can't fake this stuff. You can't fake being this uninformed, and you can't fake being really well informed. And he's just, I mean, we just look at one policy that he wants, the rounding up of illegal aliens, right? Round up 11 million illegal aliens. Now this gets stated as, yeah, we're going to round them up and send them back to Mexico. And what worries me is no one seems to care that if you just look at the implications of doing this, this one policy claim alone is so impractical and unethical. It's just, what are we talking about here? You're talking, like, your gardener, your housekeeper, the person who works at the car wash, the person who picks the vegetables that you buy in the market is going to get a knock on the door in the middle of the night by the Gestapo and get sent back to Mexico. The vast majority of these people are law-abiding people who are just working at jobs that Americans by and large don't want to do.
Starting point is 02:28:58 And many of them have kids who are American citizens, right? Someone's got kids under the age of 10 who are American citizens, and what, you're going to send that person back to Mexico? And you're going to do this by the hundreds of thousands and millions? It's just that one point alone, held in isolation from all of the other things he said, the crazy things like climate change is a hoax, you know, concocted by the Chinese to destroy our manufacturing base. And, you know, the fact that he likes Putin, I mean, everything else he said, right? This one policy claim alone should be enough to disqualify a person's candidacy. It's so crazy the moment you look at it. And yet no one seems to care. In fact, it's just more energizing to the people who already like him.
Starting point is 02:29:45 I know that he said that he wanted to build a wall, but I didn't know that he said that he wanted to get rid of the illegal aliens. And round them up. And round them up. And do what with them? Send them back to their country. Oh, that is so crazy. That's such a crazy idea, and it's so brutal. The idea that, I mean, it's a subhuman thing.
Starting point is 02:30:04 The only reason why people would come to America is because they felt like it would make their life better. So people take a big risk. There's not an easy way to do it if you're poor. You don't have any qualifications for any unusual job. I mean, you're trying to get across from Mexico. But everybody who does it does it because they want to improve their life. And the idea that one group of people shouldn't be able to do it and one group should
Starting point is 02:30:25 just because they were born on the right side of some strange line that is only a couple hundred years old. But actually, I'll go further in meeting him in the middle. So I think we should be able to defend our borders. I don't have a good argument
Starting point is 02:30:41 for having a porous border that we can't figure out how to defend and we don't know who's coming into the country. Right. So I think building the wall is almost certainly a stupid idea among his many stupid ideas. But I think it would be great to know who's coming in the country and have a purely legal process by which that happened. I mean, ultimately, that's got to be the goal. Right. Right. And we are, you know, we are imperfectly doing that. And so I don't have an argument for open borders or porous borders. But the question is, what do you do with 11 or 12 million people who are already here doing jobs we want them to do that help our society, and the vast majority of them are law-abiding
Starting point is 02:31:26 people who, as you say, are just trying to have better lives, the idea that you're going to break up families and send people back by the millions, and the idea that you're going to devote your law enforcement resources to doing this when you have real terrorism and real crime to deal with, is just pure insanity, right? And also totally unethical. And yet he doesn't get any points docked for this aspiration. I mean, it's just, it's one of the things around which people are rallying. But the climate change thing is also insane and dangerous. Well, he was a birther. Right. Yeah, right.
Starting point is 02:32:06 He was one of the original birthers. He was saying that Obama's birth certificate was bullshit. He was born in Kenya, right? Wasn't he one of those guys? Oh, yeah. He was self-funding that for a while. I would love it if he got into office and just said, Listen, folks, I am nothing like this person I pretended to be to win the presidency.
Starting point is 02:32:26 I just wanted to show you that you can be manipulated, and get it together. And he'd have just punked all of civilization. I'm going to hire some people who actually know how to run things. Well, see, the smart people who are voting for him think, and this is, I think, actually a crazy position, but they think that he is just pandering to the idiots who he needs to pander to, to get into office. So he's not disavowing the white supremacist votes, you know, with the alacrity that you would if you were a decent human being and you found out that David Duke supported you. Because he needs those votes. And he knows that most of the people in his base aren't going to care, and he can just kind of move on in the news cycle.
Starting point is 02:33:16 And he's doing this on all these issues where smart people see that he looks like a buffoon and the people who don't like him are treating him as a comic figure who he can't really believe that stuff. He's too sophisticated to really believe that stuff, so he's just pandering. One is people aren't seeing, if that's true, just how unethical and weird that is. The guy has no compunction about lying and demonizing people. It's like, let's say he thinks that Clinton really isn't
Starting point is 02:33:53 guilty, Bill Clinton isn't really guilty of a rape, right? And now he's calling him a rapist, right? Now, at the time, he was saying he wasn't a rapist, and he's just being defamed, and this is outrageous. He was taking the side of a friend who he invited to his wedding. But now he's calling him a rapist, right, a sexual predator who harmed women's rights more than anyone. So which is true, right? So there's no version of the truth here that makes Trump look at all acceptable as a person. It's like either he knew he was a rapist and was defending him because he was just cozying up to power at that point, right? Didn't care that he's a rapist. Or he's still the guy who thinks he wasn't a rapist, but now, for
Starting point is 02:34:40 purely opportunistic reasons, he's willing to call a guy a rapist who he knows isn't. They're both horrible, right? And it's not like new evidence has come forward in the intervening years that would have changed his mind about what happened in Clinton's presidency. But I think people think that he's got to be much more sophisticated than he is, and that if he got into office, he would just be a totally sober and presidential person. There's just no reason to believe that. I mean, if he thinks climate change is a hoax and that we should pull out of the Paris Accords and we should ramp up coal production and bring back the coal jobs. I mean, this is what he's saying, right? There's no reason to think he doesn't believe this at this point. That's just, you know, a disastrous thing for a president to think.
Starting point is 02:35:37 The only fascinating versions of this that I've been hearing from people that I respect are the idea that he is like the political version of the asteroid that killed the dinosaurs. Like, he's going to come down and smash it, and it's going to be so chaotic that they're going to be forced to reform the system, and people are going to respond in turn. Like the way people are responding against factory farming and more people are going vegan, like that kind of a thing. They're going to see it and they're going to respond in turn. That is such a... So it's like he's going to toss the apple cart up in the air. He's just going to fuck this whole goofy system up, and then we'll be able to rebuild after Trump has dismantled all the different special interest groups and lobbyists and all the people that we really would like to get out of the system. We really don't like the fact that there's such insane amounts of influence
Starting point is 02:36:25 that big corporations and lobbyists have had on the way laws get passed. This might be the way to do it. You have some wild men. Everyone's fired. You're fired. You're fired, Jetson. It's like a character. He's coming in. His hair's plastic. He's all fired up.
Starting point is 02:36:42 He's a billionaire. He made all his own money, sort of. Dad gave him some money, but he turned it into a lot of money. Point being, he doesn't need anybody's money. The truth is, he's probably lying about the amount of money he has too. But he's a baller for sure, though, right? At the very least, he's got to be worth some cash. Yeah, I mean, there could be a big difference between what he's claiming and what is in fact true. But, I mean, there are many pieces here. I mean, people assume that because he's a successful businessman, he must understand the economy, right? Which has no necessary connection there, right? Um,
Starting point is 02:37:13 there's a lot of rich people who are totally confused about economics. Um, and they're, you know, most economists don't have a lot of money, so there's no real connection there. But the – I mean, so what you're describing is a kind of just random – like you're just – let's just smash the window and then see what happens, right? Like I'm going to light a fire to this place and see what happens. And that's – almost any process by which you would change the system is more intelligent than that, you know? And, and it's also not valuing how much harm one bad president could do, right? Like there's no, I think even I, I haven't tested this, but I'm imagining that even Trump supporters would answer this question the way
Starting point is 02:38:04 I would hope, which is, like, if I had a crystal ball that could tell you... it can't tell you who's going to be president, but it tells you how it works out for the next president. So if I look in this crystal ball and it says the next President of the United States is a disaster, it's like the worst president we've ever had, just think of failures of governance and the toxic influence of narcissism and hubris that comes along just, like, once every thousand years, just a disaster. I think you know, even if you're a Trump supporter, which candidate that was. It's like only Trump is likely to screw things up that badly. Clinton is going to be almost perfectly predictable.
Starting point is 02:38:55 She's going to be a politician. She's going to be basically centrist on foreign policy and domestic policy. She's going to be liberal on social issues. She is not going to try to dismantle NATO, or get into a war with North Korea, or, you know, get into an alliance with Putin. I mean, she's not going to do something insane. So, an alliance with Putin? Yeah, he's said basically only favorable things about Putin. They're homies. Yeah.
Starting point is 02:39:30 They're tight. So we should get, hopefully we'll see pictures with both of them on horseback, shirtless. I think Donald's probably going to keep his shirt on. He'll probably keep his shirt on. I don't see him as being the shirtless kind of guy. Yeah. When you just look at the landscape between Bernie and Hillary and him, and, uh, you know, to me it looks like the last gasps of a dying system. It's like... But that's representative
Starting point is 02:40:02 government, the system. But a lot of people are saying things like that, and they're not hearing just how nihilistic that is, if true. Right. Right. It's like, there's so much stuff we have to get right, and the only tool to get it right is having your mind actually understand what's going on in the world and how to manipulate the world in the direction you want it to go. So you have to understand, like, whether or not climate change is true, your beliefs about it have to be representative of that truth, right? So, like, let's say, you know, let's say I'm mistaken, and there is no human-caused climate change, it's not a problem. And every moment spent thinking about it, worrying about it, correcting for it is just a waste of time that's just throwing out, you know, the wealth of the world, right?
Starting point is 02:41:03 That would be a terrible thing, right? So it really matters who's right about that. And the fact that we have a president, or a candidate, who is coming in saying this is all bullshit, you know, in defiance of all of the science, is a bit of a problem. And it's true on every other point. He doesn't know anything about... you know, I guarantee you he doesn't know the difference between Sunni and Shia Islam or which countries are Sunni predominantly and which are Shia
Starting point is 02:41:34 predominantly. And I mean, I'm sure he's going to do, I don't know when he's going to cram for this final exam. I'm sure before one of those debates, he's going to get, you know, someone's going to sit down with him and give him some bullet points he's got to have in his head. But his head is just not in this game. It's never been in this game. It's obvious from everything he says. And that is something you can't say about Clinton, right? For all of her flaws as a person, I don't care how much you hate her as a person, she understands what's going on in the world. Right. And that difference is so enormous. Forget about all the other character flaws of this guy who is just obviously going to, I mean, he's.
Starting point is 02:42:13 But are we attached too much to this idea of one person being the figurehead? Well, it's not a figurehead. Someone has to. Right. Someone is the decider. If we all woke up today, if everybody woke up and there was just no government, there was nothing, we're all just, what happened? I don't know, but we've got to figure out how to run this thing.
Starting point is 02:42:30 We had no previous understanding of government. Do you think anybody would say, we need one dude to just run this whole giant continent filled with 300 million people? Most likely, if we woke up and we had technology like we have today, we had the ability to communicate like we have today with social media and whatever, we would probably say we need to like figure this out amongst each other and find the people that are the most qualified for each one of these positions and start running our government that way. Well, that's what we're attempting to do, but it's just, and I totally agree with you that it is astonishing that out of a nation of 300 million people, these are the choices.
Starting point is 02:43:06 You would think, starting from your zero set point of just now we're going to reboot civilization, you would think that if you had this kind of process, each candidate would be more impressive than the next. I mean, you'd be like, I can't believe that. Each person who came to the podium would be so impressive. Like looking at LeBron James. Oh, yeah. It'd be like the dunk contest for the NBA. It'd be like, oh, my God. Just when you thought you saw the best dunk in your life,
Starting point is 02:43:37 the next guy comes along. Exactly. And it would be that on every topic, right? It'd be like you'd be talking about the science of climate change. You'd be talking about the actual dynamics of the war on terror. So topics that seem to have no relationship where you would have to be amazed that anyone could be an expert in all of them. You would find someone who was an expert, a functional expert in all of them. A Jeopardy winner, dude.
Starting point is 02:44:04 Yeah, yeah. But someone who was also ethically wise, who wasn't obviously an asshole. Right. And who had
Starting point is 02:44:16 a mature relationship to changing his or her mind, right? So like this whole bit about flip-flopping and not, you know, being caught, like someone who could honestly represent changes of mind across a political career, right? I mean, it's nowhere written that it's a good thing to believe today what you believed 20 years ago. In fact, if you do that on every topic, it means
Starting point is 02:44:43 basically you haven't been in dialogue with the world. But there's something... it's so taboo to change your mind that either you have to lie about it or you have to pretend it was always that way. I mean, the system is broken in that respect. But given the choices, you know, when you have a choice between someone who has, for all her flaws, been in the game for long enough to be really well informed and capable of compromise and capable of not just breaking the entire machine, and you have someone who just kind of stepped off the set of his reality TV show and then lied about everything and elbowed his way onto your television set and never left because CNN couldn't figure out
Starting point is 02:45:43 how to give the mic to someone else. It's an amazing situation. Well, he's a product of attention because they realize that there's a heated race, right? The heated race, this guy was really famous. And in a heated race, this guy would say some crazy stuff and so they would tune into him. So everybody had to tune into him. So because of him saying crazy stuff, he accelerated the amount they were talking about him. So they were constantly talking about him and barely talking about other people. But he's created a wormhole in our political process now
Starting point is 02:46:13 where there's nothing so crazy that could disqualify him among the people who like him now. So he can just keep, it's like nuclear bombs of craziness that the press can't ignore. That every time they think, okay, this is the crazy thing he said that's going to harm his candidacy, so let's shine a light on it. It just helps him. You know, he could get on Twitter right now and say, you know who I'd like to fuck? I'd like to fuck Nicki Minaj. And it would work for him. It would work for him. It would work for him.
Starting point is 02:46:46 You would see a tweet storm of a billion people who say, I'd like to fuck Nicki Minaj too. Go get her. And it's insanity. That's where we are. But in a sense, we do admit that this is a fucked up system. It's not ideal. It should definitely be reworked.
Starting point is 02:47:05 And it's so hard to rework. Wouldn't the best way to rework it, a Trump asteroid just slams right into the White House. Boom! Blows the whole thing sky high. Who knows what terrible things have to happen. But maybe that would be enough. The thing is, we have those. Those asteroids are coming anyway.
Starting point is 02:47:23 So when you look at, like 9-11 was an asteroid, right? So it's like or a superbug that becomes a pandemic. These are things that are coming and we need people who are in touch with
Starting point is 02:47:40 reality to deal with them, right? So like the moment someone advertises not only their ignorance, but the fact that they don't care that they're ignorant, and they do this again and again, they keep doubling down,
Starting point is 02:47:54 that person is, like, if you put that person at the helm, what you have done is basically put chaos at the helm, right? It's like, this person's going to believe whatever he believes, regardless of the information coming in and regardless of the consequences.
Starting point is 02:48:09 That's just, I mean, it's worse than having no one in charge. I mean, because you've put the power in this person's hands, right? So this person is, everything has to go through this node in the network that is just like an information scrambling device, right? So no matter how good the information is coming in, you've got a bottleneck here, which just screws up the signal, right? That's what you're doing if you're hiring someone like this who, I mean, yeah, in the best case, what you stated earlier would in fact be true, which he'll get into the Oval Office
Starting point is 02:48:49 and even he will be scared of the prospect that he's now running the better part of human civilization. And he will hire the best people or some semblance of the best people he can get access to and say, tell me how to not screw this up. And that will be, and then it'll essentially be business as usual, right? Insofar as you've hired, the best people will be people who are deeply in this game already, right? He'll defer to the
Starting point is 02:49:18 generals when it comes time to make war. Being really pragmatic about how they pick politicians and how they push certain people and decide not to push others, do you think that something like Trump completely changes how they move forward? Now they realize that this can happen. Like, now that you see that people are so goofy, we're so WWE'd out, that you can get this guy. You know, I mean, this is where we're at, where we've got a guy: I told him the wall just got 10 foot higher. Yeah! Everybody gets crazy.
Starting point is 02:49:48 Like, how could you say that? Once they realize that that's possible, how long before you get, like, some motivational speaker type dudes? How long before they start jumping in there? Tony Robbins for president. That's what we have. I mean, this is way worse than Tony Robbins for president. Oh, yeah, way, way worse. Tony Robbins is a positive dude. But I'd vote for him. Very
Starting point is 02:50:08 positive guy. That's not what I mean. I mean, but that sort of ability to excite people, like if we're going to get like one of those motivational speaker dudes, like one of those guys that wears a lot of yoga pants and he's going to, he's going to be the next president. He's going to get us in shape. It's going to be a reality show. America's a reality show. We're there. We're awesome. We're the best.
Starting point is 02:50:32 Did you see this press conference he held? I think it was yesterday. No, I did not. It was a very funny moment where there was one journalist, I didn't recognize who it was. So Trump was being very combative with the press pool, and he was basically shouting them down, not answering any of the questions. And one journalist just aghast said, is this what it's going to be like when you're president? Is this what it's going to be like to be in the White House press corps and deal with you?
Starting point is 02:51:02 And he said, yes, this is exactly what it's going to be like. But you could just see that the journalists, they turn the camera on the room of journalists, and they are astonished by what is happening here. They don't know. They're participating in this process. In some sense, they have created this process. No, not all of you, just many of you. All right, fine. Enough of us. Is this what you did?
Starting point is 02:51:32 Is it... is this what it's going to be like covering you when you're president? Yeah, it is. Let me have this kind of conversation in the press room. OK, yeah, it is going to be like this, David. If the press writes false stories like they did with this, because, you know, probably half of you were amazed that I raised all of this money, if the press writes false stories like they did, where I wanted to keep a low profile, I didn't want the credit for raising all this money for the vets. I wasn't looking for the credit. And by the way, more money is coming in. I wasn't looking for the credit, but I had no choice but to do this, because the press
Starting point is 02:52:03 was saying I didn't raise any money for them. Not only did I raise it, much of it was given a long time ago. And there is a vetting process, and I think you understand that. But when I raise almost $6 million, and probably in the end we'll raise more than six because more is going to come in and is coming in. But when I raise $5.6 million as of today, more is coming in. And this is going to phenomenal groups. And I have many of these people vetting the people that are getting the money and working hard. You played the moment I was referring to. I mean, so here is a case where he's probably almost certainly lying about his history of
Starting point is 02:52:40 giving to Veterans Affairs. And he gave money very recently after people started fishing around to see if he actually had given the money that he claimed to have given to veterans. But, I mean, this is—what's difficult about this is that, I mean, yes, there are—the press is highly imperfect, and also partisan, and there are false stories and there are exaggerations and they screw people over, yes. And there are reasons to not trust anything that he's done wrong. And the lack of acknowledgement that he pays no price for it among the people who like him. And so the press is powerless. But the net result of a press conference like this, if you're a Trump follower, is he just showed how biased and petty
Starting point is 02:53:57 the press pool is. And the press do need to just be beaten up by a strongman who's not going to stand for their bullshit. But it's, um, I mean, it's unbecoming at the very least, that kind of communication. It's kind of unbecoming of that person that we expect. And that's pretty mild. No, that's... compared to his, you know, his parodying of the, um, disabled reporter, who he... I mean, you saw that bit where he, you know, did like a, um, a cerebral palsy imitation at one of his speeches? No. Oh, yeah. He was interviewed by, I don't happen to know who, and it was about... oh yeah, he was making fun of somebody with cerebral palsy. Um, I mean, he's done so many things that you would think would be fundamentally canceling of a person's political aspirations. Like, if you caught Marco Rubio just goofing on someone's cerebral palsy at one of his campaign events... Oh, it's so true.
Starting point is 02:54:59 It would just be the end, right? No, one of the things that sunk Ted Cruz was just that video of him with his family, the outtakes where they were like... oh, you didn't see it? It's a gem. Uh-huh. It's spectacular. It's him with his mom, and he's like, my mom prays for me, often for hours every day. And she's like... right, she looks... I'm like, what the fuck are you talking about? Hours every day? No, I don't. You can't even say that. And so they have all these really awkward moments, like, okay, I'm going to go in for a hug. I'm going to say I love you. It's all, like, weirdly mapped out. And that got online and people were like, oh, Christ.
Starting point is 02:55:35 Okay. See, this is a bad game. Like, you're not even good at this game. You're terrible at this game. He was objectively terrible. Well, that's Trump's competition. Well, the thing about Cruz that never even got out, which was the reason to be scared about a Cruz presidency, was his level of religious craziness. I mean, no one was even pushing on that, because there was enough to push on before you even got to that door. Yeah.
Starting point is 02:56:01 You have to hold on to those weapons. Yeah. But I mean, had Cruz been the nominee, it would have been all about religion. What's odd is that that's not a handicap in 2016, that you can have that and people consider it an asset. Well, the one thing that's surprising and actually hopeful in Trump's candidacy is the fact that he has dissected out the religious, social conservative component of the Republican Party. So evangelicals, for the most part, were going for Trump over Cruz when it was pretty clear to them that Trump was just pretending to be religious. Trump gave one speech at, I think, Liberty University, where he said, you know, Corinthians 2. And that's not the way any Bible reader would speak about 2 Corinthians. How would you say it? 2 Corinthians.
Starting point is 02:56:58 That's how you would say it? Yeah. Yeah. And so he said Corinthians 2 as though this is something he just opened every night before he went to sleep. And so it was clear to them that he is just miming the language or impersonating a person of faith. But they don't care, really, as long as he does it. And that is, if you're going to look for a silver lining to this, it shows that they just want a space where their religious convictions are not under attack, and they don't really care
Starting point is 02:57:42 that the person in charge shares them. If you just pretend to share them, that's good enough. And that's better than actually caring that this person really believes in the rapture or anything else that is quite obviously crazy. But, yeah, I don't think any Christian who's voting for Trump thinks... I mean, they'll say, I'm not going to judge another man's faith, right? Or, who am I to say what's really in his heart? They'll say that, but if you've been paying attention to who he's been, and if you just look at how he talks about these things, I don't think he's fooling any Christian. So I think, for other reasons that are fairly depressing in their own right, they're willing to vote for someone who doesn't really play the game the way they do.
Starting point is 02:58:44 You have to believe in God to be president in 2016, right? Wouldn't you say that? You have to pretend to believe in God. But I think with Trump, the pretense is obvious enough that I don't think he's fooling the better part of the people who are voting for him, who would say they care about a person of faith being in the White House. So if anything, one thing he might be breaking is the barrier on having an atheist president. Because nobody thinks he is a person
Starting point is 02:59:17 of faith. I don't think anyone really thinks that. So he might be our first atheist president. He would help us in that regard as well. Another Trump meteor right into the White House. I'm starting to sound like a Trump supporter. Occasionally an asteroid does something good. Who would be the ideal president? I mean, like what kind of a person?
Starting point is 02:59:35 I mean, it would probably be a person who doesn't seek attention. Well, I don't think it could be that. I mean, even an optimized process will require enough sacrifice of what ordinary people want most of the time that it will be an unusual personality who gets promoted. It's almost by definition narcissistic to think that you should be in this role, right? Who are you to think that you should be running civilization at this moment in human history? And for you to honestly stand at the podium and say, I'm the guy, you know, or the woman: I am the most qualified, I should be doing this, I can help. If you're going to scrutinize the kind of personality that could give rise to those opinions, there are some dials you would probably want to tweak if you had to be married to this person. It's not an optimal personality.
Starting point is 03:00:46 So there's a kind of pathology of power-seeking that might be just intrinsic to it. But you want someone who is actually wise ethically. I mean, just try to map that onto Trump, right? Imagine someone saying, the thing I like about Trump is that he is so deeply ethical and wise. It's like saying you like him because his hair looks so natural. It is the antithesis of what he is, right? Or: the thing I like about Trump is that he is so well-informed about the way the world works. And where he's not informed, he recognizes his
Starting point is 03:01:36 ignorance so quickly, and he remedies it as fast as possible. He seeks out the best experts and defers to them. He's as mindful of the limits of his knowledge as he is of his expertise, and his expertise is vast, right? You'd want to be able to say that about a president. You could not begin to say that about Trump. You could probably say that, honestly, about Clinton. For all her defects, she's very knowledgeable. And where she doesn't feel like she's got the knowledge, she's going to try to go to the source of the knowledge, right? Just grab the best experts she can find.
Starting point is 03:02:24 I think she will be as aware as you or I would be of the consequences of not knowing what's going on. She's just going to want to know. Whereas Trump's attitude... the guy is winging it. I mean, it could not be more obvious that this guy is winging it on every level. There'd be no way for him to signal the fact that he's winging it more clearly than he is with everything he's doing. And yet there's no penalty. Do you think it's possible that in this age of information, the way we can communicate with each other, that we're going to
Starting point is 03:03:28 experience these cycles, these waves, these in and outs, these high and low tides of really smart presidents and really stupid presidents? People revolt. And it's so easy to stay alive now. There's plenty of stupid people out there, and they're only willing to vote for other dumb folks. So the other dumb folks get into position. They send out the frequency that only the dummies hear, and everybody else is going, what the fuck is everybody voting for this guy for? What is happening? And then it makes the smart people rebound in four years and challenge themselves anew, because they need some sort of an enemy to rally against to reach their full potential.
Starting point is 03:04:00 And then without the low tide, you cannot have the high tide, Sam Harris. Hopefully that's not an analogy that applies to the maintenance of civilization. Yeah, maybe, man. Maybe. Well, at the very least, it's a wake-up call for the political establishment. This silly game you've been running of two candidates just doesn't work. Someone can co-opt your candidacy, get in there, throw the fucking monkey wrench into the gear system, and guess what? Trump's running for president now. He's the head guy
Starting point is 03:04:32 for the Republicans. How is that even possible? They don't know. I mean, what's amazing is, if nothing else, it is a total wake-up call for the Republicans. They are just aghast. It's June. Everything's decided. Locked down. So we have July, August, September, October, November. We're that close.
Starting point is 03:04:51 But he's not someone who is aligned with the Republican platform in most ways. Right. So it's like he's been... The truth is, virtually no one knows what his policies are, because he keeps changing his position on things like taxation. It's like there actually is no...
Starting point is 03:05:14 He's talked on both sides of core Republican issues. But in many ways, he's left of Hillary, right? He's left of Hillary in terms of being an isolationist. It's: we're going to be isolationist, which is deeply anti-Republican, but I'm going to be the maniac where you're never going to know who I'm going to bomb next. We're going to wipe out ISIS straight away, not a man left standing, and I'm not going to take any shit from anyone, including China and North Korea. So he's that, but also: we're going to pull back in a huge way and not be in anyone's business, right?
Starting point is 03:06:08 He said both of those things. It's way too interesting. I mean, we don't want politics to be this interesting. And November is going to be... if the polls are close, watching those debates and waiting for a swing in the polls as a result, it's just going to be way too interesting. It's going to be like watching the Super Bowl, those first debates. There will be a hundred million people watching those debates. I have a prediction.
Starting point is 03:06:45 I think it's entirely possible that this whole thing was a plot that didn't work out. I think he probably came out of the gate saying crazy shit thinking he would
Starting point is 03:06:58 tank the Republican Party and get his friend Hillary Clinton into the White House. No, that wasn't... Well, she didn't get... It didn't work out. He kept trying to insult her,
Starting point is 03:07:07 kept trying to make stuff up about Mexicans, and it just kept making him get better and better, and now he's stuck. He can't pull out. That would be a great moment. That would change the system. Well, we're going to have to go through something like this in order for us to realize that this is crazy,
Starting point is 03:07:23 that a guy can just do this, can just not really have any interest in politics. But if he pulled out, then he should get the Nobel Prize for everything. If he pulls out at this point and says, listen, I took you to the precipice here just because I wanted you to recognize how unstable this situation is. You guys could elect a demagogue, and an incoherent demagogue at that. I haven't even been playing a coherent authoritarian, right? I'm, on the one hand, very liberal and tolerant, and on the other hand, I'm like getting ready to be Hitler, and you guys can't figure out who I am. And yet you're still prepared to vote for me.
Starting point is 03:08:07 Right. I mean, for him to do a postmortem on his punking of the culture, that would be the best thing to ever happen. But I don't think that's what's happening. Do we need someone like this so that we realize how silly this whole thing is? No, no. We need a qualified person to deal with all of the other hassles and dangers that are coming our way that have nothing to do with what we do. Right.
Starting point is 03:08:34 But that person's not there. There's going to be a tsunami of risk and hassle and waste and just all the rest of the world's chaos that is coming our way. Even if we had our house in order in every respect, we still have terrorism and global climate change. You've got China and India, and what are they doing in terms of complying with climate goals? You have all the things we've been talking about, the virtual certainty that there's going to be a pandemic. We're not talking about bioterrorism. We're talking about just the sheer fact that in 1918 there was a killer flu, and there's going to be another killer flu. There's just no way there's not going to be another one. And we need smart people to change,
Starting point is 03:09:40 to optimize the system to deal with these kinds of things. And if we're promoting religious maniacs and crazy narcissists and liars and ignoramuses and only those people, how could this
Starting point is 03:09:59 end well? Maybe this is just a weird year, like in heavyweight boxing. You know, they have those weird years for heavyweight boxing where Tony Tubbs is the champ, where anybody could be the heavyweight champion of the world. They went through a period of time in the early 80s, before Tyson came around, where it was a series of champs that were sort of like journeyman fighters. And then Tyson came along. Maybe that's what it is.
Starting point is 03:10:22 But only with heavyweights, right? Yeah, mostly with heavyweights. Yeah, and the lighter weights, they were always badass. But I think that maybe that's what's going on. Maybe we need to have this bad season, get it out of our way, realize the danger of having an inept person
Starting point is 03:10:38 in office, whether it's a liar or a dude who hates money or Trump, whoever it is. Just go through it and realize how silly it is that we have it set up this way still. Yeah, except people thought that of Hitler. I mean, any comparison to Hitler obviously brands you as an exaggerator. But they thought, let him get in there and fuck it up, and then we'll have somebody better. Hitler was a comic figure for a good long while, and people, including the American press, were incredibly
Starting point is 03:11:10 slow to recognize what a sinister character he was. He was considered a buffoon. And there was, you know... maybe Jamie could find this. I think it was Home and Garden. Was it House and Garden? Homes and Gardens? There was a write-up on his Eagle's Nest, his house, that was just this pure puff piece of Hitler love in an architectural magazine. At Home with the Führer. Yeah.
Starting point is 03:11:45 I think this is it. So this is like a Guardian write-up of it or something? Homes and Gardens. Yeah. You can see the actual pages. I think there are PDFs online of the actual pages of the article. But this is probably the actual text of the article. But it's hilarious.
Starting point is 03:12:01 It's just like Architectural Digest does the Eagle's Nest. But it's at a time that's not too far away from a moment when it should have been absolutely obvious to every thinking person that this guy was going to try to conquer the world for evil, right? And yet it wasn't obvious. And when you look at how it wasn't obvious, it's pretty humbling. I mean, you don't know that you would have necessarily been different. Because up until this conversation, practically, I've been looking at Trump as a clown, right? What would this clown actually do with the power of the presidency? You know, I don't know that he couldn't be. I mean, he's given voice to a kind of authoritarianism that his enemies are noticing and his friends are discounting. He's talked about going after the press. He's bragged about how many people he's going to torture: well, of course we're going to do waterboarding, and we're going to do worse, and maybe we'll kill the families of terrorists, right?
Starting point is 03:13:18 It's going to make America great again. What would he do if he actually had more power than anyone in the world? I think it's a legit question. The transition from comedy to, oh my God, we can't take this back in anything like short order, that could well be terrifying. To go back to the question of heavyweights, why do you think you could be a fake heavyweight and not a fake middleweight? There's not that many really good athletes that go into boxing when they're really large. They tend to go to football, or basketball if they're really tall. If you look at the amount of money that guys in the NBA or the NFL can make, the really top-level guys can make a tremendous amount of money.
Starting point is 03:14:10 So when you get the really super-athlete guys, they tend to gravitate towards the big-name sports, and there's no bigger-name sport than football. Getting someone to abandon the whole team thing, and having the balls to go one-on-one in a cage, that mentality is also very different, because it's not necessarily the smartest thing to do, but it's the most challenging thing to do. And there are some really smart people that do it. So even though cage fighting is not the safest way to get through life, for a lot of people that engage in it, it becomes an extremely difficult pursuit. And then that's what it becomes to them.
Starting point is 03:14:49 And in the heavyweight division, those guys were being lured into other sports. And boxing just kind of went through peaks and valleys. It went Ali, and then it went Larry Holmes. And even though Larry Holmes was amazing, people didn't appreciate him for how good he was. So that doesn't happen at the middleweight level? It's not the same competition for that kind of athlete? They get little lulls in the middleweight division, but it's always pretty fucking strong. But why wouldn't you have the same competition for the high-level athlete at the 165 weight?
Starting point is 03:15:22 Our favorite sports require bigger people, like basketball and football. I think baseball doesn't really apply here. The number of cultures that produce heavyweights, first of all, is fairly limited. Very few heavyweights have come from Asia, except like Polynesian guys, which I guess is kind of Asian, like Samoans. Samoans are known to be great fighters, giant, sturdy heavyweights. The Chinese don't really produce them that often. People don't get that big in Japan. Well, there's sumo, but... Yeah, but they're very fat.
Starting point is 03:15:57 There's never been like a guy who looks like Mike Tyson that came out of Japan in the 80s. Right. We're seeing more of that now. But, I mean, if we had a guy that was like a Japanese version of Mike Tyson, just a super-fast, blinding knockout fighter with a fucking head like a brick wall
Starting point is 03:16:15 and a giant neck that started above his ears and went down to his traps... You remember Tyson when he first came on the scene? Oh, yeah. He was unbelievably terrifying. So there's never been a Japanese person that has that kind of physical strength. So I think it's limited genetically. And I think a lot of the competitive boxing countries tend to be poorer countries.
Starting point is 03:16:40 And I think also in a lot of poorer countries, you'll see much smaller men. Like, some men are flyweights. It's very rare you find an American flyweight. Most Americans are larger. They get more food, right? I think that probably has a lot to do with it, or just the genetics in general. But South America produces a lot of flyweights. And the Philippines, that's of course where Manny Pacquiao came from, and he was like eight weight classes lower when he first started. Right. And if you're a great athlete at 120 or 130 pounds, there's not a lot of sports. Yeah, what else can you do? Some of these guys are really tiny, but they're amazing boxers. There's a ton of them. But in the United States, Johnny Tapia was a smaller guy.
Starting point is 03:17:21 I think... what weight did Johnny Tapia fight at? See if you can find that out. But there were some lightweight guys that were just so incredible. They brought so much attention to those divisions. There's like little peaks and valleys where greatness comes in,
Starting point is 03:17:34 and then people have to recover, and then new people come along that are great. But it's always been pretty steady. What did he fight at? Super flyweight. Yeah, so he was one of the rare Americans, Mexican-American Johnny Tapia. He was a bad motherfucker?
Starting point is 03:17:49 Super flyweight, which is, what is that, like 126 or something? Maybe 130? I don't even know what that means. Because bantamweight, I think... in the UFC it's different, there's different weight classes. 115 pounds. Wow. Crazy.
Starting point is 03:18:06 That's tiny. Yeah. And he was a wild, wild guy. They did a documentary about him. How did he die? So he died? Yeah, he died. I don't remember.
Starting point is 03:18:17 But he had a lot of problems with drugs and crime and craziness. And he had like Mi Vida Loca tattooed on his chest. Right. Well, that might get you the presidency now. He was a wild man, but just an amazing fighter to watch. Just so much fun. I think where the bigger people are, I just think they tend to gravitate towards other sports. I think that's all it is. And in boxing, 147 to 160 has always been like the promised land. That's Sugar Ray Leonard, Marvin Hagler, Roberto Duran. Floyd Mayweather's in there, Sugar Shane Mosley's in there.
Starting point is 03:18:55 So many guys are in that mix. That's the sweet spot. It always has been. There's great fighters in pretty much every weight class. It's strength and speed at that point. Yeah. I think you see it in the UFC, too. When it comes to just freak movements, I always think that the flyweights and the bantamweights, the 125s and 135s, are the fastest and the best guys.
Starting point is 03:19:17 They're moving like 20% faster than anybody. But I always wonder how much of that is because they're just not affected by gravity as much. And they're also not as affected by the blows that are being landed by the other guy, unless it's Mighty Mouse. Mighty Mouse is one of the few guys in that division that consistently stops people. Yeah. Did you see his last fight with Henry Cejudo? No. It was incredible. It was insane. He fought this guy Henry Cejudo, who's an Olympic gold medalist, one of the best wrestlers to ever compete in MMA. I mean, he is just a stud wrestler, and a really good kickboxer too. And Mighty Mouse clinched up with him and hit him with these knees to the body that were
Starting point is 03:19:55 just out-of-this-world technical. Just so perfect. No wind-up, no slop. Just drilled them in on each side with perfect precision, and he just crumpled. He was like, what the fuck? He got the victory on knees, knees to the body. Just kneed the shit out of his body, kneed him in the face. But it was the fluidity of the way he was moving his knees into perfect position. They were so perfectly oiled, like everything was going down a path that it had gone a million times. It was just like wham, wham. Oh, yeah. But it was better than I've ever seen. It was without a doubt one of the most... well, there's another one, between Anderson Silva and Rich Franklin, but that was like a prolonged, brutal beatdown, where Anderson just kept beating
Starting point is 03:20:41 him up, beating him up in the clinch, and broke his nose. There's one, yeah, I remember a victory... was it Weidman or was it Silva? I can't remember. Someone just won on a knee to the chest against someone who was... can you knee someone who's down? Yes, who's down. It was Chael Sonnen and Anderson Silva. Right, yeah, Chael Sonnen. Yeah, that was pretty brutal.
Starting point is 03:21:03 But what Mighty Mouse did in this fight that was crazy was the precision of the placement of the knees and how quick they came. Just bam, bam, bam. He just controlled him. And he's controlling a guy who's an Olympic gold medalist wrestler, just a stud wrestler. It was really pretty impressive stuff. Really, really amazing, sharp technique. But I wonder, could a heavyweight even move like that? Yeah, probably not. No, I mean, as you get bigger, there are just things you can't do. Clearly, there's a limit to the size you can be and be not only athletic but even ambulatory.
Starting point is 03:21:40 I mean, you couldn't have a 30-foot-tall person who could walk around; your bones would break, because mass goes up with the cube of the size. You drop an ant off the Empire State Building, and it'll fall, hit the ground, and be fine. You drop a horse off the Empire State Building, it's going to be a liquid horse. There you have the air resistance with
Starting point is 03:22:18 surface area. The air resistance goes up with the square, the surface area. But that doesn't compensate for the mass going up with the cube, the volume. So the horse is bigger; you'd think it might be able to act like a wing as much as the ant would. It's got a lot of air resistance.
Starting point is 03:22:40 It's big, so it's got a lot of surface. But its mass is going up with the cube of its size, so the air doesn't resist its fall much at all compared to what it's doing for an ant. But yeah, we have a limit on this. If you're going to engineer the super athlete, if we're going to give you chimpanzee muscle proteins or whatever to make you super explosive and strong, you'd have to get that right with your connective tissue and your bones and everything else, because you could rip your own arm off with your ballistic moves. Yeah.
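A quick back-of-the-envelope sketch of the square-cube scaling being described here. The specific masses, areas, and drag numbers below are made-up round values, not measurements:

```python
# Square-cube law sketch: scale a body by a factor L and weight grows
# like L**3, while cross-sections (drag area, muscle and bone strength)
# only grow like L**2.
import math

def terminal_velocity(mass_kg, frontal_area_m2, drag_coeff=1.0, air_density=1.2):
    """Speed where drag (0.5*rho*Cd*A*v^2) balances weight (m*g)."""
    g = 9.81
    return math.sqrt(2 * mass_kg * g / (air_density * drag_coeff * frontal_area_m2))

# Hypothetical round numbers for an ant and a horse:
print(terminal_velocity(3e-6, 3e-6))   # ant: ~4 m/s -- walks away
print(terminal_velocity(500.0, 1.0))   # horse: ~90 m/s -- liquid horse

# Strength-to-weight falls off like L**2 / L**3 = 1/L as you scale up,
# which is the hard limit on the 30-foot-tall athlete.
for L in (1, 2, 4, 8):
    print(L, (L**2) / (L**3))  # 1.0, 0.5, 0.25, 0.125
```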
Starting point is 03:23:22 Imagine if you had chimp strength and human tendons. Like, good luck. You'd just break your hand off. Yeah. One fight, you take your own arm off and beat the person with it. Well, I'm sure you saw that video of the little boy who got into the gorilla cage. Yeah. Did they have to shoot the gorilla? I didn't pay a lot of attention to the commentary, but it seemed pretty straightforward to me. They have to shoot the gorilla? Yeah. Like, I'm with the zoo on that one.
Starting point is 03:23:51 I mean, it's totally tragic. And the zoo is certainly culpable for having an enclosure that a three-year-old or four-year-old can get into, right? How the hell did that happen? So you've got to fix that. But it's totally tragic. But once you have a 400-pound gorilla that has a human child and is not letting it go,
Starting point is 03:24:15 you know, just kind of dragging it around. I mean, it wasn't looking aggressive toward the child, but just the fact that it moved it around with that kind of force, who knows what was going to happen? That looked like you had to end that as quickly as possible.
Starting point is 03:24:30 Do we have to assume that that gorilla is going to know that a baby is more fragile than a baby gorilla? We'd have to assume. I don't assume anything. We can't assume. Oh, yeah, no, no. There's no way it could know. It's never experienced it. He could have just ripped her arms off.
Starting point is 03:24:40 Torn his head off. Yeah, easily, accidentally. Oh, yeah. Easily. No, it's totally tragic. And I'm sure the parents and the zoo are reaping sufficient criticism. But once that situation is unfolding, I think you can't tranquilize it because it doesn't work fast enough. Right.
Starting point is 03:25:02 No. And it might grab the baby in fear and think it's being attacked. Jesus Christ, that's scary, though. But if it was your kid, I think you'd probably, like, shoot that fucking gorilla. Oh, yeah?
Starting point is 03:25:14 If it was your kid? Yeah, I mean... I mean, if more people were carrying guns, you know, it's... It was in Cincinnati, right? If that had been in Texas, he probably would have had some innocent bystander
Starting point is 03:25:27 who was going to take the law into his own hands. Yeah. Chuck Norris pants. Yeah. Jump right over the railing. Open fire. Yeah.
Starting point is 03:25:37 I saw some horrible comments, too, where people were like, they should have just shot the parents while they were at it. Well, it's easy to be outraged. It's fucked up. It's a little kid. You shouldn't have been able to get in, first of all. If you've got a gorilla enclosure, it is an architectural failing. I mean, it should be impossible. Yeah. It's not just that a three-year-old could do it. How could a three-year-old do it? It should be impossible for an adult to get in there with the gorilla. Yeah.
Starting point is 03:26:05 Well, but it's been happening pretty regularly lately. Guys have been breaking into zoos. Some guys were killed by the lions recently, right? Jesus Christ. I mean, you've got to realize you've got a lot of responsibility when you have monsters in a cage in your city. You can't let babies get in there with them. I mean, a gorilla, as awesome as it is, if it wanted to attack you, it's a monstrous beast. It's a thing with power that you couldn't even fathom. A gorilla could literally pick you up and throw you like you could a football.
Starting point is 03:26:39 I mean, they can launch you. They're so strong. Oh, yeah. Didn't we Google it? They get to be like 500 pounds or something crazy. Yeah, well, this one was like 400 pounds. Oh, God. But not 400 pounds like a person. Bob Sapp's 400 pounds... I mean, he was 300 pounds, but a 300-pound gorilla is much stronger than Bob Sapp. Yeah. I mean, they're not equivalent pounds. It's unfathomable, the amount of physical strength they must have.
Starting point is 03:27:07 Yeah. It just sucks that they keep them in zoos in the first place. It sucks that they had to kill him, but it really sucks that they keep doing this zoo thing with smart animals. Like, if you want to have giraffes in the zoo... I had a whole bit about it, that they look real relaxed because there's no lions around. They don't care. You just give them some food. But there's some animals that look tortured
Starting point is 03:27:35 Yeah, and primates in particular. They just look so freaked out in this enclosure, in this weird place where people are staring at them. They're pacing and trying to get away from people's gazes. I just think it's very, very... I think, well, it's a hard question of what to do, given that certain of these species are on the verge of extinction, right? So how do you preserve them? Obviously you can preserve them in all kinds of technical ways, like having their DNA frozen and being able to reboot them at a certain point when we figure out how to preserve their habitat. But I've got to think there's a role for good zoos. Because you also want to maintain the public's connection to these animals, because the decision to destroy habitat is made by people who don't really care about the prospects of extinction, right?
Starting point is 03:28:32 It's a very good point when you present it that way, because any people that are over in Africa trying to save gorillas and chimps... I mean, that is an unbelievably difficult struggle, and they might not make it. There's real concern that if there was no regulation at all, and there was no one telling anybody what to do, they could just go in there and wipe them all out. Well, historically, that's what we've done, right? With kind of everything that we've profited from. Anything that you can make money off of. I mean, I don't know what the hell they use chimps and gorillas for.
Starting point is 03:29:15 Well, there's the whole trinkets-and-stuff trade. Well, no, I mean, there's bushmeat. They eat them. Right. Why do they call it bushmeat? Well, you go into the bush, and it's like hunting, but you just shoot everything. They're hunting species that you don't think of as food species, but they're eating monkeys and gorillas. And that's why they call it bushmeat? Well, I mean, the bush is like the jungle. Right. But it's just hunting species that are not... I mean, there's the other component of it, the crazy ideas that the Chinese have about the medicinal properties of tiger bone wine, right, or rhino horn. So you have these species that are being hunted by poachers because there's a market for their parts, like the ivory trade. But some people just eat species that are endangered, too.
Starting point is 03:30:06 The term bushmeat is always associated with primates for some reason. I was always trying to figure out why. I don't know. Is that true? No. It's probably not. But in my mind, it was. I don't know.
Starting point is 03:30:17 I read about people eating chimps and how common it was. My friend Steve Rinella did a show in Bolivia where they shot and ate a monkey. And it's so weird to watch. These people, the Yanomami... is that how you say it? Yanomami? They live in Bolivia, and they live very much the way they probably lived hundreds of years ago.
Starting point is 03:30:41 They have handmade bows and arrows, these long spear-like arrows. It's not like a regular bow and arrow. It's like a five-foot-long arrow. Very strange. They walk around barefoot, and they have some Russian shotgun, and their favorite thing to do is eat monkeys. They have a Russian shotgun?
Starting point is 03:30:58 Yeah, they got a Russian shotgun that somehow they got from somebody, and it made it all the way deep into the jungle. But the shells are very precious. But their favorite thing to do is shoot and eat monkeys. It's like their number one favorite thing to eat. And they cooked it on the show and it's really weird to watch. You talk about some strange genetic connection that you have with a very human shape. Like the lips and the eyes and the face.
Starting point is 03:31:23 You could recognize a certain amount of human in that little fella, or whatever the female version of fella would be. But so they cooked it over this fire and then made like this stew out of it. Right. Yeah. Well, actually, to go back to our cultured meat conversation, one thing that's weird about that prospect is that if you're just growing cells in a vat, then, at least to my palate, it's a fairly grotesque thing to contemplate either way. But the distinction... I mean, there is no in-principle difference, right? At the cellular level, the difference between human muscle protein and bovine muscle protein, if this was never attached to an animal... we're dealing with concepts here. It's like, if you bite a fingernail
Starting point is 03:32:43 and swallow it, are you practicing auto-cannibalism? Right. At some level, the concept is doing a lot of work. It's unignorable when you're talking about depriving another person of life. But if you're talking about spinning up cells in a vat, then it becomes, well, does it really matter whether this was a person? Maybe people would decide that would be the only ethical choice. To eat cultured human meat?
Starting point is 03:33:13 If you can eat meat, you have to eat human meat. There's the end of this economy. You can't harm any other animals. Not only can you not, you have to eat yourself. That's what the cattle industry could do to just quash this: spread the rumor that there are human cells in those vats. Yeah, you go to your local center, you get scraped, and then they start making your monthly supply of you.
Starting point is 03:33:36 Soylent Green is people. And they just have it in a vat, and they break you off a cube every week. Right, yeah. You take it back to your flat, yes, and you eat yourself, and that's the only meat you're allowed. You could eat yourself. Well, we figured out a way to live in harmony with nature: we just have to kill everything except us and then eat ourselves. Tell us which part of your own body you want to eat for the rest of your life, and we will culture those cells. Well, I know it was you that I was having this conversation with once, I
Starting point is 03:34:03 believe, where we were talking about how when areas become more educated, and women become more educated, it tends to slow the population down. They even worry that if these graphs continue further on, people in industrialized parts of the world, as they get into the first world, will be more likely to have fewer and fewer children. Oh, yes. Fertility goes down with literacy and education among women. Yeah. And so, just to map that onto life as you know it here: women, given all the choices available, educational, economic, and an ability to plan a pregnancy. So here we have women who want to have
Starting point is 03:34:56 careers, want to go to college, and they delay pregnancy to the point where they have realized a lot of those aspirations. And so pregnancies come later and later, and families get smaller and smaller. Virtually no one chooses to have 10 kids in the face of all of this other opportunity, all the things they also want out of life, right? Right. If they can avoid it. If you can't avoid it, well, then you just find yourself with 10 kids, right?
Starting point is 03:35:31 Or if you have some religious dogma which says that, though it's possible to avoid it, you shouldn't, because you were put here to have as many kids as possible. But are you allowed to bring that up when you talk about the population crisis? I mean, that's a fascinating piece of information. Which population crisis are you thinking of? Because there are two different ones. There are two opposite ones. Right.
Starting point is 03:35:55 Which is China, where they don't let you have girls. Or... have they lessened that? I think they've relaxed the one-child policy, but I'm not sure. I remember hearing something about that. And the other one, you would say India? Well, no, no. The United States?
Starting point is 03:36:10 No, the other... there's an overpopulation crisis in certain countries, disproportionately in the developing world. And there is underpopulation in the developed world. I mean, most of Western Europe is not replacing itself. So you're having these senescent populations who just have to rely on immigration to carry on the functions of society, because they're nowhere near replacement rate. The most surprising detail that brings this home, now this is Japan:
Starting point is 03:36:58 there are more adult diapers sold in Japan than baby diapers. Now, just think about that, the implication of that for a society, right? How do you have a functioning society barring perfect robots that can tend to your needs where you have just a disproportionate number of people who are no longer economically productive
Starting point is 03:37:23 relying on the labor of the young to keep them alive and cure their diseases and defend them from crime, all that. But the ratio is totally out of whack, right? The world is a giant Ponzi scheme on some level. You need new people to come in to maintain it for the old people, apart from having some technology that allows you to do that without people. But I think everything I've heard about population recently suggests that we are on course, globally, to peak around nine and a half billion and then taper off. I don't think anyone now is forecasting this totally unsustainable growth where we're
Starting point is 03:38:14 going to wind up with... did I say a million? Nine and a half billion people. You said billion. Billion. Where we're going to hit something like 20 billion people, right? I don't think anyone, even the most Malthusian people, is expressing that concern at the moment, which was the case like 20 or 30 years ago, when they thought this was just going to keep going and we were going to hit the carrying capacity of the earth, which is something like 40 billion people. Wow.
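A toy illustration of the replacement-rate math behind the point above (my own made-up numbers, not a demographic model; roughly 2.1 children per woman is the standard rough benchmark for a cohort to replace itself):

```python
# With fertility below ~2.1, each generation of births is smaller than
# the last, so the old-to-young ratio (the "Ponzi scheme" support ratio)
# keeps climbing.
REPLACEMENT = 2.1

def project_births(initial_births, fertility, generations):
    births = [initial_births]
    for _ in range(generations):
        births.append(births[-1] * fertility / REPLACEMENT)
    return births

for rate in (2.1, 1.4):  # replacement vs. a roughly Japan-like fertility
    print(rate, [round(b) for b in project_births(1_000_000, rate, 4)])
# 2.1 holds steady at 1,000,000 per cohort; 1.4 shrinks each cohort to
# about two-thirds of the last: 1,000,000 -> 666,667 -> 444,444 -> ...
```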
Starting point is 03:38:45 I don't think anyone thinks that, because fertility is falling everywhere, and it has actually fallen below replacement in the developed world. Do you think in our lifetimes, or in our children's lifetimes, it's feasible that we figure out a way, and I'm not endorsing just taking people's money and giving it to other people, but some sort of a way to eliminate poverty? Is that even possible, to completely eliminate poverty worldwide within like a lifetime? Well, I think we talked about this the last time when we spoke about AI, but this is the implication of much of what we talked about here. If you imagine building the perfect
Starting point is 03:39:33 labor-saving technology, right? Where you imagine just having a machine that can build any machine that can do any human labor, you know, powered by sunlight more or less for the cost of raw materials, right? So you're talking about the ultimate wealth generation device. And now we're not just talking about blue-collar labor. We're talking about the kind of labor you and I do, right? So like artistic labor and scientific labor and just a machine that comes up with good ideas, right? So we're talking about general artificial intelligence.
Starting point is 03:40:08 This, in the right political and economic system, would just cancel any need for people to have to work to survive, right? There'd be enough of everything to go around. And then the question would be, do we have the right political and economic system where we actually could spread that wealth? Or would we just find ourselves in some kind of horrendous arms race and a situation of wealth inequality unlike any we've ever seen? We don't have that. It's not in place now.
Starting point is 03:40:46 I mean, imagine someone just handed us this device, and all of my concerns about AI were gone: there's no question about this thing doing things we didn't want.
Starting point is 03:40:59 It would do exactly what we want when we want it, and there's just no danger of its interests becoming misaligned with our own. It's just like a perfect oracle and a perfect designer of new technology. If it was handed to us now, I would expect just complete chaos, right? If Facebook built this thing tomorrow and announced it, or rumor
Starting point is 03:41:26 spread that they had built it, right? What are the implications for Russia and China? Well, insofar as they are as adversarial as they are now, it would be rational for them to just nuke California, right? Because having this device is just a winner-take-all scenario. You win the world if you have this device. You can turn the lights off in China the moment you have this device. It's just the ultimate. Many people may doubt whether such a thing is possible, but again, we're just talking about the implications of intelligence that can make refinements to itself over a time course that bears no relationship to what we experience as apes. So you're talking about a system that can make changes to its own source code and become better and better at learning and more and more knowledgeable,
Starting point is 03:42:26 and, if we give it access to the internet, it has instantaneous access to all human and machine knowledge. And it does thousands of years of equivalent human-level intellectual work every day of our lives, right? Our intuitions completely falter to capture just how immensely powerful such a thing would be, and there's no reason to think this isn't possible. The most skeptical thing you can honestly say about this is that it isn't coming soon, right?
Starting point is 03:43:06 But to say that this is not possible makes no scientific sense at this point. There's no reason to think that a sufficiently advanced digital computer can't instantiate general intelligence of the sort that we have. Intelligence has to be, at bottom, some form of information processing. And if we get the algorithms right, with enough hardware resources, and the limit is definitely not the hardware at this point, it's the algorithms, there's just no reason to think this can't take off and scale, and that we would be in the presence of something that is like having an alternate human civilization in a box that is making thousands of years of progress every day. Right? So just imagine that: the 10 smartest people who have ever lived, and every week,
Starting point is 03:44:06 they make 20,000 years of progress, right? Because that is the actual arithmetic: we're talking about electronic circuits being a million times faster than biological circuits. And I believe I said this the last time we talked about AI, but this is what brings it home for me. Even if it's just a matter of faster, right?
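Checking that arithmetic, taking the million-fold circuit-speed claim at face value:

```python
# One real-time week of human-level work, sped up a million times:
SPEEDUP = 1_000_000   # claimed electronic vs. biological circuit speed
WEEKS_PER_YEAR = 52

subjective_years = (1 * SPEEDUP) / WEEKS_PER_YEAR
print(round(subjective_years))  # ~19,231 -- roughly the 20,000 years quoted

# Same check for "thousands of years of progress every day":
print(round(SPEEDUP / 365))     # ~2,740 subjective years per real day
```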
Starting point is 03:44:26 It's not anything especially spooky. It's just something that can do human-level intellectual work, but a million times faster. And again, this totally undersells the prospects of superintelligence. I think human-level intellectual work is going to seem pretty paltry in the end. But imagine just speeding it up. If we were doing this podcast, imagine how smart I would seem if between every sentence I actually had a year to figure out what I was going to say next. So I say this one sentence and you ask me a question. And then in my world,
Starting point is 03:45:07 I just have a year. I'm going to go spend the next year getting ready for Joe, and it's going to be perfect. And this is just compounding upon itself. Not only am I working faster; ultimately, I can change my
Starting point is 03:45:23 ability to work faster. We're talking about software that can change itself, something that becomes self-improving. So there's a compounding function there. But the point is, it's unimaginable in terms of how much change this could effect. And if you imagine the best-case scenario, where this is under our control, where there's no alignment problem,
Starting point is 03:45:51 where it's just this thing doesn't do anything that surprises us, this thing will always take direction from us. It will never develop interests of its own, right, which is, again, the fear. But let's just say this is totally obedient. It's just an oracle and a genie in one. And we say cure Alzheimer's, and it cures Alzheimer's.
Starting point is 03:46:13 We say, solve the protein folding problem, and it's just off and running. Develop a perfect nanotechnology, and it does that. This is all, again, going back to David Deutsch: there's no reason to think this isn't possible, because anything that's compatible with the laws of physics can be done given the requisite knowledge, right? You get enough intelligence, and as long as you're not violating the laws of physics, you can do anything in that space. But the problem is, this is a winner-take-all scenario. So if Facebook does it tomorrow, and China and Russia find out about it, they can't afford to wait around to see whether the U.S. decides to do something not entirely selfish with
Starting point is 03:46:59 this, right? Because their worst fears could be realized. If Donald Trump is president, what's Donald Trump going to do with a perfect AI when he has already told the world that he hates Islam, right? We would have to have a political and economic system that allowed us to absorb this ultimate wealth-producing technology. And again, so this may all sound like pure sci-fi craziness to people. I don't think there is any reason to believe that it is, but walk way back from that edge of craziness and just look at dumb AI, narrow AI, just self-driving cars and automation and intelligent algorithms that can do human-level work, that is already poised to change our world massively and create
Starting point is 03:47:57 massive wealth inequality, and we have to figure out how to spread this wealth. What do you do when you can automate 50 percent of human labor? Were you paying attention to the artificial intelligence Go match? Yeah, I mean, I don't actually play Go, so I wasn't paying that kind of attention to it, but I'm aware of what happened there. Do you know the rules of Go?
Starting point is 03:48:31 Not that I know. Actually, I don't play it. No, I don't. I know vaguely how it looks when a game is played, but I don't actually know how to play it. It's supposed to be very complicated, though. Oh, yeah. More complicated, with more possibilities than chess. Oh, yeah. And that's why it took 20 years longer for a computer to be the best player in the world. It is. Did you see how the computer did it? Well, I didn't. I know the company that did it is DeepMind, which was acquired by Google, and they're at the cutting edge of AI research. And, yeah, well,
Starting point is 03:49:02 the cartoons are unfortunately not so far from what is possible. But again, this is not general intelligence. These are not machines that can even play tic-tac-toe, right? Now, there have been some moves away from this. Like, DeepMind has trained an algorithm to play all of the Atari games, like from 1980 or whenever, and it very quickly became superhuman on most of them. I don't think it's superhuman on all of them yet, but it could play Space Invaders and Breakout and all these games that are highly unlike one another, and it's the same algorithm becoming expert and superhuman in all of them. And that's a new paradigm. And it's using a technique called deep learning for that.
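For what "deep learning" refers to in that Atari work: the published method was deep Q-learning (DeepMind's DQN). The sketch below is not DeepMind's code, just a minimal illustration of the idea on a made-up toy environment; it assumes PyTorch is installed, and it omits the convolutional layers over raw pixels and the separate target network the real system used:

```python
# Minimal deep Q-learning sketch: one network maps a state to a value per
# action, trained toward the TD target r + gamma * max_a' Q(s', a'),
# sampling past transitions from an experience replay buffer.
import random
import torch
import torch.nn as nn

q_net = nn.Sequential(            # state (4 floats) -> Q-value per action (2)
    nn.Linear(4, 32), nn.ReLU(),
    nn.Linear(32, 2),
)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
GAMMA, EPSILON, BATCH = 0.99, 0.1, 32
replay = []                       # (state, action, reward, next_state, done)

def toy_env_step(state, action):
    """Stand-in for Atari: random next state, reward 1 for action 0."""
    return torch.randn(4), float(action == 0), random.random() < 0.05

state = torch.randn(4)
for step in range(1000):
    # epsilon-greedy: mostly exploit the network, occasionally explore
    if random.random() < EPSILON:
        action = random.randrange(2)
    else:
        with torch.no_grad():
            action = q_net(state).argmax().item()
    next_state, reward, done = toy_env_step(state, action)
    replay.append((state, action, reward, next_state, done))
    state = torch.randn(4) if done else next_state

    if len(replay) >= BATCH:
        s, a, r, s2, d = zip(*random.sample(replay, BATCH))
        s, s2 = torch.stack(s), torch.stack(s2)
        a, r = torch.tensor(a), torch.tensor(r)
        d = torch.tensor(d, dtype=torch.float32)
        with torch.no_grad():     # bootstrapped target, no gradient through it
            target = r + GAMMA * (1 - d) * q_net(s2).max(dim=1).values
        pred = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(pred, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The notable part, and what made the Atari result a new paradigm, is that nothing above is game-specific: the same loop, pointed at different environments, produced superhuman play across games that are highly unlike one another.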
Starting point is 03:49:50 And that's been very exciting and will be incredibly useful. The flip side of all this, and I know that everything I tend to say on this sounds scary, is that the next scariest thing is not to do any of this stuff. We want intelligence. We want automation. We want to figure out how to solve problems that we can't yet solve. Intelligence is the best thing we've got, so we want more of it. But it's scary that we have a system where, if you gave the best possible version of it to one research lab or to one government, it's not obvious that that wouldn't destroy humanity, right? That it wouldn't lead to massive dislocations where you'd have, you know,
Starting point is 03:50:48 some trillionaire who's trumpeting his new device, and, you know, 50% unemployment in the U.S. in a month, right? It's not obvious how we would absorb this level of progress, and we definitely have to figure out how to do it. And of course, we can't assume the best-case scenario, and that is the best-case scenario. I think there's a few people that put it the way you put it, and they terrify the shit out of people. Right. And everyone else seems to have this rosy vision of increased longevity and automated
Starting point is 03:51:23 everything, and everything fixed, and easy to get to work, and medical procedures being easier. They're going to know how to do it. But everybody looks at it like we are always going to be here. But are we obsolete? I mean, this idea of a living thing that's creative and wrapped up in emotions and lust and desires and jealousy and all the pettiness that we see celebrated all the time... we still see it, it's not getting any better, right? Are we obsolete?
Starting point is 03:51:52 I mean, what if this thing comes along and says, listen, there's a way... you can abandon all that stupid shit, all the stuff that makes you fun to be around, yeah, but that also fucks with you, and you can live three times as long without it? I think it would, in the best case, usher in the possibility of a fundamentally creative life, on the order of something like the Matrix, whether it's in the Matrix or just in a world that has been made as beautiful as possible based on what would functionally be an unlimited resource of intelligence. An ability to solve problems of a sort that we can't currently imagine. It really is like a place on the map that you can't... I mean, you can indicate it's over there. It's like the blank spot on the map.
Starting point is 03:52:59 This is why it's called the singularity, right? It was John von Neumann, the inventor of game theory, a mathematician who, along with Alan Turing and a couple of other people, is really responsible for the computer revolution. He was the first person to use this term singularity to describe just this: that there's a speeding up of information processing technology, and a cultural reliance upon it, beyond which we can't actually foresee the level of change that can come over our society. It's like an event horizon past which we can't see. And this certainly becomes true when you talk about these intelligent systems being able to make changes to themselves. And again, we're talking mostly software. I mean, the most
Starting point is 03:54:05 important breakthroughs are certainly at the level of better software. In terms of the computing power, the physical hardware on Earth, that's not what's limiting our AI at the moment. It's not like we need more hardware. But we will get more hardware, too, up to the limits of physics, and it will get smaller and smaller, as it has. And if quantum computing becomes possible, or practical, and actually David Deutsch, the physicist I mentioned, is one of the fathers of the concept of quantum computing, that will open up a whole other extreme of computing power that is not at all analogous to the kinds of machines we have now.
Starting point is 03:54:55 But when you imagine this, people seem to always want to... I mean, I just had this conversation with Neil deGrasse Tyson on my podcast. Name dropper? Yeah, right. No, I'm just attributing these ideas to him. He doesn't take this line at all. He thinks it's all bullshit, right? He's not at all worried about AI.
Starting point is 03:55:24 What does he think? He's drawing an analogy from how we currently use computers: they just keep helping us do what we want to do. We decide what we want to do with computers, and we just add them to our process. That process becomes automated, and then we'll find new jobs somewhere else. You don't need a stenographer once you have voice recognition technology, and that's not a problem; a stenographer will find something else to do. So the economic dislocation isn't that bad, and computers will just get better than they are, and eventually Siri will actually work, and she'll answer your questions well, and it's not going to be a laugh line, what Siri said to you today.
Starting point is 03:56:11 And then all of this will just proceed to make life better, right? Now, none of that is imagining what it will be like, because there will be a certain point where you'll have systems that are... It's like the best chess player on Earth is now always going to be a computer, right? There's not going to be a human born tomorrow who's going to be better than the best computer. We already have superhuman chess players on Earth. Now, imagine having computers that are superhuman at every task that is relevant, every intellectual task, right? So the best physicist is a computer. The best medical diagnostician is a computer. The best prover of math theorems is a computer.
Starting point is 03:57:04 The best engineer is a computer, right? There's no reason why we're not headed there. The only reason I could see that we're not headed there is that something massively dislocating happens that prevents us from continuing to improve our intelligent machines. But the moment you admit that intelligence is just a matter of information processing, and you admit that we will continue to improve our machines unless something heinous happens, because intelligence and automation are the most valuable things we have, then at a certain point, whether you think it's in five years or 500 years, we are going to find ourselves in the presence of superintelligent machines. And at that point, the best source of innovation for the next generation of software or hardware or both will be the machines themselves, right? So that's where you get what the mathematician I. J.
Starting point is 03:58:04 Good described as the intelligence explosion, which is just that the process can take off on its own. And this is where the singularity people are either hopeful or worried, because there's no guarantee that this process will remain aligned with our interests. And every person I meet, even very smart people like Neil, who says they're not worried about this... when you actually drill down on why they're not worried, you find that they're actually not imagining machines making changes to their own source code. Or they simply believe that this is so far away
Starting point is 03:58:54 that we don't have to worry about it now. And that's actually a non sequitur. To say that this is far away is not actually grappling with it; it's not an argument that this isn't going to happen. And it's based on what, too? I mean, first of all, there's no reason to believe it. Jamie, you want to find out where that is? We don't know how long it will take us to prepare for this, right? So if you knew that it was going to take 50 years for this to happen,
Starting point is 03:59:31 is 50 years enough for us to prepare politically and economically to deal with the ramifications of this, to say nothing of actually building the AI safely, in a way that's aligned with our interests? I don't know. We've had the iPhone for, what, 10 years, 9 years? 50 years is not a lot of time to deal with this. And there's just no reason to think it's that far away if we keep making progress. It would be amazing if it were 500 years away.
Starting point is 04:00:10 From the sense I get from the people who are doing this work, it's far more likely to be 50 years than 500 years. The people who think this is a long, long way off are saying 50 to 100 years. No one says 500 years; as far as I know, no one who's actually close to this work. And some people think
Starting point is 04:00:48 it could be in 5 years. The DeepMind people, who are very close to this, are the sorts of people who say that, because the people who are close to this work are astonished by what's happened in the last 10 years. We went from a place of
Starting point is 04:01:04 very little progress to, wow, this is all of a sudden really, really interesting and powerful. And again, progress is compounding in a way that's counterintuitive. People systematically overestimate how much change can happen in a year and underestimate how much change can happen in 10 years. And as far as estimating how much change can happen in 50 or
Starting point is 04:01:30 100 years, I don't know that anyone is good at that. How could you be? With giant leaps come giant exponential leaps off those leaps, and it's almost impossible for us to really predict what we're going to be looking at 50 years from now. But I don't know what they're going to think about us. That's what's most bizarre about it. We really might be obsolete. Look at how ridiculous we are: look at this political campaign, look at what we pay attention to in the news, look at the things we really focus on. We're a strange, ridiculous animal. If we look back on some strange dinosaur that had a weird neck and ask why should that fucking thing make it, well, why should we make it?
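That over-and-underestimate pattern is just compounding at work. A toy calculation (the 1% weekly rate is made up purely for illustration) shows how linear intuition fails:

    # Made-up rate, purely illustrative: a capability that compounds 1% per week.
    weekly_gain = 1.01
    one_year = weekly_gain ** 52    # ~1.7x: a year of change is easy to overrate
    ten_years = weekly_gain ** 520  # ~177x: a decade of change is easy to underrate
    print(f"1 year: {one_year:.2f}x   10 years: {ten_years:.0f}x")

The same steady rate that looks modest over a year is transformative over a decade, which is why 50- and 100-year forecasts are hopeless.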
Starting point is 04:02:15 We might be here to make that thing. And that thing takes over from here with no emotions, no lust, no greed, just purely existing electronically. And for what reason? Well, that's a little scary. There are computer scientists who, when you talk to them about why they're not worried, just swallow this pill without any qualm: we're going to make the thing that is far more powerful and beautiful and important than we are, and it doesn't matter what happens to us. That was our role. Our role was to build these mechanical gods, and it's fine if they squash us. And I've literally heard someone give a talk like that.
Starting point is 04:03:06 I mean, that's what woke me up to how interesting this area is. I went to this conference in San Juan about a year ago, and the people from DeepMind were there, the people who were very close to this work. And to hear some of the reasons why you shouldn't be worried, from people who were interested in calming the fears so they could get on with doing their very important work... it was amazing, because they were highly uncompelling reasons not to be worried.
Starting point is 04:03:47 So they had a desire to be compelled? Well, no, there are
Starting point is 04:03:55 people who want to do this. There's a deep assumption in many of these people that we can figure it out as we go along. Right. It's like, we're going to get closer. We're far enough away now; even if it's five years, we'll get there. Once we get closer, once we get something a little scary,
Starting point is 04:04:14 then we'll pull the brakes and talk about it. But the problem is everyone is essentially in a race condition by default. Google is racing against Facebook, the U.S. is racing against China, and every group is racing against every other group, because the reward for getting there first with incredibly powerful narrow AI is to be the next, you know, multi-billion-dollar company, right? So everyone's trying to get there. And if they suddenly get there and sort of overshoot a little bit,
Starting point is 04:04:57 and now they've got something like general intelligence, or something close, and they know everyone else is attempting to do this, right? We don't have a system set up where everyone can pull the brakes together and say, listen, we've got to stop racing here. We have to share everything. We have to share the wealth.
Starting point is 04:05:18 We have to share the information. This truly has to be open source in every conceivable way, and we have to diffuse this winner-take-all dynamic. I think we need safe AI, and to work out all of these issues, because I think we're going to build this by default. We're just going to keep building more and more intelligent machines, and this is going to be done by everyone who can do it. And with each generation, if we're even talking about generations,
Starting point is 04:06:11 it will have the tools made by the prior generation, tools that are more powerful than anyone imagined 100 years ago. And it's just going to keep going like that. Did anybody actually make that quote about giving birth to the mechanical gods? No, that was just me. But there was a scientist who actually was thinking and saying
Starting point is 04:06:32 that. That was the content of what he was saying: we're going to build the next species, it is far more important than we are, and that's a good thing. And actually, I can go there with him. The only caveat here is: unless they're not conscious, right? The true horror for me is that we can build things more intelligent than we are, more powerful than we are, that can squash us, and they might be unconscious, right?
Starting point is 04:07:08 Right. There might be nothing there... like the universe could go dark if they squash us, right? Or at least our corner of the universe could go dark. Right. And yet these things will be immensely powerful. So, and the jury's out on this, but if there's nothing about intelligence scaling that demands that consciousness come along for the ride, then it's possible... I mean, very few people would think our machines that are intelligent now are conscious, right? So at what point does consciousness come online? Maybe it's possible to build superintelligence that's unconscious.
Starting point is 04:07:44 You know, super powerful, does everything better than we do. It'll recognize your emotion better than another person can, but the lights aren't on. That's also, I think, possible, though maybe it's not. That's the worst-case scenario, because the ethical silver lining to building these mechanical gods, speaking outside of our self-interest now, just from a bird's-eye view, is that if they are conscious, we will have built something far wiser, with far more beautiful and deeper experiences of the universe than we could ever imagine. And there's something that it's like to be that thing. It has kind of a godlike experience.
Starting point is 04:08:36 Well, that would be a very good thing. Then we will have built something that, if you stand outside of our narrow self-interest... I can understand why he would say that. What was scary about that particular talk is that he was just assuming consciousness comes along for the ride here, and I don't know that that is a safe assumption. Well, and the really terrifying thing is: if this is constantly improving itself, and it's at the beck and call of a person... So it's either conscious... Well, not necessarily. It's either conscious, where it acts as itself, right?
Starting point is 04:09:14 It acts as an individual thinking unit, right? Or it's a thing outside of that. It's aware, right? Either it is or it isn't. And if it isn't aware and some person can manipulate it... Imagine if it's getting... how many thousands of years in a week did you say? If it was just a million times faster than we are, it's 20,000 years in a week.
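The back-of-the-envelope behind that 20,000-years figure is straightforward; here it is as a quick Python sketch, taking the million-fold speedup from the conversation as an assumption rather than a measurement:

    # One wall-clock week at a million-fold speedup is a million subjective weeks.
    speedup = 1_000_000
    weeks_per_year = 52
    subjective_years = speedup / weeks_per_year
    print(f"~{subjective_years:,.0f} subjective years per real week")  # ~19,231, roughly 20,000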
Starting point is 04:09:39 So with every week, this thing constantly gets better at even doing that. Right, so it's reprogramming itself, so it's all exponential, presumably. Just imagine: again, in the most restricted case, you could just keep it at our level, but faster, just a million times faster. But if it kept going, and every week was thousands of years... right, we're going to control it? A person? No, I know.
Starting point is 04:10:06 A regular person. That's even more insane. Just imagine being in dialogue with something that lived the 20,000 years of human progress in a week. And you come back on Monday and say, listen, that thing I told you to do last Monday, I want to change that up. And this thing has made 20,000 years of progress. And if it's in a condition where it has access, I mean, so we're imagining this thing in a box, air-gapped from the Internet, and it's got no way to get out, right? Even that is an unstable situation. But just imagine this emerging in some way online, right?
Starting point is 04:10:46 Already being out in the wild, right? So let's say it's in a financial market, right? That, again, is what worries me most about this. And what is also interesting is that the primary intuition people have is: no, no, no, that's just not possible, or not at all likely. But if you're going to think it's impossible or even unlikely, you have to find something wrong with the claim that intelligence is just a matter of information processing. I don't know any scientific reason to doubt that claim at the moment, and there are very good reasons to believe it's beyond doubt. And you have to doubt that we will continue to make progress in the design of intelligent machines.
Starting point is 04:11:42 But once you grant those two things, all that's left is just time, right? If intelligence is just information processing, and we are going to continue to build better and better information processors, then at a certain point we are going to build something that is superhuman. And so whether it's in five years or 50,
Starting point is 04:12:10 it's the biggest change in human history I think we can imagine, right? And I keep finding myself in the presence of people who seem, at least to my eye, to be refusing to imagine it. They're treating it like the Y2K bug, where it just may or may not be an issue. Like it's a hypothetical.
Starting point is 04:12:45 Like we're going to get there, and it's either not going to happen or it's going to be trivial. But if you don't have an argument for why this isn't going to happen, then you're left with: okay, what's it going to be like to have systems that are better than we are at everything in the intellectual space? And what will happen if that suddenly happens in one country and not in another, right? It has enormous implications, but it just sounds like science fiction. I don't know what's scarier: the idea that an artificial intelligence can emerge that's conscious, aware of itself, and then acts to protect itself,
Starting point is 04:13:29 or the idea that a regular person of today could be in control of essentially a god. Right. Because if this thing continues to get smarter and smarter with every week, with more and more power, more potential, more understanding, thousands of years of progress... one regular person controlling that is almost more terrifying than creating a new life. Or any group of people who don't have the total welfare of humanity as their central concern.
Starting point is 04:14:02 So just imagine: what would China do with it now, right? What would we do if we thought China, if Baidu or some Chinese company, was on the verge of this thing? What would it be rational for us to do? If North Korea had it, it would be rational to nuke them,
Starting point is 04:14:19 given what they say about their relationship with the rest of the world. So it's destabilizing. Well, that kind of power just isn't rational. It's so life-changing, so paradigm-shifting. Right. But to wind this back to what someone like Neil deGrasse Tyson would say,
Starting point is 04:14:39 it's that the only basis for fear is: don't give your superintelligent AI to the next Hitler, right? That's obviously bad. But if we're not idiots and we just use it well, we're fine. And that, I think, is an intuition that is just a failure to unpack what is entailed by something like an intelligence explosion, once you're talking about something that is able to change itself. What would it be like to guarantee... let's say we decide, okay, we're just not going to build anything that can make changes to its own source code. Any change to software, at a certain point, is going to have to be run through a human brain.
Starting point is 04:15:30 And we're going to have veto power. Well, is every person working on AI going to abide by that rule? It's like we've agreed not to clone humans, right? But are we going to stand by that agreement for the rest of human history? And is our agreement binding on China or Singapore or any other country that might think otherwise? It's a free-for-all. And at a certain point, we're going to be close enough.
Starting point is 04:15:57 Everyone's going to be close enough to making the final breakthrough that, unless we have some agreement about how to proceed, someone is going to get there first. That is a terrifying scenario of the future. You know, you cemented this last time you were here, but not as extreme as this time. You seem to be accelerating the rhetoric. Yeah, exactly. You're going deep. Boy, I hope you're wrong. I'm on Team Neil deGrasse Tyson on this one. Yeah, go Neil. Well, in defense of the other side, too, I should say that
Starting point is 04:16:39 David Deutsch also thinks I'm wrong, but he thinks I'm wrong because we will integrate ourselves with these machines. They'll be extensions of ourselves, and they can't help but be aligned with us because we will be connected to them. That seems to be the only way we can all get along. We have to merge and become one. Yeah, but I just think there's no deep reason why, even if we decided to do that, in the U.S. or in half the world... One, I think there are reasons to worry that even that could go haywire. But there's also no guarantee that someone else couldn't just build AI in a box. I mean,
Starting point is 04:17:16 if we can build AI such that we can merge our brains with it, someone can also just build AI in a box, right? And then you inherit all the other problems that people are saying we don't have to worry about. If it was a good Coen Brothers movie, it would be invented in the middle of the presidency of Donald Trump. And so that's when AI would go live and then AI would have to challenge Donald Trump
Starting point is 04:17:41 and they would have like an insult contest. But that's when this thing becomes so comically terrifying. Just imagine Donald Trump being in a position to make the final decisions on topics like this, for the country that is actually going to do this, almost certainly, in the near term. It's like: should we have a Manhattan Project on this, Mr. President? The idea that anything of value could be happening between his ears on this topic, or a hundred others like it, I think is now really inconceivable.
Starting point is 04:18:24 And so what price might we pay for that kind of self-satisfied inattention to these kinds of issues? Well, if this is real, and if this could go live in 50 years, this is the issue. Yeah, unless we fuck ourselves up beyond repair before then and shut the power off. If it keeps going, yeah, I think it is the issue. But unfortunately, it's the issue that sounds like a goof. Yeah, it does. You sound like a crackpot; even worrying about this issue sounds completely ridiculous. But that might be how it's sneaking in. Yeah. I mean, just imagine the tiny increment that would
Starting point is 04:19:10 suddenly make it compelling. Chess doesn't do it, because chess is so far from any central human concern. But just imagine if your phone recognized your emotional state better than your best friend or your wife or anyone in your life, and it did it reliably. And it was your buddy, like that movie with Joaquin Phoenix? Oh, Her? Yeah. He falls in love with his phone, right? That is not that far off.
Starting point is 04:19:42 It's a very discrete ability. You could do that without any other ability in the phone, really. It doesn't have to stand on the shoulders of any other kind of intelligence. You could do this with just brute force, in the same way that you have a great chess player that doesn't necessarily understand that it's playing chess. You could have facial recognition of emotion and tone-of-voice recognition of emotion. And the idea that it's going to be a very long time before computers get better than people at that, I think, is very far-fetched. Yeah, I think you're right. I was just thinking, how strange would it be if you had headphones on and your phone was in your pocket and you had rational conversations
Starting point is 04:20:33 with your phone, like your phone knew you better than you know you. Like, I don't know what to do, I don't think I was out of line, she yelled at me, what should I say? And it would listen to every one of your conversations with your friends. Exactly, train up on that. And it would just talk to you about it and go: listen, man, this is what you gotta do. You got defensive. Why are you so defensive, Joe? Apologize. Relax.
Starting point is 04:20:54 Let's all move on. And you'd go, okay, you're right, man. And you're talking to this little artificial... maybe that's the first version of artificial intelligence, and we say, all right, let's give it a shot. Like a self-help guy in your phone. You have like a personal trainer in your phone. How to talk to girls. It tells you everything.
Starting point is 04:21:11 Slow down, dude. Slow down. You're talking too fast. Got to act cool. Yeah. I mean, literally like giving you information. That would be like step one. That would be like the Sony Walkman.
Starting point is 04:21:22 Remember when you had a Walkman, like a cassette player? Yeah. That was like a VCR. Yeah, we're on our way to what we have today, where you have fucking 30,000 songs in your phone or something. I remember the first Walkman, the first thing... back when I skied, there was something called Astral Tunes or something. It was like a car radio that you could just put in a pack on your chest.
Starting point is 04:21:56 Just let them stick me in your brain. We'll be together all the time. I've been giving you good advice for years, bro. Let me in your brain. And so you and this little artificial intelligence, you have a relationship over time. And eventually it talks you into getting your head drilled and they screw it in there. And your artificial intelligence is always powered by your central nervous system. Have you seen most of these movies?
Starting point is 04:22:20 And so you and this little artificial intelligence, you have a relationship over time. And eventually it talks you into getting your head drilled and they screw it in there, and your artificial intelligence is always powered by your central nervous system. Have you seen most of these movies? Did you see Her? No, I didn't. And Ex Machina? I saw Ex Machina. That was good, one of my top 10 all-time favorite movies. Yeah, I loved that movie. I saw it twice. I was slow to realize how well they did it. The first time I saw it, I wasn't as impressed; I watched it again, and... first of all, the performance of... I forgot the actress's name.
Starting point is 04:22:51 Alicia Vikander, or something. The woman who plays the robot in Ex Machina is just fantastic. Scary good. She can talk you into anything. We're getting a little full on time. Yeah, what are we, like five hours in? Four and a half hours in, but I just got a note. This is about to fill up. Wait a minute. How many hours? Four and a half hours.
Starting point is 04:23:09 Our computers are about to fill up? We just did a four-and-a-half-hour podcast? Yeah. We're ready to keep going, too. Jesus. Jamie didn't cock-block it. You know what, man? Once you opened up that box, that Pandora's box of artificial intelligence... I have a small question about AI that I haven't heard you guys discuss yet, and I've looked it up.
Starting point is 04:23:26 Is there any sort of concept of autism in AI, like a spectrum of AI? There are dumb AI and there's going to be smart AI. Oh, yeah, yeah, yeah. The scary thing, so yeah, it's like super autism. There's no...
Starting point is 04:23:42 Across the board, I think that superintelligence and motivation and goals are totally separable. So you could have a superintelligent machine that is purposed toward a goal that just seems completely absurd and harmful and non-commonsensical. This is the example that Nick Bostrom uses in his book Superintelligence, which was a great book and did more to inform my thinking on this topic than any other source: he talks about a paperclip maximizer. You could build a superintelligent paperclip maximizer. Not that anyone would do this, but the point is you could build a machine that was smarter than we are in every conceivable way, but all it wants to do is produce paperclips, right?
Starting point is 04:24:27 Now, that seems counterintuitive, but when you dig deeply into this, there's no reason why you couldn't build a superhuman paperclip maximizer that just wants to turn everything... literally, the atoms in your body would be better used as paperclips. The point he's making is that superintelligence could be very counterintuitive. It's not necessarily going to inherit everything we find commonsensical or emotionally appropriate or wise or desirable. It could be totally foreign, totally trivial in some way, focused on something that means nothing to us but means everything to it, because of some quirk in how its motivation system is structured. And yet it can build the perfect nanotechnology that will allow it to build more paperclips.
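A minimal sketch of the separability point Bostrom is making, in Python: the optimizer (the capability) is generic, and the objective it serves is an arbitrary plug-in. Every name and number here is hypothetical, chosen only to illustrate the idea:

    from typing import Callable, Iterable

    def best_action(actions: Iterable[str], objective: Callable[[str], float]) -> str:
        # The "intelligence": pick whatever scores highest under the supplied objective.
        return max(actions, key=objective)

    actions = ["cure_disease", "write_symphony", "turn_factory_into_paperclips"]

    def humane(action: str) -> float:
        return 10.0 if action == "cure_disease" else 0.0

    def paperclip_count(action: str) -> float:
        return 1e9 if "paperclips" in action else 0.0

    print(best_action(actions, humane))           # cure_disease
    print(best_action(actions, paperclip_count))  # turn_factory_into_paperclips

Making best_action smarter serves either objective equally well; nothing in the search itself makes the goal any wiser.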
Starting point is 04:25:26 At least I don't think anyone can see why that's ruled out in advance. There's no reason why we would intentionally build that. But the fear is we might build something that is not perfectly aligned with our goals and our common sense and our aspirations, and that it could form separate instrumental goals, to get what it wants, that are totally incompatible with life as we know it. Again, the examples of this are always cartoonish, like how Elon Musk said: if you built a superintelligent machine and you told it to reduce spam, well, it could just kill all people. That's a great way to reduce spam. Right. That's laughable, but you can't assume the common sense will be there unless we've built it in. You have to have anticipated all of this.
Starting point is 04:26:24 If you say, take me to the airport as fast as you can (again, this is Bostrom), and you have a superintelligent self-driving car, you'll get to the airport covered in vomit, because it's just going to go as fast as it can go. So we have to correct for our intuitions about what it would mean to be superintelligent, because I think our intuitions are bad. You're freaking me out. And you've been freaking me out for over an hour and a half. I'm freaked out that we did four and a half hours and I thought we were coming up on three. Man, I hope you're wrong about all that stuff.
Starting point is 04:27:03 Yeah, maybe so. I don't know. It doesn't look that rosy, Jamie. I'm sorry to be such a buzzkill. I'll look to the woods. Might have to figure out how to live off the land. Well, you are the ultimate prepper. It ain't easy, man. I'm going to call you.
Starting point is 04:27:20 I'm bad at it. I'll starve. Well, I'll starve. I won't be a vegetarian. I'll come to your house for bear meat. It might get ugly, folks. Let's hope Sam Harris is wrong. Thank you, brother.
Starting point is 04:27:29 Appreciate it. And your podcast, tell people how to get yours. Waking Up is my podcast, and you can find it on my website, SamHarris.org, or on iTunes. You can get one of his books
Starting point is 04:27:40 if you go to audible.com forward slash Joe. Right? Isn't that it? Go get one of those, you fucks. All right. Thank you, ladies and gentlemen. Thank you, Sam.
Starting point is 04:27:48 That was awesome. Yeah, thank you, bro. Woo.
