The Joe Rogan Experience - #940 - Sam Harris & Dan Harris

Episode Date: April 4, 2017

Sam Harris is a neuroscientist and author of the New York Times bestsellers The End of Faith, Letter to a Christian Nation, and The Moral Landscape. Dan Harris is a correspondent for ABC News, an anchor for Nightline, and co-anchor for the weekend edition of Good Morning America. Dan is also the founder, author, and host of the book/app/podcast "10% Happier," available on Spotify - http://www.10percenthappier.com/

Transcript
Starting point is 00:00:00 3, 2, 1. And we're live. What's up? All right. How are you, man? What's going on? Doing great. Sam Harris, ladies and gentlemen. We've got Harris and Harris. Dan and Sam. No relation, obviously.
Starting point is 00:00:13 No relation. Brother from another mother kind of thing. Oh, sweet. Actually, no, we worked this out. We are deeply unrelated because your Harris is the Jewish side of your family, right? No, no, no, no. My Harris is the... Yes, actually, no, you're right.
Starting point is 00:00:28 My Harris is the Jewish side of the family. It was changed at Ellis Island from, allegedly from Addis, which doesn't sound Jewish either, but yes. Right. And your Harris is from the Nunders. My Harris is the Goyim side of the family. Isn't it funny how many people's names were changed at Ellis Island? Yeah.
Starting point is 00:00:44 Like, where they're like, nah, not American enough. Yeah. Meanwhile, Schwarzenegger made it. Right.
Starting point is 00:00:51 Proudly. Yeah. Odd. So, anyway, thanks for coming, you guys. Thanks for having us. This is a weird time.
Starting point is 00:00:57 You know, I've been extra weirded out over the last couple months and I just got back from Mexico. I was on vacation and I didn't do shit for a week and in not doing anything for a week, I really got a chance to sit down and think about stuff. And I'm more weirded out by life today than I think I ever have been before. So I'm excited
Starting point is 00:01:14 to have you on because I want to hear your story, because Sam has been telling me about it and I looked into it. Please explain what happened to you, like where you were and what happened to you. Okay, so it was, uh, you're talking about the panic attack? Yeah. Yeah, so it was 2004. I was on a little show that we do at ABC News called Good Morning America. That's a big show. That's not a little show. It's not a little show. Uh, and I was doing the job of, I was filling in as the newsreader. That's the person who comes on at the top of each hour and reads the headlines. Right.
Starting point is 00:01:49 And I just freaked out. I just lost it. Like how? So I was a couple seconds into it, and I started to get really scared, and it just, if you've ever had a panic attack? Have you ever had a panic attack? No. So it's like anxiety on steroids.
Starting point is 00:02:05 So you start to worry, but then your fight or flight instincts kick in. So your lungs seize up, your palms start sweating, your mouth dries up, your heart is racing, your mind is racing. I couldn't breathe and therefore couldn't speak. So a couple seconds into reading what was supposed to be six stories right off of the teleprompter, I just lost the capacity to speak, and I had to kind of squeak out something about, you know, back to you, back to the main anchors. Wow. Yeah, it sucked uncontrollably. Now, what caused it?
Starting point is 00:02:40 Do you know? I do, yeah. I definitely know. Some dumb behavior in my personal life is what caused it. I had spent a lot of time as a war reporter at ABC News. I was in Afghanistan, Pakistan, Israel, the West Bank, Gaza, six trips to Iraq. And I had come home from a long run. I covered kind of the pre-invasion, invasion, and then insurgency in one kind of six-month run. And I came home after that and I got depressed.
Starting point is 00:03:13 And I didn't actually know I was depressed, but I was having some obvious symptoms in hindsight. I was having trouble getting out of bed, felt like I had a low-grade fever all the time. And then I did something really smart, which is I started to self-medicate with cocaine and ecstasy. And even though I wasn't doing it all the time, I like to say it wasn't like that, you know, you ever see the Wolf of Wall Street? Yes. Where they're popping ludes. That wasn't me. And I wasn't getting high on the air or anything like that.
Starting point is 00:03:41 But, you know, I was partying in my spare time because it made me feel better. So after I had the panic attack, I went to a doctor who's an expert in panic, and he started asking me a bunch of questions, trying to figure out what had caused the panic attack. And one of the questions was, do you do drugs? And I was like, yeah, I do drugs. And he leaned back in his chair and gave me a look that communicated the following sentiment. Okay, asshole, mystery solved.
Starting point is 00:04:08 And he just pointed out that, you know, you raise the level of adrenaline in your brain artificially, you make it much more likely to have a panic attack. And I, at baseline, I'm a jittery little dude. So it doesn't take much to put me in that zone. I mean, he just offered me coffee and I said no, because even that will freak me out. Well, it's weird that ecstasy and cocaine was the combination, because ecstasy is something that they actually give to a lot of soldiers that have PTSD, and there's been quite a few tests on that.
Starting point is 00:04:37 Yeah, I don't actually think ecstasy was the problem. The coke? I think it was the coke. Yeah, well, it makes sense. But those are the two drugs I was mostly doing. How often were you doing it? I would say, you know, there would be months where I wasn't doing it at all, because, I was off, I covered the 2004 presidential campaign and I didn't have a lot of time to be, you know, snorting coke. So, um, but when I was home and around my
Starting point is 00:05:01 friends, you know, on a good, on a busy week, you know, two, three times a week. Wow. That's a lot. Yeah. That'll do you. There's a, there's a comeback right now that cocaine is experiencing. Did it ever go away?
Starting point is 00:05:16 I don't know. I've never seen it. I've never done it. Was there some sort of cocaine recession that it had to bounce back from? I believe there was. Really? This is what this, I'm talking totally ignorantly. I've been out of the game for a long time. The cocaine game? Yeah. I'm kind of boring now. But it feels to me like
Starting point is 00:05:32 it's kind of a perennial favorite. Yeah. I don't know. I mean, I feel like it went through a recession. Maybe it's just my perception. I had a buddy of mine when I was in high school, and his cousin was hooked on coke. And I watched, while we were in high school, he started selling it, and he withered away, lost like 30 pounds or something like that. And him and his girlfriend would just hide out in the attic. They had an attic apartment; they would just hide out there and watch TV and do coke and sell coke to people. And I was like, well, fuck that drug. Like, whatever that drug's doing to these people, it was almost like knowing someone who'd gotten bitten by a vampire and become something different. It was very strange. So my experiences
Starting point is 00:06:09 seeing people do that led me to never do it. Yeah. I mean, it certainly was not like that for me, but, you know, I could see it over the horizon. It's an incredibly addictive drug. Um, so I think you made the right call. Yeah. It seems a little, it's got a little too much gravity attached to it. Yeah. I mean, you can get hooked and it will bring you down. There are other drugs you can do. I'm not recommending drugs, but, uh, the way my friend Sam, my half brother Sam over there does. Let me get to it. Yeah. Uh, there are other drugs you can do that have vastly lower addictive character. What's the word I'm looking for here, Sam? Characteristics.
Starting point is 00:06:51 Yes, thank you. So how did you recover? So I wasn't actually doing it that long. I actually had never done hard drugs until my early 30s when I came home from the war zones. And that's what started it? Yeah. You just were freaked out by seeing too much? No.
Starting point is 00:07:08 You know, it wasn't PTSD. It was I was addicted to the adrenaline. Oh, wow. It was not that I was traumatized. It was that I was enjoying it too much. And I would come home and the world would seem gray and boring. Wow. Yes.
Starting point is 00:07:23 That was the problem. Did you watch Hurt Locker? I'm sure you did, right? Yeah. Did that resonate with you? Absolutely. It's been a while since I watched it, but absolutely. I want to just be clear that the experience of a journalist is so different from the experience,
Starting point is 00:07:38 so much more mild than the experience of an enlisted man or woman. Sure. So I don't want to compare my experience to the Hurt Locker. I'm an observer on the side, and I don't even want to compare my experience to more experienced war correspondents out there. I'm thinking of guys like Richard Engel on NBC. Sebastian Junger.
Starting point is 00:07:58 Absolutely. I just actually sat down with him the other day. He's got a new documentary coming out. My experiences are much more mild than that, but certainly enough to really get a sense of how thrilling it is. There's an expression: there's nothing more thrilling than the bullet that misses you. And in my case, luckily, they all missed. That was not true for some of my friends. Um, but, you know, so I had a real sense of the stakes, but it is exciting. It's also thrilling on an idealistic level. I mean, I believe in the importance of bearing witness to the tip of the spear, to what we're doing, to what our military is doing in our name.
Starting point is 00:08:37 So all of that is a heady mix. So you knew people over there, journalists that got killed? Oh, yeah, absolutely. A very good friend of mine, the guy who actually ultimately set me up with my wife, is a guy named Bob Woodruff, who was the anchor of World News Tonight on ABC News. He had only been in the chair for about a month when he was on a trip to Iraq, and he literally got his head nearly blown off when he was in the top of an Iraqi tank. Almost died.
Starting point is 00:09:06 He's an absolute miracle he's alive. There are pictures of him on the Internet with basically half a head. Traumatic brain injury, was brought back to life, is to this day a walking miracle that he's alive. And after he recovered, he then introduced me to the woman I married. So he's a close friend, and I saw cases like that, lost friends, both Iraqi friends and journalist friends. The woman I was dating at the time when I was spending a lot of time in Iraq, she got hit by a tank shell. She was in the hotel Palestine where all the journalists were staying she was
Starting point is 00:09:47 on a balcony, and one of her colleagues on the balcony below her got a direct hit and died. She carried him to the hospital, and she basically, uh, got the reverberations and couldn't hear. Still can't hear, as far as I know. When you're a journalist and you're over in Iraq or in... Hmm. How difficult is it trying to, because I think so many people have this almost dramatic television movie slash view of war where they don't ever experience it. Like I would imagine probably 99% of the people that are in this country will never experience it. No, and that's good. Yeah. No, I don't know that you can describe it in a way that will really give the full picture of its absurdities and horrors and long stretches of boredom punctuated by terror. And also this other piece, which is taboo to talk about, which is the pleasure and excitement that people get fighting wars.
Starting point is 00:11:07 I mean, this is Sebastian Junger's thesis as well. I mean, just the camaraderie is the most intense camaraderie they ever experienced. And so when they come back to civilian life, that's something they're missing. And it's part of what's so difficult about coming back. And that's why you see a lot of risk-taking behavior among vets, because you're looking for another way to get that hit of adrenaline, for sure. There was a book, and I'm blanking on the name, it was a great book written by a much more experienced war correspondent than me. He used the phrase, war is a drug.
Starting point is 00:11:44 And that, to me, sums it up, at least in my experience. I got hooked on the experience of being in these really elevated situations, heightened situations, cinematic, dramatic situations, and I would come home and I just, I didn't know what to do to replace it. And so this synthetic squirt of adrenaline that you can get from cocaine seemed to do it for me. Obviously, it had tremendously negative consequences. And so I wouldn't recommend it. But I see why people do this. So how did you bounce back?
Starting point is 00:12:22 That was actually the question you asked me before that I somehow neglected to answer. So the doctor who pointed out that I was an idiot and doing drugs and that it caused the panic attack, uh, I agreed. He didn't think I needed to go to rehab because it was pretty short-lived. I was in my early 30s when I started and still in my early 30s when I had the panic attack, and so it was only a couple of years. He said, I want you to come see me once or twice a week forever. Forever? How convenient. He said basically indefinitely. So I still see him, but not.
Starting point is 00:12:55 Good business model. Yeah, it's a good business model. But not that. I mean, it's been well north of, it's been about 13 years. So I don't see him that often now. But for a long time I saw him intensively. So, you know, it wasn't easy. It's not easy.
Starting point is 00:13:11 And I wouldn't call, I mean, there are people who have had drug addictions that are vastly more severe than mine, but it sucks to stop a habit that is giving you pleasure in, you know, in pretty prominent areas of your brain.
Starting point is 00:13:29 Well, your situation, what you're talking about, is a very, very, very extreme situation. Like being a journalist, a war correspondent, going over there, experiencing that intense sort of adrenaline rush and then having your issues with it. But it seems like there's a tremendous amount of people today that are stimulating themselves. Um, Adderall was a big one. I mean, I've found out recently like four or five people that I didn't know were on Adderall. It seems like you just sort of start asking questions and you find out how many people. I mean, my kid goes to school with a bunch of other kids, and you get to meet the parents, and like fucking half of them are on Adderall. Like it's very weird.
Starting point is 00:14:11 And Adderall is a form of amphetamine. And it seems like it just is mind boggling how many people are doing this stuff. We're dosing ourselves with all sorts of things. So it can be stimulants, but it also can be benzos, cousins of Valium. It can be shopping, gambling. It can be whatever, except Twitter. I think it's just speaking. We have a neuroscientist in the room, so I'll let him say more about this. And also a guy who's a more experienced practitioner of Buddhism than I am. But, you know, it does speak to the nature of the human mind that we're always on the hunt for the next little hit of dopamine.
Starting point is 00:14:58 And now there are lots of ways to get it. Well, it's also very bizarre that you can just do that. I mean, I don't think there's ever been a time in history where you could just take a pill and you'll be elevated for five or six hours. I mean, and that your doctor will give you this pill and they'll encourage you to take it. And then you'll find out that 50% of the people in your community are taking it. I've done stories about parents who steal it from the kids. Also, just caffeine.
Starting point is 00:15:21 I mean, I'm reaching for the coffee here, having slept poorly last night. But that is, and that's just as much of a drug, it's just not as potent a drug as taking methamphetamine or Adderall or anything else that's a drug drug. This had civilizational consequences when humanity more or less switched from alcohol in the morning to caffeine in the morning. That's just, things got a lot different. For hundreds of years, people were just drinking ale and wine in the morning before coffee and tea became huge in Europe. And colonialism, the engine of colonialism to a significant degree was coffee, tea, sugar, and, you know, our behavior changed. People that were drinking ale and wine weren't... it wasn't a big part of it. The reason why they drank it with food is because water would get stagnant. Yeah.
Starting point is 00:16:24 Well, there's the issue with clean water, too. Yeah. Yeah.
Starting point is 00:16:31 But just imagine the consequences of you and everyone you know getting up in the morning and just starting with beer or wine. I know people like that. That's a long day or a very short one. But the fundamental point of the underlying neuroscience is that all of these drugs, anything you're putting into your body that's modifying the behavior of your brain, is only modifying the existing available neurochemistry. These drugs get your brain to secrete more of an existing neurotransmitter, or they mimic an existing neurotransmitter by binding to the same receptor site, or they keep something in play longer than it would otherwise have been: they block the reuptake of neurotransmitters or neuromodulators. So a drug is never getting your brain to do something your brain is incapable of doing, right? And that's true of even the most extreme thing, like DMT or LSD. I mean, the brain is still doing all of that.
Starting point is 00:17:39 And so it stands to reason that there are potentially other ways of getting the brain to do that, whether it's meditation or whether it's computer interface, ultimately, to the brain. People who are interested in brain-computer interface are imagining something that not only allows a quadriplegic to move a robotic arm or gets a Parkinson's patient to be able to move, but, to the ultimate degree, I mean, actually augmenting human function, or opening landscapes of mind, psychedelic and otherwise, that have been unexplored. I mean, all of that is in principle possible because, again, we're just talking about electrochemical phenomena happening in our heads, which are there to be modulated. Now, when you were talking about being depressed, and Sam, you were talking about reuptake inhibitors, I want to know: what are your thoughts on the massive amount of people that are on SSRIs now? I mean, that's another thing, how many people I know that either are on or have been on some sort of antidepressants. And it seems, I mean, to me, someone who's never taken them or doesn't have personal experience with it, massively overprescribed.
Starting point is 00:18:57 Yeah, well, anything I say is with the caveat that, I mean, I'm not a neurologist, I have zero clinical experience, and this is certainly not my area. I'm not up on the recent literature on the efficacy of antidepressants. But clearly, it's like anything: there's a spectrum. There are people who have been unambiguously helped by antidepressants. And there are people who are on them who shouldn't be on them. And there are people who are on them who want to get off them and find it surprisingly difficult to get off them. Because anything that's modulating serotonin, in this case, is effective everywhere serotonin is effective. And there's no magical property of finding the neuromodulator where only the symptoms you want to relieve are affected, because these chemicals do a lot of things in a lot of places, even in your gut. Hence the side effects you get with almost any medication. In many respects, it's a matter of luck to find a pharmacological target that actually does just what you want it to do, which is to say that those receptors are not elsewhere that are going to produce side effects
Starting point is 00:20:32 for you. So that's why a different kind of intervention, ultimately some electrical or magnetic or machine-based intervention, could be more targeted, because then you're not just putting something in the bloodstream that spreads everywhere. By machine-based, you mean something like electrodes that they put on the mind or the surface of the head to stimulate areas of the brain? Yeah, yeah. And again, what we have now is also still pretty primitive, and anything that you would have that would be super futuristic would seem to require that you put something actually inside your head, right? So whether that's neurosurgery or, um, putting something into the bloodstream that somehow gets inside your head, an injectable. So for instance, Elon Musk just mentioned something which he called neural lace, which is,
Starting point is 00:21:36 I believe, a term that came from a sci-fi novel. I don't think it originates with him. I'm not a big science fiction reader. But he announced investment in a company called Neuralink, which is looking at some advanced brain-computer interface based on the idea that you could get a... With these new microelectrodes, you can get an injectable mesh, like a wire mesh that just integrates with the brain, or very likely just the cortex. And I believe this work has already been done in mice. And the mice have survived and are living with this mesh in their brains. And again,
Starting point is 00:22:22 this is not research I'm close to at all. I mean, he just announced this a couple of weeks ago. But in principle, you're talking about having, whether it's a mesh or whether it's magnetoelectric particles, I mean, something that is on site around individual neurons or assemblages of neurons, which can both read out and input wirelessly signal from those neurons. So, you know, just both, you know, putting your thoughts into the world by, you know, influencing effectors, robotic arms or cursors on screens or whatever it is, and also influencing your mind based on whatever inputs you want to put in there from the world, whether that's- Until the Russians hack you and-
Starting point is 00:23:11 Yeah, exactly. Well, it opens all of those concerns. It's always the Russians. It's always the Russians. Chinese, there's gambling. I think the Chinese people are responsible for a lot of the propaganda that makes us think about the Russians. It's like, push it off on them.
Starting point is 00:23:24 Push it off on them. I'm prepared to blame the Russians for a lot at this propaganda that makes us think about the Russians. It's like, push it off on them. Push it off on them. I'm prepared to blame the Russians for a lot at this point. Blame them all. What a slippery slope, though, for humans becoming cyborgs. I mean, we're already like some weird form of symbiote right now carrying our cell phones like it's a baby. You know, like you leave your cell phone and all like, oh my God, I forgot the baby. I mean, it's a very strange thing that we already have.
Starting point is 00:23:44 And there's not a whole lot of steps between that and Snapchat glasses. Jamie's got the Snapchat glasses. Have you ever seen those things? Yeah, I didn't know they existed. Oh, they're very strange. They have little cameras on them. Go ahead, throw them on. He knows how to use them.
Starting point is 00:24:00 Google Glass just was totally stillborn, right? Yeah, it didn't work. So he's transmitting or he's making a video right now with that left side. See how the left side is spinning? Yeah. What does Snapchat do, like 15 seconds? 10 seconds at a time, that little counter. It's just flashing with the last three seconds.
Starting point is 00:24:16 You can hit it again, get another 20 seconds, or you can hold it and do 30 seconds or something like that. Yeah, step one. Now, is that spinning just to alert the people you're looking at that you are recording them? I see it, too. So it's going to alert me, too, to let me know that my recording is done. Probably a little bit to let you know, too, but that's probably the only notification
Starting point is 00:24:33 you know that I'm recording. And it just posts. It doesn't post automatically, and now I have to link it to my phone and then post it from there. Oh, okay. The Google Glass thing made people very uncomfortable. I mean, I tried a very early prototype.
Starting point is 00:24:45 I have a good friend of mine who was an executive at Google at the time, and she got a hold of one of the really early ones that actually had to be tethered by a cord. And, you know, you talk to it and swipe it. And I played with it a couple of times, and we used it once at a UFC weigh-in where I put it on and I broadcast from the weigh-in. It's very, very odd, but it made people very uncomfortable. Like you could see the difference when they saw you with that thing on. All of a sudden there was all this apprehension.
Starting point is 00:25:10 They're being recorded or transmitted. But how long? How long do we have where we're still people? Well, long before you asked that question, our sense of privacy, I remember what it was like to be neurotic about the sound of your voice on a voicemail or an answering machine, right? Like re-recording the outgoing message and just being worried about your voice showing up in someone else's tape. Yeah. And now we're living in this panopticon surveillance society where you just assume you're on camera virtually every moment you're in public, although I guess people don't think about it all that much. Yeah, I mean, I think the norms around privacy shift
Starting point is 00:26:01 just because we get so much value from having the data, ultimately, I guess. But it just seems also like there's just this inexorable pull towards this connection that we're going to have. It just seems like if you take where we are now and sort of look at all the data points and extrapolate, it doesn't look good. It's just... Well, but it's both. It's both bad and good. Yeah. The fake news thing is horrible, but the ability to fact check, to be able to pick up your phone and find out what's true, has also never been better.
Starting point is 00:26:39 So it's like we're both vulnerable in a way that we've never been, and we're empowered in a way that we've never been. It's always been true about technological progress. Yeah. It just seems like there's a certain amount of time we have left before we give birth to some new thing. We're talking about just integrating ourselves biologically with our machines. And also something that's independent of us. Some artificial intelligence is independent of us.
Starting point is 00:27:05 You want to look for some fear around that. You've got the right guy right here. Yeah, well, we've talked about it in depth. I've expressed those fears here, yeah. Yeah, I mean, and Josh Zeps freaked me out this morning. He sent me some articles about these self-driving trucks that are already going in Australia that are as big as a 767, and they're driving down the road by themselves with cargo, probably nuclear waste or something, you know, tooling down the road.
Starting point is 00:27:29 But people are so bad at driving that the robots just have to get reliably better than people and then you'll just feel nothing but relief. They probably already are. Well, yeah. I mean, I think they probably are, as far as I know from what I hear from Tesla. Yeah, the man hours they have of people using the autopilot, and the autopilot's not at all perfect. Obviously, two people have died already from using it badly. But still, they have something like some millions of man hours of autopilot assisted driving.
Starting point is 00:28:06 And I think that has been safer than just pure ape. Did you see the video of the guy who fell asleep in traffic in San Francisco? He's literally out cold and his car is driving him on the highway. No, no. It's kind of fantastic. I mean, he's just some guy on his way to work, just passed out completely, mouth open, and people are filming him while his car's driving on the road. Yeah, and actually probably the people filming him are doing the more dangerous thing. You're right, yeah.
Starting point is 00:28:33 Well, texting and driving drives me crazy. Oh, yeah. To be in an Uber now, you can look around, you can see how many people are texting. Yeah, including your driver sometimes. Yeah, that drives me crazy. Here's a guy. Oh, yeah. That dude's out cold. His car's creeping along and stop and go
Starting point is 00:28:52 traffic and he is completely out cold. Wow. Well, it works. The autopilot works. Yeah, no, it does work. Yeah, texting and driving scares the shit out of me. That Pokemon Go thing, thank God that died off. It stopped, yeah.
Starting point is 00:29:08 I was driving on the highway, and there was a woman to the left of us, and I noticed that her face was illuminated by her cell phone. So I look over, and she's playing Pokemon as she's driving. So I guess, I don't know how Pokemon works, but I guess you pick up things, and as you're driving, you can get stuff. And so she was playing the game while she was barely paying attention to the road, looking at her phone.
Starting point is 00:29:34 This is the argument for robot drivers. Yeah. And also it will open up when you are ultimately being driven safely by a car that you trust more than you trust yourself, then just imagine the information consumption entertainment options that open up there. You'll watch movies, you'll listen to podcasts, you'll get work done, and it'll become a space of, we're not going to miss having to pay attention to the road. Maybe some people want to drive recreationally for some reason, but
Starting point is 00:30:06 it will just be a new space where you won't believe that 40,000 people every year were dying because we couldn't figure out how to drive safely. But isn't that a slippery slope? That's a great thing that 40,000 people are not going to die.
Starting point is 00:30:23 But the idea that you're going to stop people from driving your car. You're going to have to live with these 40,000, that's a great thing, that 40,000 people are not going to die. But the idea that you're going to stop people from driving a car. You're going to have to live with these 40,000 people. That's not what I was going to say. But I was going to say, I mean, pretty much all the things that people do, and then it's going to get down to why have people? I mean, ultimately, that's the, when I look at the event horizon of artificial intelligence, it's why? Why would we, we're so flawed.
Starting point is 00:30:44 We're not going to get our shit together by the time artificial intelligence is given birth to. Well, that all goes to what we build. I mean, if we build artificial intelligence that is independent of us and seems conscious and is more powerful than us, well, then we have built a, in the limit,
Starting point is 00:31:09 we have essentially built a God that we now have to be in relationship to. And hopefully that works out well for us. And it's very easy to see how it might not. I think there are even scarier cases than that: we could build something that has godlike power, but there's no reason to think it's conscious. It's no more conscious than our current computers, which is to say that intelligence and consciousness may be separable phenomena. Intelligence can scale, but consciousness need not come along for the ride. And that, for me, is the worst case scenario, because we inherit all of the danger of the power of this system being misaligned with our interests. We could build something that's godlike in its power, and yet we could essentially be canceling the prospects of
Starting point is 00:32:00 the evolution of consciousness, because if this thing wipes us out, I think it's Nick Bostrom, the philosopher, who wrote a great book on this entitled Superintelligence. I think he calls this the Disneyland without children. Basically, we could build this incredibly powerful, intelligent landscape that continues to refine itself and its own powers in who knows what ways, perhaps ways that are unimaginable to us, and yet the lights aren't on. You know, there's nothing that it's like to be this machine or system of machines
Starting point is 00:32:38 in the way that there's probably nothing that it's like to be the Internet right now. You think of all that's going on on the Internet, you know, I don't think the Internet is conscious of any of it right now. The question is, could the Internet become conscious of what it's thinking? And I think there's no reason to think it couldn't. It's just we don't understand the physical basis of consciousness yet. The real question is, why would it do anything? I mean, if it doesn't have any of the biological motivations that people have, to breed and to stay alive and to, you know, fight or flight, and to be nervous, and this desire to carry on our genes. I mean, if you really did build the ultimate
Starting point is 00:33:13 supercomputer artificial intelligence that was beyond our capacity for reason and understanding, wouldn't it just do nothing, because everything is pointless? Well, no, but we would... We would program it. It would do whatever we asked it to do initially, right? I mean, things we program now have goals, right? Your thermostat is trying to regulate the temperature in the room. And when it gets too warm, it kicks on the air. And when it gets too cold, it kicks on the heat.
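The thermostat Sam describes is the textbook case of an explicitly programmed goal, and the whole "goal" fits in one comparison. As a rough sketch (the target temperature, hysteresis band, and mode names below are invented for illustration, not anything from the episode):

```python
def thermostat_step(temperature, target=70.0, band=2.0):
    """Bang-bang controller: the machine's entire 'goal' is this comparison.

    The hysteresis band keeps it from flapping between heating and
    cooling near the target. (Values are illustrative.)
    """
    if temperature > target + band:
        return "air"   # too warm: kick on the air
    if temperature < target - band:
        return "heat"  # too cold: kick on the heat
    return "off"       # within the band: do nothing

# The goal is explicit and fixed: nothing here can invent new subgoals,
# which is exactly the contrast with the intelligent machines discussed next.
```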
Starting point is 00:33:41 And, I mean, that's a goal, right? And so everything we build that's automated has goals explicitly programmed into it. And when you're talking about a truly intelligent machine, it will discover goals that you have never programmed into it that are intermediate to the goal that you have programmed. So if the goal of this machine is to, you know, pick up all the trash in this room, and you physically try to stop it, well, then it's going to try to get around you to pick up the rest of the trash in the room, right? So it's, you know, this is probably already true of a Roomba,
Starting point is 00:34:23 right? I actually don't have a Roomba, but if you put something in the way of the Roomba, it's going to get around the thing you have put in its way so that it can get to the rest of the room. So that's an intermediate goal. And some of these goals need never have been explicitly thought about or represented, which is to say programmed into it, and yet they're formed by the fact that the thing has a long-term goal. And one of the concerns is that we could build something that has a long-term goal, build something that's super powerful, that has a long-term goal, which in principle is benign, right? This is something we want, and yet it could discover instrumental goals that are deeply hostile to what we want. I mean, the thing doesn't have common sense, right? We haven't
Starting point is 00:35:12 figured out how to build common sense into the machine. So, I mean, there are just cartoon examples of this kind of thing, but one example that Elon used when he first was expressing fears about this is: if you built a machine, the only goal of which was to cancel spam, right? We want no more spam. Get rid of the spam. Well, an easy way to get rid of spam is just kill all the people, right? Now, that's a crazy thing to think, but unless you've closed the door to that intermediate goal, there's no reason to think that a super powerful machine couldn't form such a goal. My question would be, if this super powerful machine has the ability to create new super
Starting point is 00:35:55 powerful machines, would it use the same mandate? Would it still try to follow the original programming, or would it realize that our original programming is only instrumental to the success of the human race? And it might think the human race is ridiculous and preposterous, and why not just program something that it thinks is the ultimate intelligence, something beyond our capacity for reason and understanding now? And that thing, I would wonder, in the absence of any sort of biological motivations, I mean, if you think of all the things that we do, I mean, you break
Starting point is 00:36:30 it down to what motivates people to get out of bed, what motivates people to do good, our sense of community, the desire to breed, the social status, all these different things that motivate people to do things. Remove all of those, and what actions would it take and why? Well, I think you want to build it in a way that is focused on our well-being. And so, for instance, I had Stuart Russell on my podcast. He's a computer scientist at Berkeley who, unlike many computer scientists, takes this problem really seriously and has thought a lot about it. And in his lab, I believe they're working on a way of thinking about the safety that is open-ended and flexible without pretending we have any of the right answers in the near term or are likely to
Starting point is 00:37:26 have them. So you want to build a system that wants to know what you want, right, at each point, like so that's tracking what humanity wants in terms of its goals and wants to stay aligned with whatever it is we want. Wants to learn from what we seem to want based on our behavior. And so there could be some kind of clarifying function where we can get our priorities more aligned in dialogue with the super intelligent machine. But the bottom line is it doesn't think it knows what we want, and it continually wants to keep approximating better and better what we want. And so from my point of view, the most crucial thing is you always want the door to remain open to the statement, wait, wait, wait, that's not what I wanted, right? You want to be in the presence of this godlike superpower that will always take direction from you when you say, wait, wait, wait, that's not going in the right direction.
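A toy sketch of the kind of design Sam attributes to Stuart Russell's lab, a system that stays uncertain about what the human wants and keeps updating from observed behavior, might look like this (the goal hypotheses, action names, and likelihood numbers are all invented for illustration; Russell's actual framework, often called cooperative inverse reinforcement learning, is far more involved):

```python
def normalize(dist):
    """Rescale a dictionary of weights so they sum to 1."""
    total = sum(dist.values())
    return {k: v / total for k, v in dist.items()}

# Prior belief over candidate goals the human might have (made up for the sketch).
belief = normalize({"clean_room": 1.0, "quiet": 1.0, "save_energy": 1.0})

# How likely each observed human action is under each candidate goal.
likelihood = {
    ("turns_on_vacuum", "clean_room"): 0.8,
    ("turns_on_vacuum", "quiet"): 0.05,
    ("turns_on_vacuum", "save_energy"): 0.1,
}

def observe(belief, action, likelihood):
    """Bayesian update: human actions are evidence about the hidden goal."""
    updated = {goal: p * likelihood.get((action, goal), 0.01)
               for goal, p in belief.items()}
    return normalize(updated)

belief = observe(belief, "turns_on_vacuum", likelihood)
# A correction ("wait, that's not what I wanted") is handled the same way:
# it is just one more observation that shifts probability mass, so the
# door Sam mentions -- the ability to say "stop" -- stays open by construction.
best_guess = max(belief, key=belief.get)
```

The design choice that matters here is that the system never collapses to certainty about the goal; it only ever holds a distribution it keeps revising.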
Starting point is 00:38:38 And so the fear is we could build something that is just not amenable to being controlled in that way. Is it not the case that computer scientists are starting to come around to the idea that there's real danger, there's dangerous potential in AI? I mean, I was listening to when you talked to Will MacAskill a few weeks ago, who was saying, you know, it was just a few years from starting to think about how to split the atom to actually having a bomb. Are they not coming around to the idea that technology can progress much faster than we think? Yeah. Oh, yeah. Well, there's a whole spectrum, from the people who think that this is
Starting point is 00:39:17 never going to happen, or it's so far away that thinking about it now is completely irrational, to people who are super worried and think that huge changes are imminent. And so it's just the spectrum. I'm with the latter. Yeah, as am I. My concern is not even with the initial construction, like the initial AI. My concern is with what the AI creates. If we give the AI the ability to improve upon itself and look at our irrational thoughts and how we've programmed it to support the human race.
Starting point is 00:39:48 And then it might go, well, why the fuck would I do that? Like, you guys are ridiculous. Like, this is a new life form. This is a new, we've given birth to some incredibly potent new thing that we think of it as artificial. But, I mean, is it really? It's just a form of life. It's a form of life that we've created. Human beings have sort of, the analogy I always use is that we're some sort of an electronic caterpillar giving birth to some spectacular butterfly that we're not even aware of while we're building our cocoon.
Starting point is 00:40:18 We're just doing it. I mean, is a caterpillar fully conscious of what it's doing when it makes that cocoon? Probably not, but it just does. And there's plenty of examples of that in nature of something that's doing something that's going through some metamorphosis that's completely unconscious. My worry would be, I guess it's not even really a worry. It's more like looking at the possibility of the AI improving upon itself and making a far better version than we could create, like almost instantaneously, right? I mean, isn't that, if you give it the ability to be autonomous,
Starting point is 00:40:50 and you give it the ability to innovate and to try to figure out what's a better way around things and what's a better way to program things and then make its own version of what it is, it's going to be spectacular. I mean, it really will be, I mean, I'm obviously just talking shit, but it really would be a god. I mean, you're talking about something that, if we give it the ability to create, we give it the ability to think, reason, rationalize, and then
Starting point is 00:41:13 build. Build something better. And by build, you should be thinking more software than hardware. Right. Obviously, anything is possible; anything that can be built with intelligence, it can build. So you could be talking about armies of robots and nanotechnology and everything else that is the staple of the sci-fi scare scenario. But more likely, and certainly faster and ultimately more powerful, you're talking about something that can rewrite its own
Starting point is 00:41:45 code, improve, I mean, it's the code that is dictating the intelligence. And so you're talking about something that could be, for the longest time, invisible, and just happening on the internet, right? You're talking about code that could be put into financial markets, right, which could be built to be self-modifying, right? And then it's already out in the wild. It's not sequestered in some air-gapped computer in a lab. It's out there, and it's changing itself. Now, that would be a totally irresponsible thing to do from a software designer's point of view, I think, at this point. But there's just no question that we're going to get to a place where, I mean, it either will be the province of one lab that gets there first,
Starting point is 00:42:40 or it will be open source. But you're talking about software, right? Us figuring out how to write better software, which becomes the basis of general intelligence. And then where that gets put and what gets done with it, that's the question. Isn't it a real question of also the race to see who can come up with one first? I mean, once the idea gets put out there, like the idea of the nuclear bomb. I mean, obviously no one in their right mind thinks it's a good idea to make a nuclear bomb. That story is especially sobering because I may forget the details.
Starting point is 00:43:23 I think it was, I could have this backwards, there were two famous physicists involved. I think it was Rutherford who gave a talk saying that we're never going to unlock the energy that we now know to be in the atom. And Leo Szilard, the next day, produced the equations that unlocked it. The next day? And in direct response to this announcement, like, okay, that's bullshit. And the next morning, he woke up and produced the math that gave us the atomic bomb. So it can happen really fast.
Starting point is 00:44:08 And if you want to get those details exactly right, listen to what Stuart Russell said on my podcast. Well, Oppenheimer, in a really ironic twist, wasn't he a Buddhist? Was he? No, he was a fan of Hinduism, technically. He taught himself Sanskrit, apparently, in three months. One never knows how much this is exaggerated. To what end? His publicist will tell you that he taught himself Sanskrit in three months to read the Bhagavad Gita, which is one of the texts of Hinduism.
Starting point is 00:44:44 The quote that he gave out as the first bomb was tested. Yeah, I have become death, destroyer of worlds. Yeah, what the fuck? And then when you hear him say it, it's even more creepy. Because you can see the sort of remorse. Have you seen that? No. Also, the photos, get the photos.
Starting point is 00:45:01 Every photo of Oppenheimer. He looks tortured. He just looks haunted. Yeah. Well, he was hanging out. It. I mean, every photo of Oppenheimer. He looks tortured. He just looks haunted. Yeah. Well, he was hanging out. It's weird to see him with this general. I was watching some documentary on the creation of the atom bomb, and he was hanging out with these generals.
Starting point is 00:45:15 Probably Curtis LeMay. Yeah. And you see the two of them together, and you're like, what a bizarre pairing. Like this one monkey needs this other genius to make this bomb so you can drop it on these people. And the guy realizes that if he doesn't make it, someone's going to make it, and it could be dropped on them, you know. Well, and that's the thing. We were genuinely in a race condition there, and we didn't know how close the Nazis were. It turns out that they weren't as close as we feared. But yeah, just imagine Hitler having
Starting point is 00:45:45 gotten there first, right, with the help of Heisenberg and others. Play it, Jamie. We knew the world would not be the same. A few people laughed. A few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita. Vishnu is trying to persuade the prince that he should do his duty, and to impress him, takes on his multi-armed form and says, now I am become Death, the destroyer of worlds. I suppose we all thought that, one way or another. He does not look like a happy dude.
Starting point is 00:46:46 What a burden. Yeah. But it's actually, when you read the history of that effort, the Manhattan Project and the Trinity Test, it is super sobering, because they moved forward in a context of real uncertainty about what was going to happen. In terms of the yield of the first bomb, there was a range, I think, of a hundredfold difference of opinion of what they were going to get once this thing went off. And there were some people who still placed some possibility on the prospect of it igniting the atmosphere
Starting point is 00:47:33 and just canceling all of life, right? Now, they had spent a lot of time to, I mean, they did something like due diligence where many of them were confident it wouldn't, but that was not beyond the realm of possibility for some of the people working on it. And so we have shown a propensity for taking possibly existential risks to develop new technology because there's a reason to develop it. And in this case, the destructive potential is so obvious because it's all destructive potential. I mean, we're building
Starting point is 00:48:12 the biggest bombs we possibly can build. And so it's not difficult to think about the danger. It's all danger, right? And these bombs now getting in the hands of the wrong people. With AI, it's so seductive because, looked at in one light, it's just all upside. I mean, there's nothing better than intelligence. There's nothing more intrinsically desirable than intelligence. And so to get more of it seems an intrinsic good. And so it takes an extra step to say, well, wait a minute, this could in fact be the most dangerous thing we've ever done. And you have to spend a lot of time fighting that ideological battle with people who just
Starting point is 00:49:01 think, no, this is just all upside. What could go wrong? You're just scaremongering. You've seen too many Terminator movies. Is that even possible? To see too many? No. Not with the first.
Starting point is 00:49:17 The first was good. The first was very good. The second was good. There's one in there that I don't think I've seen all of. There were three, or there were four? I don't know. I think I lost touch after the second. I've seen two. I haven't seen any more than two.
Starting point is 00:49:31 I don't know. I just feel like it's because of the race, because of the idea that there's a race to get to it. It seems like it's inevitable that someone actually does create it. And much like the atomic bomb, it'll probably be launched without a true understanding of what its potential is. That's my fear.
Starting point is 00:49:53 I'm terrified of it. I think about it all the time. I step back sometimes and I look at the city of Los Angeles, look at the skyline, all the lights go off. I'm like, this is all new. This has only been here for a few hundred years. There was nothing here. In 1700, there was nothing. This was nothing. Now look at it. It's all lit up, and there's a gigantic grid you see from the sky. Like, what are we looking at 300 years from now? Like, what are we looking at with all these things that we're feverishly attempting to build and create? Yeah. Well, our dependence on the net is sobering. I mean, just forget about
Starting point is 00:50:29 all of these highfalutin fears of rogue AI. We don't have a backup for the internet. I mean, the internet goes down. What happens in the real world? A lot that is very difficult to recover from. Just what happens to your money, right? What is money when there is no... Right. Or just imagine malicious code just destroying the record of money, just getting into the banking system, right? So it's like you then have to go look for the paperwork
Starting point is 00:51:03 you may or may not have in your desk to argue that you have a certain amount of money, because all those bits got scrambled. And, I mean, someone could take minutes of this podcast and then produce a conversation we've never had in voices exactly like our own. And those edits will no longer be discernible. I mean, we're basically there now, and we're almost there with video, right? Where you could just have our mouths moving in the correct way. Well, again, I go to Snapchat, these crazy Snapchat filters. I don't know if you know about these, but my daughter, pull up the one of my daughter being Abraham Lincoln
Starting point is 00:51:49 I mean, it's fucking crazy. I mean, it's really rudimentary right now, but my six-year-old loves it. She thinks it's hilarious, and she uses it all the time. Like, she's just like, can I play with your phone? And she grabs my phone, and then she starts doing these little videos. Somehow or another, their little brains sync up immediately with the technology. If I gave it to my mom, she'd be like, I don't even know what this is. What do I do? That six-year-old can figure it out like that. Check this out.
Starting point is 00:52:17 Four score and seven years ago. Oh, that's hilarious. I know you had been pooping in your pants today. Thank you. I lost my tooth. If I showed this to my daughters, you would just never hear from them again.
Starting point is 00:52:35 It would be the most captivating thing. I mean, this is obviously black and white, and she's being silly, and it's really obvious that it's fake. But man, this is a six-year-old girl who looks like Abraham Lincoln. I mean, it's mimicking the voice. You can't, like, you see her mouth, you can't discern where her mouth ends and Abraham Lincoln's face begins. Right, but here, but what's so insidious about this technology I'm talking about now is that someone can fake, I mean, it's basically what is already true of Photoshop, right? Where you can't tell, you have to be an expert to tell whether the person was really standing next to
Starting point is 00:53:10 Donald Trump at the time, because they could have just been put there with expert use of Photoshop. But now we're talking about being able to produce content where, you know, you are saying the thing that completely destroys your reputation, but it's pure fiction. It's just fake news. And so we need, I mean, I don't know what the fix for this is. I've just speculated, and this is, again, something I know very little about, but clearly something like the blockchain has to be a way of anchoring each piece of digital content. So there's like a chain of custody where you see exactly where this came from in a way that's not fakeable. And so to take your podcast as an example, if someone is producing a clip that
Starting point is 00:53:58 purports to be from your podcast where you're saying something insane, there just has to be a clear fingerprint digitally, which shows whether that came from you and Jamie or whether this came from some Macedonian hacker who just decided to screw with you. Yeah, but there's not. I know, but clearly we need that tomorrow because the technology is here to produce totally fake content, which is... I am worried about that. I'm also worried about the idea that someone anywhere is completely in charge and knows what's going on. You know, I always point to...
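The chain-of-custody fingerprint Sam is asking for can be approximated today with ordinary cryptographic tools. A minimal sketch (the key name and clip contents are invented; the key handling is deliberately simplified, and a real system would use public-key signatures so anyone could verify a clip without holding a secret):

```python
import hashlib
import hmac

# Stand-in for the publisher's secret; a real deployment would use an
# asymmetric scheme such as Ed25519 rather than a shared HMAC key.
SIGNING_KEY = b"podcast-master-key"

def fingerprint(clip_bytes, prev_fingerprint=b""):
    """Chain each clip's hash to the previous one, then sign the chain.

    The chaining gives the 'chain of custody': altering any earlier clip
    changes every later fingerprint, and the signature shows the publisher
    (whoever holds the key) produced it, rather than a hostile editor.
    """
    h = hashlib.sha256(prev_fingerprint + clip_bytes).digest()
    return hmac.new(SIGNING_KEY, h, hashlib.sha256).digest()

def verify(clip_bytes, prev_fingerprint, claimed):
    """Recompute the fingerprint and compare in constant time."""
    return hmac.compare_digest(fingerprint(clip_bytes, prev_fingerprint), claimed)

# Publisher signs two clips in sequence...
f1 = fingerprint(b"clip one audio")
f2 = fingerprint(b"clip two audio", f1)
# ...the genuine clip verifies, and a doctored clip fails.
assert verify(b"clip two audio", f1, f2)
assert not verify(b"doctored audio", f1, f2)
```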
Starting point is 00:54:34 You remember that guy that went on stage with Obama and pretended to be a sign language translator? Yeah, that was really adorable. The fact that this guy... That was South Africa or something like that? I believe it was, wasn't it? That sounds right, but... I forget where it was. But yeah, he really can find it.
Starting point is 00:54:51 I mean, that was amazing because it's just so few people read sign language that... Yeah. Wasn't it around a Mandela thing? I don't remember the details, but I remember thinking, this fucking guy is three feet away from the president. He was three feet away from the president, and you would think that they had vetted everyone out. So here's this guy talking, and that guy is just completely making things up. He has no idea how to do sign language.
Starting point is 00:55:21 And that can happen. There was another instance where a guy had gotten in an elevator with Obama. Yeah, a security guard with a gun. With a gun. Loaded gun in an elevator with Obama. Nobody screened him. It's just people aren't really on the ball. There was a bit from my last comedy special about the guy that broke into the White House.
Starting point is 00:55:41 There was a woman guarding the front door by herself. And they had shut the alarm off because it kept going off, so they said, ah, fuck it, just shut it off. And then there was a guy who was on the lawn who was supposed to be, he had canines, supposed to be guarding the lawn. He took his earpiece out to talk to his girlfriend on the cell phone. He had a backup walkie-talkie because they have a backup one, but he left that in his locker. So it's like all these steps, and this guy just hit the perfect sweet spot where he hopped
Starting point is 00:56:07 the fence, ran the whatever hundred yards plus to get to the White House, got to the door, it was unlocked, got through it. There was a girl by herself, threw her to the ground, and just ran through the White House and was in there for five, ten minutes. He had a knife, right? Yeah, he had a knife. He was basically trying to, it was basically a suicide by cop situation because they had caught
Starting point is 00:56:28 him. This is what's really hilarious. When people think that the government is watching out for you, they weren't even watching this guy. And listen to what this guy did. He got arrested. They pulled him over, I believe it was less than three months before that, with 800 rounds of ammunition. He had
Starting point is 00:56:44 two rifles, a shotgun, an axe, and a machete, and a map of Washington with a fucking X where the White House is. And they let him go. There has to be a crime there. But they weren't even watching that guy. The idea that they're watching you. But we're not built as a species for perfect vigilance. I mean, that's so ridiculous. I'm thinking about even the TSA, right? You know, we're just not built for that.
Starting point is 00:57:20 But Twitter can confound even perfect vigilance. I mean, just look what happened at the Oscars. That was just, you know, their only job is to get those envelopes right. You know, you got two people with the locked briefcases, and one guy starts tweeting and, you know, produces the wrong envelope. I mean, it's just. Is that what happened? He was tweeting? Yeah.
Starting point is 00:57:41 Yeah, he was tweeting. I mean, it's brutal. He took a photo of, I forget what actress. He then deleted the photo, but people recovered it. Yeah, he took a shot of somebody, I don't know, Charlize Theron or somebody, who was walking backstage and put it on his Twitter feed. And then the very next moment was the moment when he had to hand over the right envelope. And he just handed over the envelope for the best actress. But their only job is to get this straight. Right. And they're just safeguarding the envelopes. It's all adorable. There's something I really enjoy about the folly of people. That's one of
Starting point is 00:58:31 the main things that I worry about with AI. The things that we find cute about us being ridiculous like that, or like the sign language guy, or any of these things. But that's the thing that computers are really good at. I mean, once computers get good at this sort of thing... Yeah, but we're programming them. Right. No, but we can program them to fill in the gaps for... Our own stupidity. Yeah.
Starting point is 00:58:58 What we want to do is automate repetitive behavior and error detection that we know we're not good at. So, I mean, the moment you have a TSA screening robot that doesn't get distracted, I mean, these machines don't get distracted, right? They don't get hungry. They don't have to take bathroom breaks. They're not looking at the person who's looking at them and getting captivated by some social glitch, right?
Starting point is 00:59:24 So if you have a bomb screening robot, that's all it will do, and it will be better once it, you know, whatever the visual signature of a bomb is, when you're talking about looking for one in a bag, once computers get better at that than people, they will be just, they will be reliable in a way that people can never be. And we know that about ourselves. So we want to outsource all of that stuff. I mean, driving is the perfect example.
Starting point is 00:59:53 You don't have self-driving cars that are falling asleep or reading billboards while they're driving at 80 miles an hour. And so we want that. But the scary stuff is when it can change itself. That's a principle that would allow it to kind of escape from our control. Or we build something where we haven't anticipated the consequences and the incentives are wrong. The incentives are not aligned to make us prudent in the development of that. And an arms race is the worst case for that, because the incentives are to get to the end zone as quickly as you can, because the guy next to you is doing the same and you don't know whether he's ahead of you or behind you, right? Right. And it's a winner-take-all scenario, because the amount of wealth that will go to the winner of this race, if things work, if this person doesn't, or if this group doesn't destroy the world, is unimaginable. I mean, we're talking about a level of windfall profits that we just haven't seen in any other domain. Well, a perfect example is the creation of the atomic bomb. I mean, look at 70 years from the creation of the atomic bomb to today,
Starting point is 01:01:16 which is literally a blip in human history, just a tiny little blink of an eye. And then the United States emerges as the greatest superpower the world's ever known. And that's going to be directly attributed to those bombs that were dropped in Hiroshima and Nagasaki. I mean, that from there on, that's where everything takes off. When you look at human history, if you look at us from a thousand years from now, it's very likely that they look at that moment and they go, this is the emergence of the American empire. But this is so much bigger than that because the bombs, the power in the bomb was just the implicit threat that you might use them if you get pissed off, right?
Starting point is 01:01:52 But you can't do anything with the bombs. Here, we're talking about the resource that does everything. We're talking about intelligence. We're talking about the cure for cancer. We're talking about the best virtual reality entertainment. We're talking about everything. Everything that a human being, once you're talking about general human intelligence and beyond, you're talking about now having access to, suddenly having access to the smartest people who have ever lived,
Starting point is 01:02:24 who never sleep, who never get tired, who never get distracted, because now they're machines, and the smartest people who have never lived, right? People who are a thousand times smarter than those people and getting smarter every hour, right? And what are they going to do for you when they're your slaves, right? Because now you're talking about, because you don't have to feed these people, right? They're just working for you. How powerful would you suddenly be if you had 10,000 of the smartest people in the world working for you full-time, they don't have to be fed, they never get disgruntled, they don't need contracts, and they just want to do whatever Joe Rogan wants to do and get done, and you flip a switch and that's your company, right?
Starting point is 01:03:10 Some version of that's going to happen. The question is, is it going to happen in a way where we get this massive dislocation in wealth inequality, where all of a sudden someone's got a company and products and a business model which obviates 30% of the American workforce over the span of three months, or you have some political and economic and ethical context in which we start sharing the wealth with everyone. And this is, again, this is the best case scenario. This is when we don't destroy ourselves inadvertently by producing something that's hostile to our interests. Dan, you look terrified.
Starting point is 01:03:54 Yeah, no, this is what hanging out with Sam does. I know. Just listen. We get to a high spot. We get a pitcher of margaritas and we just watch this whole thing. We're not going to make it anyway. No one in this room is going to live forever. When I hang out with Sam, usually his wife's there, so the conversation is a little less apocalyptic. Yeah. Yeah, well, she
Starting point is 01:04:13 keeps me grounded yeah she it's actually awesome because when she's there she's like just tooling on him and um he giggles and stuff like that it's like a completely different sam than the guy who's under like being an undertaker here and telling you about how we're all going to die from AI. Yeah, she keeps me honest. But, I mean, the truth is, it's not... I mean, I'm cautiously... I mean, there's no way to stop.
Starting point is 01:04:40 I mean, there's no break to pull, right? So it's like this is... I mean, it's the inevitability you were describing before. We're moving toward this thing because intelligence is the best thing in the world. I mean, intelligence is the thing that allows us to solve
Starting point is 01:04:55 every problem we have or don't know we have. We're continually discovering new problems. And the question is, what are we going to do? Some global pandemic arrives and the challenge is, do you find a vaccine for this thing or not, right? Only intelligence solves that problem for us. We either have the data and we can't interpret it. We have to design the experiments to get the data. And so there's no question of us deciding well we've got
Starting point is 01:05:27 we're just going to stick with the intelligence we've got I mean we're just we're not going to do that we can't we can if there's other people that are working on it exactly right because there's just too much power even if
Starting point is 01:05:37 even if we decided that as an American policy you know we're not going to let China and North Korea and Singapore and Iran and Israel and all these other countries do it for us. So find a high spot right above Denver, big bucket of margaritas or the meditation that has helped both of us. Yeah, well, that definitely will help you while human beings are a real thing.
Starting point is 01:06:08 Yes, although I wonder if, you know, getting enough people meditating might improve the quality of whatever gets created in the AI community. In other words, if you have people who are a little bit more sane as a consequence of meditation than maybe the people designing
Starting point is 01:06:25 these products, rather these, I don't know if product's the right word, this stuff, then somehow the stuff is better. Well, I think it's especially relevant for the other side of that, which is when you, I mean, in the near term, we clearly have an employment problem. I mean, there are jobs that will go away that are not coming back. And this is a perennial problem where people need to find meaning in life.
Starting point is 01:06:53 But we've always had, it hasn't been such a pressing problem for thousands of years because there's always been so much to do just to survive, right? Now, in the ideal case, we'll get to a place where there's actually much less that has to be done because we have machines that do it, and we have seen the wisdom of mitigating wealth inequality to some acceptable level,
Starting point is 01:07:20 so all boats begin to rise with this tide to some degree uh see whether it's universal basic income or some so we have some mechanism to spread the wealth around but then there's the real question of what do you do with your life yeah yeah so what do you think people will do that would that's the easiest one it's not but it's not so easy because people well people are confused people those people need to get their shit together i mean that that's giving someone free time but find a hobby man i mean there's a lot of stuff to do. If you told me that I never had to work again and I would just have to find things to do all day and all my food and everything would be taken care of, that would be the easiest choice I'd have ever made in my life. I would just pursue
Starting point is 01:07:58 things. I would just learn how to speak a language. I'd learn how to play an instrument, I'd practice archery more, I'd do jiu-jitsu more. That seems to me so ridiculous. That seems to me that could be solved really easy. I feel the exact same way, but I think that the... I'm not convinced. I feel that I could fill endless swaths of free time with a number of things that I'm interested in, including but not limited to meditation or just hanging out with my two-year-old. However, I'm not sure that the vast majority of humans are like that. They just need guidance.
Starting point is 01:08:33 That seems to be the easiest thing to solve. Just let them know. Go find something fun to do. Go start running up hills, man. Go take up frisbee golf. There's shit to do. There's. There's shit to do. There's a lot of shit to do. My concern is that more and more it will take the form of being merely entertained.
Starting point is 01:08:52 So it will plug into some VR. The last scene of WALL-E. I never saw WALL-E. You said the movie WALL-E? No, I never watched it. You know, you guys, you should maybe pull this up, but in Wally, it's a dystopian future where everybody's riding around in those little jazzy mobile wheelchairs that people sit in now. Oh, yeah, Disneyland? Yes, and they've got huge buckets of soda and turkey legs and kind of hooked over the back of their motorized vehicle
Starting point is 01:09:27 as a monitor at the front, which is entertaining them. So it's just obese and entertained and immobile. Or mobile, but not actually ambulatory. It's Disneyland. Well, what's happened in our lifetime, the smartphone has made it virtually impossible to be bored. Like, boredom used to be a thing. Like, you'd be sitting in the waiting room of a doctor, right?
Starting point is 01:09:51 And they have crappy magazines. And then you're just sitting there. And if you didn't know how to meditate, you had to confront this sense of, I'm bored, right? Now, one thing you discover when you learn how to meditate is boredom is just an inability to pay attention to anything. And once you learn to pay attention to anything, even something seemingly boring as your breath, it suddenly becomes incredibly interesting. So focused attention is intrinsically pleasurable. But boredom is the state of kind of scattered attention looking for something that's worth paying attention to. And yet now with technology, you're never going to
Starting point is 01:10:31 be bored again. I mean, I've got, I have at least 10 years worth of reading on my phone, right? So it's like if I'm standing in line, I'm constantly pulling this thing out. Is that a bad thing? Well, it's potentially a bad thing because you, I mean, just to take this example of one interesting insight you get when you learn to meditate, it's incredibly powerful to cut through the illusion of boredom. I mean, to realize that boredom is not something. I mean, you can become interested in the feeling of boredom, and the moment you do, it bites its own tail and disappears. I mean, it's just like there is no such thing as boredom
Starting point is 01:11:14 when you're paying close attention to your experience. I think the bad thing about the hyper-stimulation that we get through our phone and all of technology is that we have lost the ability to just sit back and, for lack of a less cliched term, be. And we're just constantly stimulated. And that means we have trouble paying attention when we're holding our kid in our lap and reading him or her a book, or we find ourselves without our technology for a moment.
Starting point is 01:11:44 There was a recent study that asked people uh would you rather be alone with your thoughts or get electric shocks and a lot of people took the electric shocks and i actually think that is a fundamental problem uh in terms of not being able to get in touch with the raw and kind of powerful although obvious fact that you're alive and that you exist. Right, but don't you think those people are idiots? I mean, you don't want electric shock. I won't take an electric shock.
Starting point is 01:12:12 You're not going to take an electric shock. We're talking about children. This is the dude with the Snapchat glasses. Jamie's on the ball. I don't want those glasses either. But you know what I'm saying? He doesn't use the glasses. He just explores it because it's fascinating.
Starting point is 01:12:24 But you know what I'm talking about? I mean, we're not talking about rational people. We're talking about lowest common denominator people that would take an electric shock. You would take an electric shock? No, no, no. I'm just saying this is a study of, they're not just picking idiots. Right, but who are they asking and how are they phrasing the question? But it also doesn't have to be conscious. So, for instance, we willingly grant our attention to things which, retrospectively, we can judge produce more or less nothing but pain for us. Like what? Well, I mean, I'm continually thinking about this and rethinking about this with respect to social media. So, what's the effect on my mind of looking at my twitter ad mentions well i wondered that when you were asking people what kind of questions they should we should ask what was on this podcast you know sometimes it's incredibly useful but i was thinking what are you doing yeah you're opening yourself up to the green frogs yeah but i mean
Starting point is 01:13:21 i've had both kinds of experience i've i've had people send me articles that I would have never found otherwise. They're fascinating, super useful, and it's just like this is the perfect use of this technology. And then, again, then I get this river of green frogs and weirdness. But to the previous point, I pay attention to something for long stretches of time rather than doing something I know is good for me with my attention. I kind of spin the roulette wheel with Twitter. And rather often, there's just this kind of toxic undercurrent of mental activity that it produces in me that doesn't equip me to do anything better in my life, doesn't make me feel any better about myself or other people. If it has any net effect,
Starting point is 01:14:17 it basically grabbed a dozen dials that I can just sort of dimly conceive of in my mind and turn them all a little bit toward the negative. You know, I feel a little bit worse about myself, a little bit worse about my career, a little bit worse about people, a little bit worse about the future, a little bit worse about the fact that I just was doing this when I could have been playing with my kid or writing or thinking productive thoughts or meditating or doing anything that I know is good. So you still do this? I don't understand your
Starting point is 01:14:46 Twitter compulsion. You argue with people. I found that fascinating. Periodically. Periodically? I go for weeks without arguing with people. Well, but also you argue with people on your podcast. Well, yeah, but that's different. No, no, no, but I mean
Starting point is 01:15:02 it's just on the podcast you'll also be talking about people you're arguing with on Twitter or, so the arguing is a part of your... Well, I get trolled a lot on Twitter. I get trolled, I mean, you have guests, you have guests who come on your podcast and savage me on the podcast and then troll me endlessly on Twitter and... You mean like Abby? Abby Martin? Well, she was, no, she doesn't troll. I mean, she hates
Starting point is 01:15:26 me, but I haven't noticed her trolling me. If you got in a room with her, everybody would calm down. She just has... There's people that have radical misconceptions of who you are. I'm sure you heard the Josh Zepp's Patton Oswalt thing. Patton Oswalt... Don't listen to it.
Starting point is 01:15:41 Okay. He went off the rails. Patton went off the rails like Hannibal style. Oh, no, no. Are you thinking of Andy Kindler? Oh, yes, I am. I'm sorry. Did I say Patton Oswalt? I'm so sorry, Patton.
Starting point is 01:15:53 That's good. So sorry. How did I connect Patton Oswalt? Yeah, Patton's great. Good. Well, Patton's very smart and very reasonable. As is Andy for most of the time. Andy's kind of crazier.
Starting point is 01:16:04 I discovered this very late I did it so I kept seeing this guy I didn't know who he was I kept seeing him in my twitter feed Andy Kindler Andy Kindler and then I realized wait a minute he is actually he seems to be I mean he's an established comic he seems to be friends with people who I really respect who I don't know like you know like I mean Sarah Silverman who I don't know personally I mean we've we've you know communicated a little bit on twitter but it's like, I'm just nothing but a pure fan of Sarah. Jim Gaffigan, other big comics who I totally respect. I don't know how close he is with these people, but he basically, he has endless energy for vilifying me as a racist and
Starting point is 01:16:41 as a bigot, right? I mean, he's just, he's a madman. And I discovered, so I did a search of his Twitter feed, and he's got hundreds and hundreds of tweets where he's going after me in the most retail way. So for instance, like I tweeted out, can anyone recommend 500 seat halls in all these cities? And I think your friend, is Duncan Trussell a friend of yours? Okay, so Duncan wrote back saying, oh yeah, DM me. I've got a great place in Boston or something. And then Andy Kindler jumped into that thread talking to Duncan saying, don't help him, Duncan. He's a bigot, right? So Andy Kindler's just got endless energy for this.
Starting point is 01:17:18 And then I looked at what he was doing, and he had been doing it for years. And I wasn't aware of it. Just to you? Just to me and Bill Maher. He hates Bill Maher. But then you've got another guy on this podcast, Hunter Matz. I had him on once. Okay, so he was on here, and I actually had to go back and watch what he said here because
Starting point is 01:17:38 I had been getting so much of this on Twitter. And I mean, it's incredibly common. People are tweeting at me saying, why won't you debate Hunter? It's Matt's, right? Why won't you debate Hunter Matt? So I went back and looked at what he said here. Half of it, frankly, didn't make any sense. But his attacks on me on Twitter are the most juvenile. It's like the idea that he thinks this is a way he's going to establish a conversation with me by sending me two tweets and then sending me 400, which say, you're scared to debate me, right? Yeah. It's crazy.
Starting point is 01:18:11 I mean, it's crazy behavior. I don't know. Very, very, very, very smart guy, but taking a terrible, socially retarded approach to establishing a debate. And I've contacted Brian Callen about this and Brian Callen contacted Hunter about this and was like, what the fuck are you doing? And he continues to do it. It's crazy behavior.
Starting point is 01:18:29 He said he wouldn't do it anymore. And then he did more. I mean, every time I look, I see him somewhere in there. But then there's this guy, Mike Cernovich, who actually has effects on the real world. I mean, Mike Cernovich is, again, he's a Twitter troll. Again, one of these other guys who's challenged me to debate him, and I, you know, I mean, there's absolutely no
Starting point is 01:18:52 possibility of a profitable dialogue. You need to spend less time on Twitter. Yeah, listen to you, man. But Cernovich was just on 60 Minutes. I mean, he's like, he's in the world, and he's affecting people's perception of Trump. I'm just talking about your psyche. Oh, yeah. Well. For sure, right? Yeah, so, to come back, it's not, yeah. Well. For sure, right? Yeah.
Starting point is 01:19:06 So come back. It's not. Yeah, it's not. It's not. The nine times out of 10 looking just makes me think. I mean, it's an illusion because if you met most of these people, if you if they came up to you at a conference or at a book signing or so, you know, after a gig of yours. And you saw you had more information about them, you saw all of the crazy coming out,
Starting point is 01:19:30 you would say, I don't have to pay any attention. There's no reason to pay attention to what this guy is saying. Whereas on Twitter, everything has the same stature. So whether it's a Washington Post columnist who's tweeting at me, or some guy like Hunter, who I have no idea who he is, but he's telling me I've got something wrong. Everything has the same stature, and there's no signal-to-noise sense of what...
Starting point is 01:19:56 Well, first of all, you fucked up because you talked about them, both of them. You said Candyman five times, and now you've got a problem. You talked about him right now on this podcast. This is the first time I've ever mentioned the guy. I should have talked to you before this. I'm bringing it to you, but you created it. I didn't do it. It was his podcast with you that kicked this whole thing off.
Starting point is 01:20:16 I certainly didn't think that Hunter was going to do that, and I know for a fact that Hunter is actually a fan of yours. I think what Hunter is trying to do— He's got a strange way of showing it. I don't think it's a smart way. I think what he's trying to do... He's got a strange way of showing it. I don't think it's a smart way. I think what he's trying to do... And when I was trying to hold his feet to the fire and get some sort of a logical definition
Starting point is 01:20:32 of what you do wrong, he really didn't have anything. Yeah, but he thinks he does, and he... But did he, when I talked to him about it? No, no. And half of what he said about Dawkins and me was totally wrong. Half of what he said about Dawkins and me was totally wrong. Half of what he said about the relevant biology was wrong.
Starting point is 01:20:48 I mean, he's just, he's not, he doesn't have his shit together, but he thinks he does. And there's this, there's a level of arrogance and incivility and just kind of a lack of charity in interacting with other people's views, which is now kind of getting rebranded on the Internet as just American can-do chutzpah, right? And it's like it's given us Trump, right? It's like Trump is the ultimate example. He's like the cartoon version of, you know, a person who doesn't know anything relevant to the enterprise, who doesn't show any aptitude for civilly engaging with differences of opinion. And this thing gets, you know, amplified to the place of greatest
Starting point is 01:21:41 prominence now in human history. Everyone's on social media, or many people on social media, are playing this same game. And Cernovich is another just malignant example of this, where you have someone who's got a fairly large following. I mean, it's not as big as yours, but it's a very engaged following. I mean, this whole Trump phenomenon has shown me that a small percentage of one's audience can have like a hundred times the energy of the rest of your audience. Like whenever I went against Trump on my podcast, or this is still the case, The level of pain it causes in the feedback space is completely out of proportion to the numbers of people who are on that side of the argument. And so it's incredibly energized, but it's just this weird style of self-promotion where all you do is that you brag about yourself, you say you're the best, And you're someone who has absolutely no basis to
Starting point is 01:22:47 back up those claims. And yet an audience thrills to that level of arrogance and empty boast. It's just bizarre. Well, there's a big issue, I think, online where people find like-minded people, and then they develop these communities where they just support each other, and they just have these gigantic groups of people. I mean, not necessarily gigantic, but groups of people that find any subject. Like, for me, you know what it is? Flat Earth.
Starting point is 01:23:20 Right. I get trolled by people all day that claim I'm a sellout because I don't believe the Earth is flat. This is real. By the same token, you've got parents all over the world who've got children with rare disease, but they can connect on the Internet and bond over that and share tips and doctors and all that stuff. So it is it's actually they're both outgrowths of the same kind of phenomenon. But it can be we see the the really difficult consequences of this in our politics right now, et cetera, et cetera. I think we're talking about two different things. You think so? I'm talking about confirmation bias.
Starting point is 01:24:08 I'm talking about a bunch of people that get together and say, yeah, obviously I'm woke and the earth is flat and pay attention. But technology allows both. There's an ice wall. Yes. But they're different things. The groups of people that will find, you know, communities where maybe your child has autism and there's some sort of an issue that
Starting point is 01:24:25 can be mitigated with diet and parents have had some success with that and they could give you some enlightening information and you can communicate with each other and that's nice. It's beautiful. Some of the hashtags that people use that they find, search through, it's great. But these little communities that you bond they're not that's those those ideas that there's no confirmation bias in those ideas but there's confirmation bias in the idea that trump is the man there's confirmation bias in the idea that the earth is flat you know and if you
Starting point is 01:24:56 just huddle in those little communities and just bark the same noises that everybody else barks there's some sort of sense of community in that too yes absolutely and people love that they love being a part of a fucking team even it's a stupid team of green frogs and you don't have it you don't have to listen to other people's views and you get deeply deeply entrenched i think we're seeing we're seeing this all through our politics and media right now well you also see when you go to those people's pages which i I do often, I don't engage with people in a negative way online very often, very, very rarely. Now, do you not look at your ad mentions? I do a little bit, but you know what, man?
Starting point is 01:25:31 I just like to just go on about my day. I've found that the negative consequences that you're discussing, I can feel. It's rare that I go down the rabbit hole and I go for days without looking at ad mentions. I mean, it's more that if I publish something, if I ask for feedback, I want to see the feedback. It's not necessarily that there's anything wrong. I mean, just as your friend, I mean, I feel like there's nothing wrong. I look at my ad mentions, but... But you function in a very different space.
Starting point is 01:25:58 I mean, you're not pushing controversial stuff out there. No, definitely not. Definitely not. So I agree. I stay offline. I'm sorry, go ahead. No, I was just Definitely not. So I agree. I stay offline. I'm sorry, go ahead. No, I was just more worried about, again, back to your psyche. It feels like you need a little bit of a middle ground.
Starting point is 01:26:13 Yeah. I stay offline after UFCs for the most part. That's when I get the most crazy people, especially if there's any sort of a controversial decision. But are they criticizing you for something you said? Oh, yeah, they get fucking mad at my commentary. They'll just disagree with who I thought won. And then they're so fucking vicious about it.
Starting point is 01:26:30 I'm just like, just yell. Just yell in your own space. So you don't want to see any of that. There's too many people. You're dealing with millions and millions of people, and who knows how many of them are rational. What you're dealing with, you know, there's a bit that I had in one of my specials
Starting point is 01:26:42 where I was talking about the number of people that are stupid in the world. Like, if you get a room full of a hundred people the idea that one person isn't a complete fucking idiot of course well you're being very charitable right i'm being charitable but if you do that we're talking about 300 million people in the united states plus that's three million fucking idiots yeah so if you have three million fucking idiots and all your ad mentions, if you look at your ad mentions and three million comments are saying you're a fucking
Starting point is 01:27:09 moron, and it's just too many people. The numbers are, they're not manageable. The numbers of human beings you interact with online are not manageable. So anytime anything gets negative or insulting, I just check out. I just next, next, next.
Starting point is 01:27:26 I don't pay attention. Because you can't. But if I fucked up, and I know I fucked up, I think one of the most important things that I do is I admit that I fucked up. And I talk about it, and I apologize, and I say, look, I'm flawed, I'm human, I made a mistake. Sorry. And then just step away. Like, let the fucking chaos ensue in the comments let all the people call you a shill and a whatever let all that happen but don't let it in like you're
Starting point is 01:27:51 letting it in and then you stew on it and you have to bring up you know hunter and all these other people and that's you're getting you're letting them way too no no no i mean no perhaps i gave a a false impression of of how under my skin this has gotten. Just the fact that you talked about it at all, though, is too much. Well, no, but I think it's of huge consequence. I think it's given us Trump. Yes. This way, the fact that this style of communication is attractive to so many people, the fact that you can bluff and bluster and empty boast and lies, being caught in lies without consequence.
Starting point is 01:28:28 Right. The fact that people never admit mistakes. I mean, what you just described, you do and which I do. You know, you make a mistake. You want to hear about it. You want to correct it as quickly as possible. Right. That is the antithesis of what we're seeing now in this space.
Starting point is 01:28:44 I mean, when has Trump admitted a mistake? But isn't the difference being that neither you nor I have any desire to control anything? Like, I don't want to be the leader of anything. Well, no, but you want effective, you want good ideas to win. Sure. And you want the truth to win. You want the truth to propagate. You want facts to propagate. You want to be able to correct errors, whether they're your own or others, especially when they're consequential. And I think that we need a common ethic where lying has real consequences. So many people are trying to figure out what's the antidote to fake news.
Starting point is 01:29:18 Well, one antidote is to be caught lying has to be devastating for your career, right? Like politically, journalistically, academically, as a public intellectual. I mean, to be caught in a whopping lie will require, at minimum, some serious atonement. And historically, it has been changed. Absolutely. It still is in many spheres. We have slipped those rails just to a degree that I never thought possible. We're going light speed right into the woods.
Starting point is 01:29:52 Yeah. I mean, we're talking about people have figured out how to just use dishonesty as extra fuel. I mean, it's like, you know, whatever the nitrous oxide boost in those hacky cars, right? It's like, we can use this stuff, right? And the way you behave when you get caught, which is to say,
Starting point is 01:30:17 just go fuck yourself, just redounds to your credibility among your tribe, right? It's like, this guy is so powerful, he so fully doesn't give a shit what people think that he can catch him in a lie and just watch how he he gets out of it um and but that's that's what so what i'm when i was bringing up these guys like cernovich i mean this this is um i actually i i trolled cern i decided to troll cernovich one day and i i thought it was hilarious. And it was just nothing but fun.
Starting point is 01:30:45 So there was nothing toxic about that. He's obsessed. But the thing that bothers me is that this has real political consequences. Do you think this is a time period where we're in this sort of adolescent stage of communication online where you can get away with saying things that are dishonest and that there might be some sort of a Way to mitigate that in the future there might be some I don't think we should act like dishonesty And bluff and bluster to use a phrase you before is started somehow new to the human repertoire I get that the acceptance of it seems to be though
Starting point is 01:31:19 I think we're in a period where that is true, and I think it is aided and abetted by technology and social networks. I agree with your diagnosis on many levels. But, you know, I was having an interesting conversation with a guy that you introduced me to, Joseph Goldstein, who's an eminent meditation teacher. He's become my meditation teacher, old friend of Sam's. And we were talking about the current political situation. He used a phrase that i like when i was asked him what he thought about it he said i'm kind of slotting into geological time and i think that actually makes some sense not to say that what does that mean meaning that i'm just viewing it from a far i'm widening the lens to look at the broad scope of human history to see that, you know, over time, we've got these ups and downs.
Starting point is 01:32:11 He's getting to the top of the mountain near Denver with a big bucket of margaritas. That's exactly right. That's exactly right. They're slotting in. They're slotting in. Just look down with binoculars. Watch the bombs go off. Learn how to get water out of the ground, get a solar power generator.
Starting point is 01:32:29 And make good margaritas. Yeah. I mean, it just doesn't seem to me that this is sustainable. It feels like this is just going to be some spectacular moment in history where people were rah-rah Nixon, and now they look back and go, my God, Nixon was a fucking liar and a buffoon. But take this. This is a point I've made before. I don't think it's original with me. I think other people have made it.
Starting point is 01:32:49 But my claim is that if Trump were one-tenth as bad, he would appear much worse. Because everything he does now is appearing against a background of so many lies and so much craziness that you can barely even weight its value. But I mean, just so I just, uh, and this is one of the things I find useful about Twitter because I follow some very interesting people. And so Ann Applebaum, the Washington Post columnist, who's just awesome on Twitter, um, everyone should follow her. She just keeps hammering Trump with, with her own points and other stuff that she finds. And she just pointed out that, did anyone notice that Trump threatened a war with North Korea two days ago? It was in the Financial Times, and yet no one can talk about it because no one believes him,
Starting point is 01:33:43 that no one can talk about it because no one believes him, right? Like, it's like we have a president whose speech has now become so denuded of truth value, right, perceived truth value, that he can say, you know, if China doesn't handle North Korea, we're going to. And no one even feels like they have to ask a follow-up question on that topic because everyone assumes it's an empty bluff. I mean, let me just imagine, just to step back into the previous presidency, if Obama had said, if China doesn't handle North Korea, we will, right? That would be top of the fold. This is all we're talking about today.
Starting point is 01:34:25 Right. This is like it just comes out of a blizzard of inanity and craziness that, you know, he's he's going after Meryl Streep. He's he's lying about Obama wiretapping him. Now he's he's threatening war with North Korea. And, you know, nobody knows what to talk about. And, you know, nobody knows what to talk about. So it's like the consequence of this is we have a president who not only can he not be trusted to tell the truth, he can be trusted to lie whenever he thinks it suits his purpose. And now, so the state of mind that everyone's in, including the press, in listening to him is just, is to take potentially the most serious things in the world not seriously. And the least serious things in the world, like, you know, Meryl Streep, or what he thinks of her acting, you know, that becomes, that dominates a whole new cycle. It's very upside down. It seems to me to be quite new,
Starting point is 01:35:22 as opposed to, I mean, lies are perennial, but I feel like we're in a very different space now with the consequences of misinformation. We certainly are. Do you think that they're connected to what we were talking about before, where you said that people would rather be electrocuted than be alone with their thoughts? That we have gotten to this weird place with our society, with our civilization, where we've made things so easy. Made people so soft, so dependent upon technology. We've slotted out these paths, these predetermined patterns of behavior for people to follow, where they can just sort of plug into a philosophy, whether it's a right-wing one or a left-wing
Starting point is 01:36:01 one, with very little room for personal thought at all. Very, very little room for objective reasoning. We sort of made it easy. We babied them. I do think that it's imperative, if you want to be a good citizen, to have a varied media diet. You're not going to have a clear view of the world if all you're reading is, well, Breitbart.
Starting point is 01:36:23 Or the New York Times. Right. You know, I have nothing against the New York Times or Breitbart, but I think you need to read many things and follow many different sorts of people on Twitter, not just because you want to troll them, but because you actually want to listen to what they have to say and take it seriously. Well, the New York Times really fucked up. Where they really fucked up is where they said, after the election, that they're going to rededicate themselves to reporting the truth.
Starting point is 01:36:46 And like, what? Why did you say that? Like, I wish I was there. I wish I was in the office. Why? That just sends the wrong message? Yes. They weren't into the truth.
Starting point is 01:36:54 Yes. Well, it says they were biased. Yeah. They fucked up. I mean, like, they had an idea. The truth is they were biased. Yes. You're right.
Starting point is 01:37:01 You're right. The thing is, the enemy was so grotesque in this case that to not have been biased seemed an abdication of responsibility. I feel it myself. Like, everything I say against Trump, to a Trump person, sounds like mere partisan bias. I mean, I've got zero connection to the Democratic Party. It's like there's no partisan bias. 100% of what I want to say about Trump does not apply to some other Republican who just stands for things that I don't, you know, policies I might not like. It's a completely unique circumstance. And, yeah, so, I mean, yes, it's true that to read the New York Times for the longest time,
Starting point is 01:37:52 it was reading like the entire thing had become the opinion page on the Huffington Post or something. Yeah. I just feel like at this stage of our society, there's real consequences to the infantilization, if that's actually a word, of human beings in our culture. We've made it very easy to just go to work, get home, watch television, and just not pay attention to anything, not read anything, not really think, and then be manipulated. I mean, I think it's incredibly easy to manipulate people, especially people that are aware that they don't have a varied media diet, people that are aware that they don't have a real sense of the way things work. And it seems daunting to try to take into consideration, like, what is involved in foreign policy? What is
Starting point is 01:38:42 involved in dealing with Russia? What is involved? How do you negotiate with North Korea? Fuck, it's too much. Put it in the hands of the strongman. I think this is true on both sides of the spectrum, though. Yeah, I agree. Because I think you've got folks who slot into just a media diet where they're just hearing things on the left, and they're not curious about, or, I guess, just not curious enough to hear things from a different perspective, unless it's just some right-wing palooka who comes on and they beat them up. So I really do think, and this is going to be a self-serving argument, but I do think it places increasing importance on media outlets like the one for which I
Starting point is 01:39:25 work to be really vigilant about being seen as fair arbiters of fact. Yeah. I mean, I think the consequences have never been greater for that. And I think the reason that so many people on the right, so many Trump supporters, feel like they're right is because it has been proven that the media was biased and that they did get it all wrong. And they were absolutely wrong when it came to who was going to win. I mean, the Huffington Post had some ridiculous thing the night of the election. They said that Hillary had like a 98% chance of winning, or something crazy like that.
Starting point is 01:40:00 Well, I think, yeah, there were some polls that were bad. But, because I remember this, because I sent out a tweet which said, like, you know, bye-bye, Donald, or something like that, the day of. But when I did that, I mean, that wasn't a prediction. I mean, the polls that I was going by, that most people were going by at that point, it was like 80-20, you know, or at best 75-25 that she was going to win. Now, I mean, you roll dice for a few minutes, you realize a 20% chance comes up a lot. Absolutely. So that's not infinitesimal odds. Plus Florida. Yeah. But, well, you should tell the story about what it was like to anchor the broadcast. Yeah.
Starting point is 01:40:40 So I was anchoring the ABC News digital coverage that night. And, you know, they give you the exit polling. You're not supposed to report it publicly. But the exit polling that we were seeing before we went on... Certainly that was not the case for the folks on my set. It was that we didn't see it coming. We weren't prepared for it. Everything that we were seeing in terms of the math made it look like this was a Clinton victory, a shoo-in. It was just about tying a ribbon around it. So when the night became long, there was just confusion about what was going on. How did they get it so wrong?
Starting point is 01:41:25 You know, I think it actually goes back to what Sam was saying before, that people think when you see numbers like 70% odds that Clinton's going to win, 80% odds that Clinton's going to win, that she's definitely going to win. But there's room there for Trump to win. A lot of room. Yeah, a lot of room there for Trump to win. 20% comes up all the time. It's Russian roulette.
Starting point is 01:41:49 Yeah. Yeah. I mean, those are bad. If it's Russian roulette, those are bad odds, right? You're not going to take that, you know, not going to put a single bullet in a five-chamber gun and spin it. No, that's Deer Hunter. That's really good odds that you're going to get shot. I don't think so. It's not so much about blaming the polls as it was blaming the overall tenor of the coverage, which made it seem like Clinton was inevitable. Yeah.
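The arithmetic behind that exchange is easy to check. A quick sketch, just an illustration of the odds being discussed, not anything computed on the show:

```python
# Chance that an event with probability p happens at least once
# in n independent tries: 1 - (1 - p)^n
def at_least_once(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

# "You roll dice for a few minutes, you realize a 20% chance comes up a lot":
# a 20% shot lands at least once in ten tries almost 90% of the time.
print(f"{at_least_once(0.20, 10):.1%}")  # 89.3%

# For comparison, one pull with a single bullet in a six-chamber revolver
# is about a 16.7% chance, in the same ballpark as the election odds.
print(f"{1 / 6:.1%}")  # 16.7%
```

So 80-20 or 75-25 odds, read correctly, never implied a sure thing.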
Starting point is 01:42:11 Yeah. It just, it was so shocking. That's a hit. That's a hit I think we can and should take. Yeah. We definitely, you know, I think we weren't giving the 20 or 30% chance a serious enough look. What is your thought, as someone who covers these things, about the electoral college?
Starting point is 01:42:30 Do you think that's an antiquated idea? And, I mean, it was kind of established back when you really needed a representative, because otherwise you would have to get on a fucking horse and ride into Washington, and it would take six months. I can see it. You can make very powerful arguments that it's a deeply problematic institution. I can see the power of those arguments, for sure. There are people who make similar arguments about the United States Senate. Yeah. There was a piece that ran in the New York Times, in their Sunday Week in Review, not long after the election, making the case that the angriest people in America actually should be those who live on the coasts, because it's taxation without representation. The people who live on the coasts are paying more in taxes, but they have less representation in Washington.
Starting point is 01:43:22 Again, that's not research that I've done, but it's an interesting idea. Have you ever seen anybody present any sort of a logical argument that there really shouldn't be a president anymore? That the idea of having one alpha chimp run this whole thing seems pretty outdated. I'm just not sure that he runs the whole thing. But he's got a lot of influence. He has an enormous amount of influence. But we're seeing, Donald Trump is seeing right now the limits of presidential power.
Starting point is 01:43:49 He is. He couldn't get his... The health care bill that didn't even make it wasn't even close, in some ways, to what he promised on the campaign trail. In other words,
Starting point is 01:44:02 he couldn't get the bill that he wanted, and then he couldn't get that passed. Right. And now he's looking at having to watch his party employ the nuclear option in order to get his Supreme Court nominee seated. So I don't know. I think that the founders designed, in many ways, a really ingenious system, and we put a lot of attention on the president because it's one person who's on our TV screens or our phones all the time. But I'm not sure how much power is vested in that person.
Starting point is 01:44:34 Now, when it comes to foreign policy, it's a different kettle of fish. Well, it's enough that the EPA has been sort of hobbled. I mean, what they've done with the Environmental Protection Agency standards, especially with regard to emissions. He's rolled back emission standards. I mean, if there's anything that we should be concerned about, it's the air that we breathe. And we are moving in a direction. We're clearly moving in a direction to get away from things like coal.
Starting point is 01:44:56 We're clearly moving in a direction to get away from things like coal. And he's going the opposite way. Not only that, but what I've heard is that that's not even going to be effective because most places have moved away from coal to the point where restarting coal production is it's not even going to recharge the economy in the way that would make it a viable option in the first place. The issue of climate change is just I'll say say as a member of the media an area where i feel i'm just speaking for myself here um really one of our biggest failures um and i don't think history is going to judge us kindly and again i'll put the i'll put the blame on myself it's become it's a hard story to get people just that interested in. And especially for television, because it's a lot of sort of graphs and science,
Starting point is 01:45:49 and there's only so many pictures of polar bears you can show. And so I anguish about that, because I do think there isn't a debate. Climate change is real, almost certainly caused by humans. And for too long we fell for that in the media, where we presented it as a debate when it wasn't. Now I think we're past that, but I still don't think we're covering it enough, and as robustly as we should. Well, climate change, I think, is sort of almost abstract to people.
Starting point is 01:46:23 It's very difficult for them to wrap their head around, especially when they look at the ice core samples. And there's plenty of stuff online where you could sort of convince yourself that there's always been this rise and fall of the temperature on the earth. And in many ways, that is true. But pollution is another thing.
Starting point is 01:46:41 I think, actually, I just walked in, so I might have missed what you said there, but I think that's a crucial shift of emphasis, because there is no argument about air pollution, its consequences, its undesirability. I mean, you don't want particulates in your lungs that you don't need to have there. And the idea that we, I mean, you could completely justify a green economy on that basis alone. Just imagine if we had no pollution coming from the exhausts of all the cars out there, and there were no coal-fired power plants. We just had solar and wind and, you know, safe nuclear technology powering the grid. It would be fantastic from just a pure—
Starting point is 01:47:26 I mean, forget about the health. Obviously, lung cancer and cardiovascular disease is a huge issue there. But just aesthetically, it's so desirable. I mean, there's no argument against it. And I feel like people don't make that connection very much. Because climate change is slow moving. You know, and it could be way out in the future before... But pollution isn't.
Starting point is 01:47:52 I know. I know. But pollution is much less controversial than climate change. Well, that's why this coal thing is so disturbing. You know, reigniting this production of coal. Yeah, it's unbelievable. It's ridiculous. Yeah, I guess I have questions about whether it's even going to happen. In other words, you can roll it back, but the coal industry, there are many, many factors
Starting point is 01:48:16 to their decision making. Right. For example, if they go and do the mining, is what they mine going to actually even be consumed? So, you know, I think that even though they're making some pretty radical noises at the EPA, I'm not sure how far Pruitt can take some of this stuff, given the existing body of law, case law, that has formed around Obama's decisions. So, you know, it actually gets more complicated the closer you look at it, from what I can tell, and I won't claim to have studied it too, too closely, but it seems to get more complicated the more you look at it.
Starting point is 01:48:56 The headlines may be scarier, I guess, is my point. Well, what's pretty clear, though, is emission standards. Rolling back emission standards sends a very clear message that it's okay to pollute the air. I mean, we were moving in a direction of going towards electric cars, going towards cars that pollute the environment less. I mean, even cars that use gas. You know, Porsche has a 911 Turbo, and the emissions standards are so strict that what they've developed is a car that, when you drive it through LA, the exhaust that comes out is cleaner than the air it sucks in. Imagine that. It's an air filter.
Starting point is 01:49:34 Yeah, it really is. It's a bad air filter. But it can be done. Very expensive air filter. It can be done. We can move further and further away. Obviously, the problem with that is it's taking in polluted air. You know, that's the issue in the first place.
Starting point is 01:49:47 I mean, it's not clean. You don't want to breathe the exhaust of a Porsche. But what they've done is managed to make something so efficient that it actually emits air cleaner than the polluted air that it's sucking in. Now, if these Environmental Protection Agency standards keep getting rolled back, I mean, I don't know how far they're rolling them back, but what you said is so clear. There's nothing good about polluting the air. I mean, it's what we need to breathe. And there's options. The idea that business should take precedence over the actual environment that we need to sustain ourselves.
Starting point is 01:50:27 Well, let's not forget there are real human beings in coal country who have spent generations working in this industry and take great pride in it. And we've got to think about what we do. But the numbers here are surprising and also little reported. It's only 75,000 coal jobs we're talking about in the country. And there's something like 500,000 clean tech jobs
Starting point is 01:50:56 just in California alone. I mean, the numbers are completely out of whack. No, I think the clean tech industry offers an enormous amount of promise, but 75,000 families is not nothing. And when we're talking about— But then give them money, right? They don't want a handout. They want to work.
Starting point is 01:51:12 This goes to the question of meaning and, you know, what are we going to do? Because the precipice we're getting to is that everyone, virtually everyone, is going to be in the position of these coal miners. And that's a good thing. That's the thing. I mean, why can't they figure out that they just want to learn new languages and spend more time with their kids and play Frisbee and have fun? We need a new ethic, and a politics, that decouples a person's claim on existence from doing profitable work that someone will pay you for. Because a lot of that work is going away. I mean, we could view it as an opportunity, and it is actually something. It does dovetail with this hobby horse that you and I have been on for a while about the power of meditation
Starting point is 01:52:03 and what it can do to a human mind, and the way you view the world and your role in it, for sure. Well, what are your thoughts on universal basic income? Because, to bring it back to that, with this rise of the machines, if we do have things automated... I mean, some ridiculous number of people make their living driving cars and driving trucks.
Starting point is 01:52:23 Now, when those jobs are gone, I think it's millions of people, right? Yeah. Isn't it? No. And I think in the States, it's the most common job for white men, I think. Something like 9 million white men are driving trucks and cars. The problem with that is most people are like, fuck white men. Tired of white men. We're the patriarchy.
Starting point is 01:52:42 This is Trump's base. Yeah. Well, tired white men. But yeah, the patriarchy. This is Trump's base. Yeah. I think universal basic income... There are reasons to worry that it's not a perfect solution, because you do want to incentivize the things you would put in place. But there's just no question that, viewed as an opportunity, this is the greatest opportunity in human history. We're talking about canceling the need for dangerous, boring, repetitive work and freeing up humanity to do interesting, creative, fun things. Now, how could that be bad? Well, give us a little time and we'll show you how we can make it bad. And it'll be bad if it leads to just extraordinary wealth inequality that we don't have the political or ethical will to fix. Because if we have a culture of people who think,
Starting point is 01:53:49 I don't want any handouts, and I certainly don't want my neighbor to get any handouts, and I don't want to pay any taxes so that he can be a lazy bum. If we have this hangover from Calvinism that makes it impossible to talk creatively and reasonably about what has changed, yeah, it could be a very painful bottleneck we have to pass through until we get to something that is much better or a hell of a lot worse, depending on where the technology goes. And I think at a certain point, the wealth inequality will be obviously unsustainable.
Starting point is 01:54:28 I mean, you can't have multiple trillionaires walking around living in compounds with razor wire and just moving everywhere by private jet and then massive levels of unemployment in a society like ours. I mean, at a certain point, the richest people will realize that enough is enough. We have to spread this wealth because otherwise people are just going to show up at our compounds with their AR-15s or their pitchforks. And, you know, the society will not sustain it. I mean, you can't, there has to be some level of wealth inequality that is unsustainable,
Starting point is 01:55:11 that people will not tolerate. And you begin to look more and more like a banana republic until you become a banana republic. But now we're talking about, you know, the U.S., or the developed world, where all the wealth is. So redistribution is the endgame. But that's a toxic concept for half of the country right now. Right, the idea of the welfare state, the idea of perpetuating that and spreading it across the board. Yeah. But, so yeah, I mean, whatever the solution is for the coal miners, we should not be hostage to the idea that they need jobs, so that whatever job they were doing and are still qualified to do,
Starting point is 01:56:02 that job has to continue to exist no matter what, no matter what the environmental consequences, no matter what the health consequences, no matter how it closes the door to good things that we want. We don't do that with anything. We didn't do that with, you know, the people who are making buggy whips or anything else. Or slavery. Yeah. I mean, there's just, there's no, at a certain point we move on and we make progress and we don't let that progress get rolled back.
Starting point is 01:56:30 And when you're talking about developing technology that produces energy that doesn't have any of these negative effects, you know, whether it's global climate change or just pollution, of course we have to move in that direction. And the other thing that's crazy is that we're not talking honestly about how dirty tech is subsidized. I mean, you have the oil people say, well, solar is all subsidized, right?
Starting point is 01:57:00 This is a government handout that's giving us the solar industry. Well, one, you have to produce an argument as to why that's a bad thing. This is something we should want the government to do. The government needs to incentivize new industries that the market can't incentivize now, if they are industries that are just intrinsically good and are going to lead to the betterment of humanity. But carbon is massively subsidized. I mean, if we actually had the coal producers and the petroleum producers
Starting point is 01:57:36 pay for the consequences of carbon and pollution, it would be much more expensive than it is. So it's already subsidized. So, I mean, we need a carbon tax, clearly. The tax code should incentivize what we want to incentivize. Well, there's a ton of industries in this country that you could make that argument for. The corn industry is one. I mean, subsidizing the corn industry, when you find out that corn syrup is responsible for a huge epidemic of obesity in this country, and the amount of corn syrup that's in foods.
Starting point is 01:58:12 If you as a polluter had to pay the consequences of your pollution all the way down the line, right? You had to compensate everyone who got emphysema or lung cancer because of what you were putting into the air. Your industry would be less profitable, right? And it might not be profitable at all. And we haven't priced all of that in to any of these things, whether you're talking about the chemical industry or the cigarette industry. I mean, we're addicted to the use of these fossil fuels the same way some people are addicted to cigarettes.
Starting point is 01:58:52 I mean, the health consequences of those things, they're almost parallel in a lot of ways. Yeah. Well, and amazingly, you have the same... I mean, I think you turned me on to this documentary. What was it? Merchants of Doubt? Yes. Isn't that amazing? The same PhDs.
Starting point is 01:59:08 You've got like 10 guys who move from toxic industry to toxic industry, defending, whether it's big tobacco or the fire... I'm in that documentary. Oh, you are? Oh, you are. That's right. I interviewed a guy named Fred Singer, who was one of these people. And he's a climate change denier.
Starting point is 01:59:44 And, it's been a while, but there were some key moments where I was listing for him all the major scientific organizations that say that climate change is real and that humans are major contributors to it. And he basically just refused to accept it. Well, those shows where you have the three heads, you and then the two experts, and they yell over each other, and we'll be right back, and then you go to commercial. Those fucking things don't solve anything. Those weird moments where people are yelling at each other and you can't figure out who's right or who's wrong. I'm not a big fan of that personally.
Starting point is 02:00:04 And I think that it was especially a failure to do that with climate change, because it created the doubt. And I think that was a very successful strategy. But that's in tension with what we just said about the New York Times, because if you take a position as a journalist, if you say, okay, actually, one side of this conversation is full of shit, then you seem like you're not an honest broker of information. I think it's different to take a position. It's not controversial for me to sit here and say, if you smoke cigarettes, the science strongly suggests you have higher odds of lung cancer. Same thing with climate change, but it became politicized. So, you'll notice I've stepped out of some of the discussions that have been taking place, because it's not my role as a
Starting point is 02:00:57 journalist to come down on one side or another. But with climate change, I feel absolutely comfortable saying the vast majority of scientists believe this is real and a big, big problem. So what you're talking about with the New York Times, and I'm not going to step out and take a view on it, but what I believe you're saying with the Times is that they were pro-Clinton and anti-Trump. That's different from having a position on climate change. Well, it's not. No, because, just take the lens of climate change: you have one candidate who's denying the reality of climate change, who's claiming that climate change is a Chinese hoax. When Trump says climate change is a Chinese hoax, you have to call him... Has he really said that?
Starting point is 02:01:48 Yes. Yeah, yeah. He said it's a Chinese hoax? Yes. He should go to China and see those people that are walking around with gas masks on, because they can't breathe in Beijing. Okay, but that's the pollution argument, which I think is actually much stronger, because there's just... You can't go to Beijing and say
Starting point is 02:02:04 this is how we want the air to be, right? But I think something like 25 percent of the air pollution in California on some days is coming from China. I mean, I could have that slightly wrong. Jamie could check that out. Please do. There's some extraordinary amount of air pollution that we get from China. That's insane. I did not know that. Yeah.
Starting point is 02:02:29 Just makes its way across the ocean? Yeah. Oh, yeah. There's no wall. You know, we've got to build that wall. But at a certain point, I mean, there's some level of dishonesty and misinformation that's so egregious that if you're a journalist at all committed to being an unbiased or disinterested broker of information and a conversation, if you set up a conversation between, you know, the Sean Spicers of the world, or the
Starting point is 02:02:56 Kellyanne Conways, who have their alternative facts, and someone who's honestly talking about whatever it is, climate change... You can't split that baby. But I don't think you have to create mock, unnecessary debates around climate change. What I do think is that, with the Trump administration, it is imperative that journalists call out when things are said that aren't true. But I don't think it's constructive for mainstream journalistic organizations to have an openly hostile anti-Trump attitude, or pro-somebody-else attitude, because then that just leads to further polarization, which is exactly what we don't need. But you call him out on all the times that he's said things that are just absolutely
Starting point is 02:03:46 not true. That feels like you're anti-Trump. Yes, it can. There's no question. And I think that's an issue we're dealing with. But I think this is a time of real soul-searching in my industry, because I firmly believe there is a very powerful place in a functioning democracy for a press, for media, that people generally view as fair.
Starting point is 02:04:10 Well, it also really highlights the responsibility of getting accurate and unbiased information to people, because there's not a lot of sources of that left. I mean, when you look at Fox News, and sometimes you look at CNN, sometimes you look at MSNBC, you're like, boy, how much of this is editorialized? How much of this is opinion? You know, you need unbiased facts, and you almost need it not delivered by people. You know, the problem is, when people are delivering the news, like, when you talk to someone, you know they're educated at an Ivy League university, and they speak a certain way, they act a certain way. You almost can assume that these intellectual people, these well-read people, are going to sort of lean one way or another. I think that there is that issue. I would argue, and again, I know this is
Starting point is 02:05:03 self-serving, but I still believe it, that the three broadcast networks have actually fared quite well in what is an incredibly difficult environment right now. So you don't count Fox as a broadcast network? Fox News is a cable network. But Fox Broadcast does not have a news division the way ABC, CBS, and NBC do. So just broadcast, meaning traditional old school signal in the air that nobody uses anymore. Nobody uses them, but we refer to them within the industry as the three broadcast networks. Isn't that bizarre though? It is.
Starting point is 02:05:37 Well, some people, you know, they put up their rabbit ears and get the signal. Who the fuck does that? Cord cutters, extreme cord cutters. Jamie, you got one? It's 20 bucks. I get like 30 channels and it's's way better, and it works. There's no lagging. There's no buffering.
Starting point is 02:05:48 So you do that, and then you get like Netflix for TV shows? Yeah, I get like Hulu, and I have Sling. But when I want to watch a basketball game, like the NCAA basketball tournament last night, I had to watch it on that. You watched it through the air. It worked really good. Old school. I'm just saying it worked.
Starting point is 02:06:01 Wow, interesting. It's like having a vinyl collection, too. Well, that's the benefits. What's the viewership on an evening news broadcast now? Is it like 5 million? No, it's around 8 or 9. 8 or 9. Yeah.
Starting point is 02:06:13 So it's still a really big number. Is it up? I don't follow all of the ticks when it comes to the numbers, so I can't answer that accurately. But people have been predicting the demise of network news for decades now. I've been at ABC News for 17 years. I've been reading obituaries all throughout that time. Eight million a night on each of the networks is still a gigantic number, especially in an age of micro information, of niche broadcasting and the Internet.
Starting point is 02:06:50 It's all that. Here are the numbers right there. NBC. So ABC is in the lead. ABC is the lead in total viewers. So ABC and NBC, and this is just one week, March 27th, right around 8, and then CBS is 6.5 million. So that's a lot of people. It skews super old, though.
Starting point is 02:07:06 It does. Look at that demographic. It absolutely skews very, very old. Well, it's all people. The majority is people over 54 or under 25, but I don't think that's the case. No. There's like four guys under 25. We're looking at 25 to 54.
Starting point is 02:07:18 ABC has 1.6 million as opposed to 8 million total viewers. NBC, 7.8 million total viewers. 1.7 million, 25 to 54. And then CBS, 6.4 million total. 1.3 million, 25 to 54. It's funny. After 54, fuck you. Before 25, fuck you. That's what they concentrate on.
Starting point is 02:07:42 It is what advertisers want. It's funny. People with vitality. You can tell based on the ads. I mean, the ads you run, it's like, you know, for catheters and anti-inflammatories. It gets a little different on the morning because in the morning shows where we're also, it's closer to four or five million for ABC and NBC and a little less for CBS. The percentage of that audience that falls within the demo, as we call it, 25 to 54, is higher. And so the ads are kind of different.
Starting point is 02:08:13 But I guess my point is that you still have a really significant number of people, if you take the mornings and evenings on these broadcast networks, that are getting their news from these places, which just gets back to the polarized media atmosphere you were talking about before. I think in this atmosphere, having the networks be seen to a certain extent, to the extent possible as above the fray, actually is important for democracy. How many does Alex Jones get? What does Alex Jones get? He claims, I think, 40 million a week, but they include website hits in their big number. So it's a skewed number. I've spent some time with him.
Starting point is 02:08:52 Me too. Have you? Oh, yeah. I know him very well. Yeah, okay. I've had him on the podcast. I went down to Austin and spent a lot of time. Yeah, yeah, yeah.
Starting point is 02:08:59 How long? How long were you there for? I basically spent a day hanging around his operation. This was, I would say, in 2009. Yeah, so it was a while ago. Before the monolith that it is now. It was pretty big back then, but not what it is now. In 1999,
Starting point is 02:09:15 Alex Jones and I put on George Bush Sr. and Jr. masks, and we smoked pot out of a bong and then danced around the White, not the White House, the Capitol building in Texas for a stand-up video that I did. Yeah, I've been friends with that guy since 99. Well, yeah, I listened to your recent podcast with him, which was just... Interdimensional child molesters that are infiltrating our airwaves.
Starting point is 02:09:42 Yeah, he went deep. We got him high and drunk, and he went as deep as he's ever gone before. Do you find that he is different when he's not being recorded? Yeah, sort of. There he is. There he is, screaming and yelling. That's me and him. Oh, shit, I broke the mask. That's him singing, by the way.
Starting point is 02:10:04 Here we go He made up all this Yeah, Moloch the Owl God No, it's the He made up these lyrics This song is all him I believe he might have ad-libbed it too Give me some volume on this
Starting point is 02:10:21 That was the name of my DVD, Belly of the Beast. Live from Austin, Texas. Strap yourselves in. It's the belly of the beast. Belly of the beast. That's Alex.
Starting point is 02:10:39 He's so crazy. I mean, he's definitely different when he's not being recorded, but he is that guy, you know? It's sincere. He believes all the stuff. He believes a lot of it. I mean, whether he's correct or not, that's a different argument. Whether or not he believes it.
Starting point is 02:10:57 I was struck. I spent a full day with him, and then we had dinner with him afterwards, and we were not recorded. And I was struck by the difference in demeanor off camera and not being recorded. Yeah, there's definitely that. I'm not saying he doesn't believe it. I don't know. Right.
Starting point is 02:11:17 I'm not in his mind. He's a showman, for sure. What's really disturbing is when he gets stuff right. You know, like the World Trade Organization is a perfect example. He was one of the ones that highlighted the use of agent provocateurs. Now, what agency, who hires these people? But what they do is they take something like the WTO, which was a big embarrassment that people were protesting the WTO, and they hire people to turn this peaceful protest into a violent protest.
Starting point is 02:11:43 So these people come in, they wear ski masks, they break windows, and they light things on fire or do whatever they do that makes it violent. And then they have the cops come in and break it up because now it's no longer a peaceful protest. And so it got to the point where people were trying to show up for work. They had WTO pins on and weren't allowed through. Well, they had created a no-protest zone. And they were taking, this is all on footage on the news,
Starting point is 02:12:06 they were telling people, you have to take the pin off. You can't go through this area where you work with a WTO pin on your backpack or on your jacket. I mean, that's crazy. And he highlighted that and that use of agent provocateurs
Starting point is 02:12:23 has been documented. This is a real thing. And it's a real tactic that, again, what agency, what faction of the military, what faction of the government hires these people to do that, I don't know. But it is a real thing. And I did not know about that until Alex highlighted it on one of his videos. It's been proven. It's been proven that it's real. And so that alone is disturbing. When you stop and think about all the different things, like the plan to stage an attack on Guantanamo Bay. They were going to blow up a drone
Starting point is 02:13:05 jetliner. They're going to blame it on the Cubans. And they were trying to use this as impetus to get us to go to war with Cuba. Yeah, but how do you feel about the things that he's talking about that are a good monster? But let's talk about that first. I mean, there are things that are true. That's what gets really squirrely. What gets really squirrely is
Starting point is 02:13:21 like when you find out that there have been things, like the Gulf of Tonkin incident. There's many things that have happened where there have been false flags, where the government has conspired to lie to the American people, and people have died because of it. I think, well, I don't know a lot about Gulf of Tonkin, although I know more about it than those other examples. I mean, there are definitely cases where it's an additional interpretation to say that the government lied. I mean, so to take even something that's closer to hand,
Starting point is 02:13:53 like weapons of mass destruction as a pretext to go into Iraq, right? Now, it's one thing to say that people knowingly lied about that, that Bush and Cheney knowingly lied to the American people about that. Or they were misled by people who were knowingly lying, right? Or just everyone got it wrong. Like, it was totally plausible to everyone who was informed that he had a WMD program, and they misinterpreted whatever evidence they thought they had, and they were just wrong. Now, so that's a spectrum. I'm not claiming to know which one of those is true. I
Starting point is 02:14:30 think it's probably the last is much closer to the truth. And that explains many of these instances, but what's so corrosive about pure examples of lying is that, and we may have one case now that's just emerging in the news. I don't know if maybe the story has been clarified while we've been talking, but it now seems that Susan Rice came out. At one point, she said she knew nothing about the unmasking of Trump associates in this recent surveillance case. And now it seems that she has claimed that she actually asked to have certain names unmasked. And so this is being seized upon again, just in the last few hours, as an example of a lie, right, which seems very sinister, but as though it equalizes the two sides here, right? So let's say worst case scenario, Susan Rice lied about having some
Starting point is 02:15:32 knowledge of this investigation. That doesn't, it says something bad about Susan Rice. It says something, I mean, she has to deal with the consequences of that lie, but it doesn't exonerate all of the lying that Trump has done about everything under the sun, right? And when a mainstream outlet commits an honest error, that gets pointed to by those who want to treat the mainstream news media as just fake news, as, see, everything's the same. You're no better than somebody who's just manufacturing fake news on a laptop in his basement. And the flip side of that is when Alex Jones gets something right, it seems to make him look like a dignified journalistic enterprise analogous to The New York Times or to ABC News. And neither of those things are true. I mean, there are small lies and then there are huge lies.
Starting point is 02:16:39 There are honest mistakes committed by totally reputable organizations that are really trying to get their facts straight. And then there are just malicious propaganda outlets that are not at all trying to get their facts straight, that are trying to engineer everyone's credulity and ignorance into something that is just purely a matter of, you know... And so we lose our ability to distinguish these different projects when we say, oh, look, fake news, both sides do it. Or here's a lie. Here's Susan Rice's one lie that she told in the last 10 years, maybe, and got caught for. And we have a president who lies every time he approaches a mic or picks up his Twitter. She had other problems because she had gone on, I believe,
Starting point is 02:17:29 on the Sunday morning talk shows after Benghazi with some outdated talking points. Yes. Yeah. Susan looked like she was itching to get caught. I mean, whether this is a case of she's going to get caught. Yeah. No, I think your point is very, very important, because I think that when you have, if you have one side that lies all the time, it's imperative that the other side not lie at all. Yeah. Yeah.
Starting point is 02:17:54 Oh, and they're lying. They say, look, everybody does it. That's why it's a tense time for people in my line of work, because we're under so much scrutiny, and when we get it wrong, we really, you know, we take a lot of heat. But when we get it wrong, I think we're pretty quick to say, we got it wrong, here's the actual,
Starting point is 02:18:13 here's the truth. Do you know that Donald Trump Jr. said that Cernovich should get a Pulitzer? No. For exposing Susan Rice? Really? Yeah. Right.
Starting point is 02:18:20 Yeah. But that's why this drives me a little crazy, because it's not Cernovich. It's the fact that this has bled into the real world. Why did you lead him down the Cernovich thing? You just brought him right back in. I mean, it's huge. We're only now discovering the consequences of this.
Starting point is 02:18:41 Let's come back in five years and see if we're talking about anything that makes any sense. Well, it's all very blurry. It really is very blurry because we've never had a situation like this where we have a president that people just don't trust, to be honest. I mean, if someone lied about anything in the past, I mean, if Donald Trump got caught having his dick sucked in the White House tomorrow on film, you know, he'd be like, look, I made a mistake. It would be nothing. It would be nothing.
Starting point is 02:19:08 Photoshop. Yeah. Fake news. But, just again, to use Joseph Goldstein's phrase from before about slotting into geological time, and just looking at the broader scope of history. We've really had periods of time where we had media, the muckrakers, you know, where the media outlets didn't even pretend at times to not have an agenda. What times were those? Where we had, was it Teddy Roosevelt who called the journalists muckrakers, where basically he was saying that their job was to rake muck, to be working in filth.
Starting point is 02:19:59 American media and American politics have been filled with instances over our short history of untruths and agendas. And then we have times of relative stability. And then sometimes we slide back into more tumultuous times. I think, and maybe I'm the Pollyanna here, but I think that the Republic will survive, and that it doesn't inexorably lead to a time where truth doesn't matter. Well, it seems to open up the door for a viable alternative. It seems to open up the door for someone who comes along who is in many ways bulletproof, who is a very ethical person. It's opened a door.
Starting point is 02:20:39 I mean, now anyone can be president. You're talking about somebody to run for office. Yes. I just don't think anybody's bulletproof. Humans, I mean, with the exception of Joe Rogan. Well, definitely not bulletproof. You don't have to be bulletproof. I mean, this guy's got nothing but bullet holes in him.
Starting point is 02:20:53 He's president. No, but what Joe was saying is it opens the door to somebody who's unimpeachable. A strong, viable alternative, like someone who is ethical. I think the door is open to anyone. I mean, who couldn't be president at this point? There's no one that steps forward right now. Do you think they're waiting? I mean, like, who the fuck
Starting point is 02:21:11 First of all, you couldn't be president. There's never been an open atheist. They would attack your Twitter page. Good luck. No, I mean, I think if you're going to use a conventional political calculus, well, then, yeah, then being an atheist, having a history of any of that, would disqualify you. But Trump can't get through an hour of the day without something that would have been a scandal in some previous age of the earth coming out of his mouth.
Starting point is 02:22:08 It's just, I think, yeah, I think there's no predicting who could be president in the future. I mean, you need wealth and you need charisma on some level. You need to be able to get a tribe behind you. But I don't think you need any of the rest of what people thought you needed even a year ago. Well, what he's done by circumventing the whole system and by being independently wealthy is really kind of... But even that was a sham. I mean, what's amazing is that, I don't know how much money he actually put into his campaign, but it's just not nearly what you would have... I mean, he really only had to pretend to be that wealthy. And it was incredible what he achieved with very little campaign
Starting point is 02:22:54 You know, I mean, like him or not, you have to give him credit for an unbelievable victory that very few people saw coming. It's a popularity contest. Isn't that part of the problem? I mean... Oh, yeah, you made a popular person president of a popularity contest. I mean, that's what it is. He won. He was running against somebody who wasn't popular, so, I mean, let's put it in context. Yeah. And he was coming after eight years of a Democratic president who was controversial in many ways, and, um, it's always harder for somebody of the same party to run. So there are a lot of dynamics, larger impersonal dynamics, that were working in his favor. Yeah. But just to get back to the argument about, like,
Starting point is 02:23:36 are we in apocalyptic times or whatever. I guess, I mean, I don't know anything. But my instinct is, just as a country, we've seen times of much worse division. We had a civil war. Yeah, but it's telling that you have to go back there for a compelling example. I mean, I think this is... Well, what about in the 60s? Yeah. We had people in the streets. We had the Weather Underground. We had MLK and RFK assassinated. It was a pretty divided time. Yeah. I mean, what I'm worried about now is, I mean, I think he's truly unique.
Starting point is 02:24:16 And I think the way we're becoming unmoored from kind of the ordinary truth testing in politics, I think, is unique. I mean, the fact that we have a president who's seen it, because there's a whole feedback mechanism. Trump seems to get some of his information from InfoWars and from places like InfoWars. Oh, he goes on it all the time. But you don't think he pays a price for that? His popularity rating is not high. Oh, yeah. Well, he's paying some price. But the question is, is it going to be enough?
Starting point is 02:24:52 And what happens with the next terrorist attack? Right. I mean, so the real fear is that if we have a Reichstag fire kind of moment, right, you know, engineered by him or not. I mean, I'm not so paranoid as to think he's going to do it, but I just think it's inevitable something is going to happen. I mean, we've had 80 days, 100 days, whatever it's been of his presidency, where basically nothing has happened and it's been pure chaos, right? And the work of government's not getting done. The government's barely even staffed. And this has been a period of where nothing really has happened. But imagine a 9-11 size event, or something really goes off the rails with North Korea or China or Russia.
Starting point is 02:25:40 It will be so, I mean, the pressure, one, the pressure to normalize him as commander in chief will be much greater than it is now. And already everyone's feeling that pressure. But two, the ability to clamp down on civil liberties and just use the power of the presidency in a way that it would be very difficult for our society to recover from. I think it's just, I mean, I think it's a scary moment. And, I mean, the people who are drawing analogies between him and, you know, the '30s in Germany, I mean, I think those analogies are misleading in ways, but again, you just don't know how strange things could get with a big negative stimulus to the system. What I think we know is that we have someone in charge who is a malignantly selfish con man, right?
Starting point is 02:26:53 I think that is an objective fact. That's not a partisan thing to say. He is, I mean, that's as obvious about him as his hair. And to have that much power in the hands of somebody whose ethical compass is that unreliable or reliably bad, in my view, and I wouldn't say this of Pence, who scares me for other reasons. I mean, he's a theocrat. I wouldn't say this of most Republicans who I might disagree with, but there's just something about putting that much power in the hands of somebody who is so disconnected from facts and a reasonable concern for the well-being of the rest of humanity and the future. You know, I mean, it's just an incredible moment.
Starting point is 02:27:42 And I don't think we've ever had a president who you could look at and say that about so clearly. I mean, not even Nixon. I mean, Nixon was pathological in other ways. But Nixon was giving us the Clean Air Act, right? I mean, Nixon had some point of contact with terrestrial reality, which wasn't just about figuring out how to burnish his imaginary grandeur. So, yeah, I mean, I think we're one huge news story away from finding out how bad a president he could be. And I think that's, you know, I think he could surprise even the people who are very pessimistic. Well, he could also rise to the occasion. Yeah, I mean, look at you
Starting point is 02:28:31 I'll be objective inadvertently, man. Yeah, see that? Well, no, I mean, I also, like, you know, I would be the first to give him or the system credit for that. I mean, it's not that... if he starts doing something good, like, you know, let's say a massive infrastructure project, right, that includes building out good things, not coal jobs, but, you know, fixing bridges, I mean, that'd be fantastic, right? And it's possible he could start doing that for his own... his motives wouldn't even matter ultimately, as long as he was doing the right things. But his motives are so reliably self-involved that, I mean, what we need is a system. Someone needs to be able to play him, this kind of narcissistic glockenspiel, well enough to get him to do the things that would be good for the world.
Starting point is 02:29:35 But I just don't see, there's just too much chaos in the system for that to be a reliable game. Well, like you, they try to keep him away from social media. Yeah, well,
Starting point is 02:29:53 Remember that crazy press conference that he had just a few weeks into being president? That was unbelievable. I got text messages from friends that are Republicans. They were like, what the fuck? Yeah, but you know, it's an interesting sort of Rorschach test because there are millions of Americans who watched that press conference
Starting point is 02:30:11 and found it delightful. They thought that he was pounding on the liberal media. When you read a transcript of what actually comes out of his mouth, it is amazing.
Starting point is 02:30:32 I mean, the poverty of the passages in terms of information. And I mean, it's just, the fact that we have a president who speaks like this, I could never have foreseen that in our lifetime this was going to happen. It's just unimaginable to me. Yeah, I think a lot of people, there are just indisputably millions of Americans who find it delightful. Did you see Bill Maher's thing about Anthony Weiner, that he thinks that Anthony Weiner should run against Trump? We need our own crazy man. Yeah, because Anthony Weiner, besides his obvious addiction to sexting with people, was a very powerful politician and a very good speaker. And he had a lot of very good qualities. He was going after bankers, the same as Eliot Spitzer. Right.
Starting point is 02:31:17 Eliot Spitzer did a lot of good things. He was going after the bankers. He was going after corruption. Yeah. But that's an example, I mean, there's a flag planted that shows us how far we have gone from the normal. I mean, Eliot Spitzer destroyed himself with one instance of hypocrisy. Right. Right. Well, it was pretty bad, you know. He was Trump-level bad.
Starting point is 02:31:45 I don't know. I mean, we don't have Trump breaking the law in this way. I mean, he was with a prostitute. Well, that's a stupid law. The real problem was that he was going after prostitutes. I know. That was the real problem. That's the hypocrisy.
Starting point is 02:31:59 Yeah, the hypocrisy wasn't that he did something illegal where he paid someone for sex. That law seems to me to be so archaic and so ridiculous. You can pay someone to massage you. You can't pay someone to touch your genitals. It seems ridiculous. I guess the other piece you have to put in play there is that there are some percentage of people working in the sex trade who are not doing it voluntarily, who are coerced to one or another degree. Yes.
Starting point is 02:32:25 And that's a horror show. And isn't part of the problem with that is that it's illegal. I mean, that's the big argument is that. Well, yes, there is that argument. And no, I agree with you that consenting adults should be able to do what they want to do, but the crucial variable there is consent. And kids can't consent. Agreed. Agre consent. Right. And kids can't consent. Agreed.
Starting point is 02:32:45 Agreed. Yes. I just think that someone is going to have to be dynamic. They're going to have to be, they're going to have to engage people in a way that, I mean, they're going to have to deal with his attacks. Or maybe what people will be thirsting for after four years is actually bland.
Starting point is 02:33:09 Yeah, right? Maybe. Right? Yeah. I could do with bland right now. That would be fantastic. Bland and actually super religious would be... Ted Cruz? You'd be happy for Ted Cruz? Ted Cruz is not bland. No. But my enthusiasm for impeachment
Starting point is 02:33:24 suggests that I'm happy with Pence, right? Right. So, like, if I could find the impeachment button, I would not hesitate to press it. And Pence, given his religious commitments, you know, a few short years ago would have been among my worst nightmares. I mean, I would be talking about the rise of the Christian right and the danger of theocracy. But psychologically, he seems like a normal, predictable, solid American compared to what we have. Isn't the other question like Trump's age?
Starting point is 02:34:01 He's a 70-year-old man. People don't really live that much past 70, especially people that are overweight and people that don't exercise. He seems all too healthy to me. He seems pretty healthy. I mean, that campaign. He does. I can't believe it. He's a man of energy.
Starting point is 02:34:15 To his credit, yeah. I mean, that's an amazing thing to be able to do. How do you think he does that? It's the most punishing thing in the world. I can't even imagine what his schedule is like. It looked like he was having a good time. Seriously. It did. It did.
Starting point is 02:34:28 And in that press conference we were talking about, I think it was a 77-minute press conference, he looked like he was having a blast. Yeah. He gets really energized. Yeah, I mean, it's got to be the most punishing beatdown ever if you're not
Starting point is 02:34:43 designed for it. I mean, I can't imagine how tiring that campaign would be. I mean, you get exhausted for 30 minutes looking at your @ mentions. Exactly, yeah. No, you have to be wired differently. But, you know, he clearly is wired that way, and it's worked for him. But you need someone who's willing to submit to the punishment of running. And that's a rare person, or it's not.
Starting point is 02:35:10 And the problem is that kind of selects for things that you don't actually want in a president, or at least I wouldn't think you would want. I mean, it selects for a kind of narcissism and a sense that, that it really has to be you, right? It doesn't select for the, in a normal intellectual space,
Starting point is 02:35:31 you're constantly aware of the ways in which you are not the best guy or gal to be doing the thing, right? Like, you want to defer to experts, and, you know, Trump is "only I can fix it," right? And that worked. And so there's something of that that creeps in. Any one person's expertise is not necessarily the right piece of software to have running when it comes time to run for president. Now, I want to switch gears a little bit. It's not totally related, but it is in some ways. Headspace. Talking about mindsets and talking about, like we've brought this up, but really haven't delved into it much at all, about meditation and about how much it's affected you
Starting point is 02:36:29 and how it got you back on track. And I know that you're a big proponent of it, and I am as well, although I think I'd probably do it differently than you guys do. I'd love to hear how you do it. I use an isolation tank. Oh, really?
Starting point is 02:36:42 Like a sensory deprivation tank? Yeah, yeah. And I do a lot of yoga. Those are two big ones for me. You know, I think, um, those alone have, uh, straightened out my brain in a great way. Yoga in particular, because, um, yoga forces you to do it. You know, like, you're either doing it or you're not doing it. There's no room for distraction. You know, you're essentially forced to deal with what these poses require of you. And I think that in doing so, and having a singular focus of trying to, uh, maintain your balance and stretch and
Starting point is 02:37:16 extend and do all these different things, while also concentrating almost entirely on your breath, which is a big factor in yoga. It has remarkable brain-scrubbing attributes. Yeah. I would say, and I don't know much, before I say this, let me just ask, what are you doing in the isolation tank? What are you doing in your mind? A lot. Well, you know, I use it in a bunch of different ways.
Starting point is 02:37:40 I don't use it as much as I should, honestly. But I concentrate on, sometimes I go in there with an idea. Like I'll concentrate on material that I'm working on or maybe jujitsu techniques that I'm having problems with or some other things I'm dealing with, any sort of issues that I have. Sometimes I do that. Sometimes I just go in there and chill out and relax and breathe and concentrate. There's a lot of physical things that happen inside the tank. There's the amount of magnesium that's in the water because it's Epsom salts. It's really good for you physically.
Starting point is 02:38:12 It's very good for the muscles. It loosens you up and relaxes you and that eliminates a lot of stress. And that physical elimination of stress allows the brain to function with just less pressure and allows you to relax more, puts things in perspective better. And it also gives you this environment that's not available anywhere else on the planet, this weightless, floating, disconnected from your body environment where you don't hear anything, you don't see anything, you don't feel anything. You feel like you're weightless.
Starting point is 02:38:40 You have this sensation of flying because you're totally weightless in the dark. You open your eyes. You don't see anything. You close your eyes. It's exactly the same. The water is the same temperature as your skin, so you don't feel the water, and you're floating. I'll say this. I don't know, Cheryl, if this is going to go down.
Starting point is 02:38:58 I actually don't have any questions about the benefits of being in an isolation tank, even though I don't know too much about it. And I also, I think yoga is great, although I don't do much of it myself. I think, though, that there may actually be a difference between those two activities and meditation. Because there's a kind of, this is a highfalutin term, metacognition, sort of knowing what you know or knowing that you're thinking, that happens in the kind of mindfulness meditation of which Sam and I are proponents, that I think is a different thing, where you're seeing, and I'll just try to put this in English, when you're meditating the way we do, you're seeing how crazy you are. You're seeing you're fucking nuts. And that actually has
Starting point is 02:39:43 a real value. A systematic collision with the asshole in your head has a real value, because when the asshole offers you up a shitty suggestion in the rest of your life, which is basically its job, like, oh yeah, you should eat the 17th cookie, or say the thing that's gonna ruin the next 48 hours of your marriage, or whatever, you're better able to resist it. So what do you do? So, I mean, the basic steps of mindfulness meditation are to sit. Most people close their eyes. You bring your full attention to the feeling of your breath.
Starting point is 02:40:14 You're not thinking about your breath. You're just feeling the raw data of the physical sensations. And then the third step is the biggie, which is as soon as you try to do this, your mind's going to go bonkers. You're going to start thinking about, you know, what's for lunch? Do I need a haircut? Where do gerbils run wild? Whatever. Blah, blah, blah. You're just going to notice, oh, my mind's going crazy right now. But that noticing is the key moment, is in fact the victory.
Starting point is 02:40:39 It's interesting because this is when most people think they've failed. Oh, I can't meditate because I can't clear my mind. This is the biggest misconception about meditation. You do not need to clear your mind. That's impossible unless you're enlightened or dead. The whole goal is just to notice when you become distracted and start again. You return your attention to your breath. And you just do that a million times.
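The loop Dan describes, attend to the breath, get distracted, notice it, return, start again, can be sketched as a toy simulation. This is purely illustrative: the function name and the random "mind" are invented for the example, not anything from either app.

```python
import random

def meditation_session(breaths=100, distraction_rate=0.7, seed=1):
    """Toy model of one mindfulness sit: attend to each breath, and
    when the mind wanders, notice it and return to the breath.
    Each act of noticing is the rep that counts."""
    rng = random.Random(seed)
    noticings = 0
    for _ in range(breaths):
        mind_wandered = rng.random() < distraction_rate
        if mind_wandered:
            noticings += 1  # catching the wandering IS the practice
    return noticings
```

The point of the sketch is the one in the conversation: a "failed" sit full of distractions still racks up reps, because every return to the breath is a repetition of the skill being trained.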
Starting point is 02:41:01 And every time you catch yourself wandering and go back to your breath, it's a bicep curl for your brain. It changes your brain. And that, over time, creates the kind of metacognition I was discussing before, where you see that you're a Homo sapiens sapiens. In other words, that's how we're classified as a species: the one who thinks and knows he or she thinks. And just that knowing that you have this voice in your head, as Sam likes to joke, when he thinks about the voice in his head, he feels like he's been hijacked by the most boring person alive.
Starting point is 02:41:36 Just says the same shit over and over. A joke that I steal from him all the time. Yeah, that is enormously powerful, because then you are not held hostage by this voice. A similar thing happens in the tank. I do a form of meditation in the tank sometimes, when I go in there without an idea, like if I'm not working on material or anything else, where I just concentrate on my breath, in through the nose, out through the mouth. I just literally concentrate on the breath, and the same thing happens. That's
Starting point is 02:42:05 meditation. Well, yeah, it's not similar, it's the exact same thing. The difference is, in the tank, after a while, after about 20 minutes or so, that breaks loose to psychedelic states. Wow. Have you ever combined the tank with psychedelics? Yes. Altered state stuff? Yeah. LSD or mushrooms? Mushrooms. But the big one is edible pot.
Starting point is 02:42:30 Edible pot seems to be as strong as anything in there. Like, I eat enough pot where I'm convinced I'm going to die, and then I climb in there. Right. Every time I do it, I go, don't do that again, because I just get out, I'm terrified. What's your motivation when you're eating the pot and climbing into the tank? Let's see what happens. Okay.
Starting point is 02:42:46 Yeah, just be scared. Be terrified. Really? Yeah, because nothing ever happens. You never die. But God damn, you're just convinced that the universe is imploding around you. So it usually has the character of fear being a major part of it. It's also about not embracing the fear, not letting the fear run rampant, and just sort of relaxing
Starting point is 02:43:07 and giving in to the vulnerability, the finite nature of your existence, and just breathing and concentrating and letting the dance take place. Because one of the big misconceptions about edible pot is that edible pot is like smoking pot.
Starting point is 02:43:25 It's an entirely different process physiologically. The time course is very different too. You can stay stoned for three days. Yeah. You could fuck up and eat too many brownies and you'd be gone for a long time. That sounds miserable. It is, but it's not because you get something out of it when it's over. The process is excruciating, but when you come out of it, you just feel so happy.
Starting point is 02:43:49 Feel so happy it's over. Yeah, it's like the joke about the dude who's banging his head up against the wall. And somebody says, why are you doing that? He says, because it feels so good when I stop. In a way, but there's no physical damage. Banging your head up against the wall, you're going to hurt yourself. This is true. Yeah, I don't know.
Starting point is 02:44:04 I think it seems like there might be easier ways to get to the same wisdom. Maybe. But I think there's also creativity that gets inspired by the edible pot. There's something called 11-hydroxy-THC, a metabolite that your body produces. It's so different that most people, when they eat pot, think they've been dosed. Like, anybody who's smoked pot before, and then you give them a brownie, they think, oh my God, there was something in that. And they're convinced, because reality itself just seems like it dissolves. And especially inside the tank.
Starting point is 02:44:35 There's something about the tank environment where, in the absence of any external stimuli, your brain becomes sort of supercharged. Because when you're just sitting down and concentrating and relaxing, you're trying to focus on your thoughts, but you're still aware of your body. You're still aware of your elbows touching this desk, your butt touching the chair. There's all this stimuli coming into your senses. Whereas in the tank, there's virtually none of that. It's almost completely eliminated. There's some, but you can phase that stuff out. Like, you can still feel the water a little bit if you think about it.
Starting point is 02:45:13 You could still, sometimes you bump into the wall and you have to, like, center yourself and you have to, like, relax again and make sure you're not moving so that you don't touch things, which can kind of dissolve the experience. There are experiences in meditation where you have that same experience, where you lose your sense of the body, but that usually comes with more concentration. You have to be very concentrated on it. I feel like you would have that experience,
Starting point is 02:45:39 and it would be even more intense if you did the exact same thing that you do outside the tank, in the tank. I don't think you need any psychedelics in the tank. It's one thing I tell people when they ask me, should I get high before I do it? I'm like, no, just do it. Just do it. If you decide after a while, if you've done it three or four times, you're like, I wonder what it's like if I just take a little bit of a hit of pot and see where it takes me, there's nothing wrong with that. It's not going to hurt you, you know, if you're the type of person who enjoys marijuana or whatever. But the tank alone, by itself, just the absence of sensory input, your brain goes to a very, very different place. And as long as you can relax, as long as you don't think
Starting point is 02:46:15 too much about the fact that you're in the tank. Just concentrate entirely on your thoughts, entirely on your breath, and again, all those crazy thoughts, like where do hamsters live, all that shit, let all that stuff run wild through your mind. But I feel like in the tank at least, that gets to a certain spot and then it stops existing. And then the psychedelic state takes over. Yeah, well, it depends on what the goal is. I think there can be many different goals of meditation or quasi-spiritual practice, and they're distinct. So, I mean, the center of the bullseye for me is not suffering unnecessarily, right?
Starting point is 02:46:56 And so one thing that mindfulness gives you is, I mean, so it's compatible with every experience you can have. I mean, there is nothing in your experience that isn't an appropriate object of meditation. You just, most people start with the breath because it's just a very convenient thing to start with. But once you know how to do this particular practice, your goal is to just be clearly aware of whatever your experience is in each moment. So emotions arise, thoughts arise, sounds come in. Your attention is wide open to whatever your experience is. So nothing in principle is a distraction.
Starting point is 02:47:38 You could be meditating right next to a construction site, and the sound of the hammers is just as good an object of meditation as the breath or anything else. So everything is included. But the superpower you're after, which you actually can acquire through this practice, is to realize that virtually all of your psychological suffering, and arguably virtually all of your physical suffering, or the difference between physical pain and suffering, which are not quite the same thing,
Starting point is 02:48:17 is a matter of being lost in thought. It's a matter of thinking without knowing that you're thinking. And what mindfulness does, and really any technique of meditation ultimately should do, is teach you to break the spell of being lost in thought. You're having a conversation with yourself. You're having content, whether it's imagistic or linguistic, pour forth into consciousness every moment, and so incessantly that you don't even notice it. It's just white noise. And not only does it completely color your experience moment to moment, so if they're angry thoughts, you're angry. If they're depressed thoughts, you're depressed. If they're sad, you're sad. So you become your thoughts, but you also feel identified. You feel that you are the thinker of your thoughts. You feel like a self. And it's completely structured by this flow of
Starting point is 02:49:21 mentation every moment. And it produces everything you do. It produces all of your intentions and your goals and your actions. He said this about me, and now I'm going to say this. I mean, everything coming out of you is born of this same process. And meditation is a way of recognizing that consciousness, what you are subjectively, is this prior condition of just awareness in which everything is showing up: sounds, sensations, and thoughts. And thoughts can become just other objects of consciousness. And so, I mean, to take even a very basic example of the difference between pain and suffering,
Starting point is 02:50:08 you can feel very strong physical pain, I mean, unpleasant pain, and just be aware of it, in that the sense that it's unbearable is virtually always untrue, because in that moment, you've already borne it, right? I mean, the feeling that something's unbearable is really the fear of having to experience it in the next moment, in the future, right? If someone drives a nail into your knee, well, that sounds like it's unbearable, but every moment you're feeling it, you're bearing it, right? What you're thinking about is the last moment and the next moment, and you're thinking about, when am I going to get some relief, and what's the cure, and how badly is my knee injured? You're worried about the future continuously, and you're not noticing the automaticity of thought that is amplifying the negativity of the experience in that moment. And we all know that you can have super intense sensation, which is either pleasant or unpleasant
Starting point is 02:51:13 depending on the conceptual frame you've put around it. So, for instance, if you had this massive sense of soreness in your shoulder, you would experience it very differently if it was, A, the result of you deadlifting more than you ever had in your life, and you were proud of it, right? B, probably cancer, and you're waiting for the biopsy results, and you're worried this is the thing that's going to kill you. Or, C, you're getting Rolfed, like some deep-tissue massage, and it hurts like hell, but you actually totally understand the source of the pain,
Starting point is 02:51:56 and you know it's going to be gone the moment the guy pulls his elbow back, right? So it could be the exact same sensation in each one of those, but the conceptual frame you have around it totally dictates the level of psychological suffering, or it can dictate the total absence of psychological suffering. Now, we were talking before the podcast started about your apps, and about the amount of different meditation exercises on the apps. Like, what kind of different meditation exercises are there, if you're talking about just concentrating on mindfulness and breathing? As it turns out, you can iterate off of that basic exercise, essentially to infinity. Because, well, I don't want to get too ahead of myself, but basically the basic instructions are the ones we listed before. You're feeling your breath coming in, and then when you get lost, you start again.
Starting point is 02:52:50 But then you can add onto that. So one big thing to add on is something called mental noting. So you're breathing in and out, and you're feeling your breath, and then you get distracted by a huge wave of anger. Generally speaking, when we get hit by a wave of anger, we just inhabit the anger. We become angry. There's no buffer between the stimulus and our response to it. But there's this little technique you can do of just making a little mental note of, oh, that's anger. And that kind of objectifies the thing. It's a little bit
Starting point is 02:53:21 like pressing the picture-in-picture button on a remote control, where the story that's taken up the whole frame can be seen with some perspective. So that's just one example of the little techniques that you can add on to the basic exercise, and you can go for a long time. So as we were discussing, Sam's about to start his meditation app, which is going to be called what, Sam, Waking Up? Waking Up, yeah. And I have mine, which is called 10% Happier. Sam is going to be doing all the teaching on his app. And on my app, since I'm not a teacher, we have experts coming in, like
Starting point is 02:53:54 Joseph Goldstein, who's, again, a friend of both Sam and me. And each teacher has their own emphasis. And then you start talking about applied meditation. So how do I use it in my everyday life? How do I use it if really what I want to do right now is control my eating? So, for example, we have a course on the app that talks about using meditation to not overeat.
Starting point is 02:54:21 By the way, I'm terrible at this. But you can use your mindfulness, your ability to know what's happening in your head in any given moment without getting carried away by it, to not overeat. Notice, oh, I'm having this urge right now to eat, as I did last night, an entire bag of malted chocolate in my hotel room. But I can ride that urge and not do the thing that I know is stupid. So anyway, that's just a little taste of how you can take meditation and bring it in numerous directions. Do you guys feel competitive?
Starting point is 02:54:56 Both of your apps? No, mine's not out yet. Not yet. When his comes out and completely cannibalizes mine, then yeah. We'll see how happy he is for me. 10% happy? I'll be like negative 500% happier. What's 10% of zero?
Starting point is 02:55:13 Yeah, exactly. No, I actually think, now that I've been in this meditation app business for a little while, I don't think it's Uber. I don't think the business model is that there's just one huge app that everybody uses and maybe there's some distant second. I actually think it's a little bit more like fast food. I think there's going to be a bunch of big players, and you may switch back and forth. Does one need an app? No. No, you don't. But the thing that's useful, and it's really useful at any level
Starting point is 02:55:47 of expertise in meditation, at least this kind of meditation, is having someone guide you. Yeah. It's like a mindfulness alarm that's going off, you know, periodically over the course of 10 minutes or 20 minutes or however long you're sitting. And because distraction is just continually the problem, you're either meditating or you're distracted. You're either aware of what's happening at that moment or you're lost in thought. And that's true throughout your life. I mean, you're either hearing what I'm saying right now or you're thinking about something else and you don't know it, right?
Starting point is 02:56:23 Or you're either reading the book you're intending to read, or your mind is wandering and you're going to have to read that paragraph again. So this failure to concentrate, this failure to be able to focus on what you're intending to focus on, is just this universal problem of human consciousness. And so meditation trains that, and other benefits follow. But having a voice in your head reminding you that you're even attempting to be meditating is very powerful, even if it's your own voice. I mean, even when I'm listening to a meditation that I recorded, just my own voice reminding me that I'm supposed to be meditating works like any other voice. And so it's a feedback system that you can't really provide for yourself. Although, obviously, you can meditate without an app, and most people do.
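The "mindfulness alarm" Sam describes, a prompt going off periodically over a 10- or 20-minute sit, is at bottom just a schedule of reminder times. A minimal sketch, where the function name, session length, and interval are arbitrary illustrative parameters, not anything taken from either app:

```python
def reminder_schedule(session_minutes, interval_minutes):
    """Minute marks at which a guided-meditation app might play a
    short prompt reminding you that you're supposed to be meditating
    (excluding the start and the end-of-session bell)."""
    return list(range(interval_minutes, session_minutes, interval_minutes))

# A 20-minute sit with a prompt every 5 minutes:
# reminder_schedule(20, 5) -> [5, 10, 15]
```

A real app would attach audio playback to each mark; the schedule itself is the whole "feedback system you can't really provide for yourself."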
Starting point is 02:57:26 I've spent very little time meditating with apps. I just think they're very useful. But, you know, both of us started meditating, you well before I did, before the apps were around. So you can read a book, a good book, and learn how to meditate out of the book. Just basically remember the basic instructions and do it. But it really is useful to have an app, especially for some people. Because one of the biggest problems in meditation is this persistent fear that you're not doing it right.
Starting point is 02:57:58 And so to have a voice you trust in your ear, just reminding you of the basic instructions, which are so simple but very easy to forget, it can be very useful. I like the idea of it being like bicep curls for your mind. Yeah. I mean, you see it in the brain scans. And Sam will correct me where I run afoul of scientific accuracy here. But this simple act of sitting, trying to focus on one thing at a time, and then when you get distracted, knowing you're distracted and returning to your breath is changing your brain when you do that. You're boosting the muscles and obviously the muscles I'm using
Starting point is 02:58:47 loosely. You're boosting your focus muscle. There was a study in 2010, I think it was done at Harvard, that took people who had never meditated before and scanned their brains, and then had them do eight weeks of, I think, a half hour a day of meditation.
Starting point is 02:59:03 At the end of the eight weeks, they scanned their brains again. What they found was in the area of the brain associated with self-awareness, the gray matter grew. And in the area of the brain associated with stress, the gray matter shrank. That, to me, is pretty compelling. That is. Beautiful. All right, gentlemen, we just did three hours. Wow.
Starting point is 02:59:21 That is. Beautiful. All right, gentlemen, we just did three hours. Wow. Flew by. Yes, it did. Thank you very much. Thank you. My pleasure. It was a lot of fun.
Starting point is 02:59:25 It was fun to meet you. As always, Sam. Yeah. Yeah. All right, everybody. That's it. Go do something else. Bye.
Starting point is 02:59:31 Bye.
