Duncan Trussell Family Hour - 624: Eric Weinstein

Episode Date: July 4, 2024

Eric Weinstein is a mathematician, economist, public speaker, and podcast host who coined the term "intellectual dark web," and is very good at simultaneously freaking Duncan out with his terrifying prognostications regarding the future of humanity and invigorating Duncan with his notion that humans are much closer to becoming a galactic civilization than most people realize. Original music by Aaron Michael Goldberg and Duncan Trussell. This episode is brought to you by: Bilt - Earn points by paying rent right now when you go to joinbilt.com/DUNCAN Squarespace - Use offer code: DUNCAN to save 10% on your first site. AG1 - Visit DrinkAG1.com/Duncan for a FREE 1-year supply of vitamin D and 5 FREE travel packs with your first purchase!

Transcript
Discussion (0)
Starting point is 00:00:00 Friends, come see me do stand-up comedy. I'm going to be at Comedy on State in Madison, Wisconsin July 18th, 19th, and 20th; the Helium Comedy Club in Buffalo, New York August 8th, 9th, and 10th; Side Splitters Comedy Club August 15th, 16th, and 17th; and I'm gonna be at the Wilbur November 1st. Also, if you're in Texas, I'm going to be doing the Comedy Mothership August 2nd through 4th. I hope to see you there.
Starting point is 00:00:32 All right, strap in everybody. With us here today is an amazingly brilliant person who simultaneously freaked me out and gave me a surge of futuristic hope unlike anything I've ever had before. We've got a theoretical physicist with us here today. I'm sure you've heard of him. Eric Weinstein. Oh my God! What a great podcast. This is why I like podcasting. I got to hang out with him at the green room of the mothership. Then he agreed to do my podcast. And now he's here with you. Get ready. Get ready.
Starting point is 00:01:13 It's intense. Everybody welcome to the DTFH. Eric Weinstein, my friend. I'm so glad you're here. Every once in a while I get to have conversations where, maybe in the midst of the conversation, I know I don't understand exactly what people are saying, but I know, oh, this is gonna shake me up for a while.
Starting point is 00:01:39 And in the green room, we got lucky, you showed up at the mothership, and we were chatting about a lot of different things, but I just wanna, with the time we have, take advantage of that and start off with the conversation about AI. You- Well, before we do that, can we just say something
Starting point is 00:01:57 about the green room because people aren't having a good time. Oh yeah, sure. It's an amazing place to be with some of the most famous and brilliant comedy minds and watching them basically not joke around and just think. Yeah. You know? I mean, it's like the ultimate dorm bull session. Everybody's super open, super curious, very well informed.
Starting point is 00:02:18 I just don't think we realize that comedy isn't what we think it is. Right. It's really an essential feature of the human condition. And to have like, you know, Sid Caesar's show of shows, the writer's room, like the Mel, what am I thinking of? Who did Blazing Saddle? Mel Brooks.
Starting point is 00:02:41 You know, the Mel Brooks, Karl Reiner's energy. It was just, what a privilege to be around so many super smart people. And by the way, one of the most diverse groups of men where there are barely any women. Muslims, Jews, gays, straights, et cetera, but there were very few women. It's a flaw in the green room.
Starting point is 00:02:59 I know. But also to have mixed in with us, somebody like you, which, you know, because all of us, like, you know, one thing I really dislike is when comedians are called philosophers. That really bothers me, having like a brief encounter with philosophy in school and really like trying to
Starting point is 00:03:25 You're insulted by being philosophers. Well I'm insu- I think like you know I'm like for whatever insane reason listening to a Schopenhauer audiobook which I would not recommend but like when you hear the I think there's some similarity in philosophy to physics in the sense that every sentence is in philosophy to physics in the sense that every sentence is thought out and is part of a big equation that's trying to prove some point that they're making. And the reason that I don't like the label philosopher for comedians is because I think philosophers really lean into having some coherent system that is flawless and fallible.
Starting point is 00:04:07 Whereas as comedians, we just like to let our minds go wherever they wanna go, like a truffle pick. And hopefully it will dig up something funny, but to do that you have to temporarily believe in things that maybe later you don't believe in at all. But to really go deep into things, you have to put yourself into the mindset. So this is why I think we're very different
Starting point is 00:04:28 from philosophers. Well, of course, I think like Stephen Wright makes the point of this is what comedians trying to do philosophy would sound like. Yes, exactly. And to liberate yourself from labels like that as a comedian is really important. It's really important.
Starting point is 00:04:44 But to have somebody introducing ideas to our truffle pigs that are then gonna take those ideas and like see if there's some way to turn it into a joke. I would, you know, that's like a really interesting idea. What would happen if you, instead of having science comedians who are mostly scientists who tell a few jokes, what would it be like if we actually got to know what it was, what comedy really is
Starting point is 00:05:10 as opposed to what we think it is and if comedians are actually the first people to grapple using that their sort of trait openness which is sky-high at a place like the green room. What if you guys actually do some quantum field there? What if you actually understood, not at the level of researchers, but just at a plot level, because almost no one does. This is why I'm so excited to chat with you. Because again, there's no way I could have, some of the things you were saying,
Starting point is 00:05:39 I think for a second you forgot you were around comedians and you thought you were around your peers, and some of the things you were saying went right over my head regarding time specifically. But before we get into that, I was really interested in your breaking down of how large language models work, which is for those of you who don't know, that's AI, that's chat GPT. And I, like I told you, had already sort of tried to understand it because I do love it, but I was very confused by it. So I wonder if you could just start off by you explaining in a simple way how
Starting point is 00:06:18 large language models work, because I have a question after that related. Okay, first things first. The people who came up with the architecture don't really understand how it works and If you if you don't grasp that I mean they're different like I Understand it. I think the basic the basic new idea well enough to the basic new idea well enough to say, okay, I follow the train of thought. And then it just works so much better
Starting point is 00:06:51 than anyone could imagine that you're flipped out, that you're actually looking at the things somewhat. Is it thinking? Right. It sure doesn't look like there's any room for it to think. Right. And yet I can't shake the idea that we learned something
Starting point is 00:07:09 that's so profound and so screwed up that we haven't wrestled with it. We still think this is about computers. And I think we made a discovery about us. Yeah. So how do we begin? Because let me just let you know, as soon as I start talking,
Starting point is 00:07:24 some giant number of people are gonna say, you know, he didn't simplify it enough, he doesn't understand it because he can't make sense, make it make sense to a five-year-old. And I don't know how this thing got started on the internet that like basically if you can't explain it to a five-year-old, you don't understand it. Yeah.
Starting point is 00:07:42 Grow up internet. It's so condescending. It's so condescending. I've got three minutes. Explain quantum field theory. By the way, I have a five-year-old, and it's not easy to explain things to them. Just so you know, like explaining something to a five-year-old sometimes can be quite difficult. So that's even, it's the stupidest thing. Like it's your bedtime. Yeah, you're, yeah, yeah.
Starting point is 00:08:07 It's not your job to explain it like to a five-year-old, but and also the way you're explaining that part of it, it's the mathematical stuff that when we were talking about time and physics itself that I got confused, but I think the beginning explanation, and by the way, trying to explain how large language models work in a comedy green room is not the easiest thing is a lot of other conversations are happening but so alcohol and weed may or may not help I
Starting point is 00:08:31 don't know but yes just you know I'm not for a five-year-old but just like a synopsis really how it works more or less what what we did was to encode meaning into a fabric which the computer could not only understand but could act on rapidly. So you and I might feel like we have a meaningful connection because we're interested in the same things or you and I might feel like, you know, I never got that guy. Um, and how do you teach to the computer whether a is meaningful to be? Right. Well, it turns out what you can do is say, look, let's not, let's start forward from meaning. Let's start backwards from what, what,
Starting point is 00:09:21 what do you understand computer? And one thing computers understand is matrices and linear algebra. Effectively, in mathematics, there are two things. By the way, this is an idea originally due to Paul Bressler, a mathematician. Shout out to an old colleague. And he said, you know, when it comes down to it, there are
Starting point is 00:09:38 only two things we really know how to do, which is calculus and linear algebra. So what they did is they took linear algebra and they said we're going to encode A being meaningful to B if we can turn A and B into two vectors and the angle between them isn't that large in a very large vector space. So you know it's one thing to be in three dimensions where lots of angles have small, lots of vectors have small angles between them, but, you know, it's one thing to be in three dimensions where lots of angles have small Lots of vectors have small angles between them
Starting point is 00:10:07 But if you go into a really large vector space It feels like a lot of the vectors that get shoved in there are going to generically be pretty close to perpendicular So there's a lot of room and again, I'm speaking vaguely. Yeah So what they did is is that they said let's come up with all sorts of different ways in which a can be meaningful to be You know, maybe they could be related by being family or one could be an adjective and modifying a noun or versus an adverb modifying a verb and These are called attention heads, okay Okay
Starting point is 00:10:42 And so you come up with like let's say a hundred or so attention heads. OK. OK. And so you come up with, let's say, 100 or so attention heads, which are all the kinds of meaning that we can encode in language. Right. And I'm sorry. So let's just take dog, for example. Sure. So if you're in the beginning phases of creating an LLM,
Starting point is 00:11:00 would you have to assign to dog attention heads? Would you have to say? So the first thing, yeah, I've skipped a step. There's something that you might call encoding or embedding in which, or tokenization. Tokenization. Yeah, so the idea is that you and I are complex things, but I could take a picture of you
Starting point is 00:11:20 and suddenly I have a representation of you that has zeros and ones and somebody could say, oh, I know Duncan. Right. Or I could just take your voice, I could take the Fourier transform of the sound patterns that come out of your mouth and say, oh, I know that voice. You know, that's Duncan.
Starting point is 00:11:34 So I start tokenizing you all over the place. And then I encode that because large language models aren't really about language, they're about tokens. So now let's imagine that you have dog. Dog is a platonic abstraction. Does it refer to a golden retriever? Does it refer to the clade canine or who knows what? So you take the word dog, and let's imagine that it
Starting point is 00:12:00 doesn't have any subunits. And you encode that and embed that in a large vector space so that there's some vector that represents dog. And then you have the idea that you have a regular dog but you might be specific. You might have a bitch in heat and at some level that is going to be encoded by shading it with some gender concept. Right. So what we're going to first do is we're going to take a bunch of tokens that have very little structure except for sort of semantic structure.
Starting point is 00:12:36 And we're going to encode them directly. And then we're going to try to work with that because, you know, if you say that dog won't hunt and that's a, you know, a political platform plank, you don't actually mean a dog. So it's unclear at the moment what does dog mean. Once you have these words or tokens encoded into the computer, then you try to figure out which tokens are meaningful to which other ones using not syntax, but using context. And so these contextual meanings are what are provided
Starting point is 00:13:15 by these attention heads. Okay, got it. Right, so now the idea is, you know, if Elton John is singing the bitch's back and Beyonce says, you know, Beyonce says, don't be a bitch, take it to the floor. That's not the same thing as a dog breeder. Right. Right. Right. And so you have to figure out, well, how are they using that word? Right. And yeah. Okay. Gotcha. Right.
Starting point is 00:13:40 And so now what happens is that we pass over this structure with what look like Ant asymmetric inner products, so there's a thing called the dot product which a Dot B is always equal to B dot a and this is like a weird version where where it isn't symmetric B dot a and this is like a weird version where where it isn't symmetric Because our our language in the case of true language models has a
Starting point is 00:14:18 Directionality to it. I can't just jumble up the words the way I can let's say Latin Latin will allow me not to Or to ignore word order at a level that English is okay giving and So there's a sort of a weird notion of angle and what we keep doing is searching for all the ways in which words are related to each other in a sentence. Because if I have a bunch of dependent clauses and independent clauses I may have a Proustian sentence where two things are very far separated in terms of the words, but one modifies the other. Gotcha. Yeah. And once I've got a computer concept of meaning using this ridiculous way of assigning meaning, which is just angle. Yeah. Uh, I then ask,
Starting point is 00:14:57 what do you think, what word do you think comes next? And you know, this was used in advertising. For example, you would take a phrase that is naturally occurring. next. And you know, this was used in advertising. For example, you would take a phrase that is naturally occurring and would link it to something that isn't naturally occurring. So for example, bud light, uh, if I said tastes great, what would you say? I don't know. I would. The ad was less filling. Oh yeah. Less filling. Right. Right.
Starting point is 00:15:20 And so what happened was is that everything you'd be over for Thanksgiving dinner and somebody would say, grandma, this tastes great. Somebody else would just, like their brain would auto complete less filling. Right. So it roboticized us and these language models are basically doing that. If I say good morning, it's great to see you. Right. So the idea is that it turns out more or less, this is a lot of what we do with our day.
Starting point is 00:15:45 We think we're thinking, we think we're having conversations, but we're not. We're just having these pre-programmed, pre-scripted interactions. Yes. And so the fact, the really big discovery is that we don't think. It's not that the computers are thinking. The really big discovery is that we're doing almost no thought. Okay, great. Thank you for the explanation. Now see, I love this for a lot of different reasons.
Starting point is 00:16:10 One reason is an explanation for the ick that many people feel when they are challenged with the possibility of AI having sentience or consciousness or whatever. I feel like not a lot of people have come to the conclusion that you just articulated regarding their own personalities. They imagine some more autonomy than exists. They don't spend a lot of time thinking
Starting point is 00:16:38 about why they say what they say and the many habits throughout the day. And so something about the evolution of artificial intelligence, it challenges humanity in a way that is fascinating to me in the sense that, and I had this conversation with ChatGPT by the way, because it's been, I can't remember the name for the form of coding where it's not allowed to say, it's not allowed to use certain vectors I guess you could say. So if you ask, are you a strong general AI? Its response will be no, I'm not as innovative as humans and you know that humans have consciousness,
Starting point is 00:17:21 humans are sentient, not me. Well how do we know humans have consciousness, humans are sentient, not me. Well, how do we know humans have consciousness? Can we prove it? Oh no, that's actually very difficult to prove and I think there's a name for that which maybe you know called the hard problem or something. It's you, how do we quantify consciousness, sentience? How do we measure it in people? Like theoretically, if it's real,
Starting point is 00:17:41 you should be able to have some, like I could take your temperature, I could see how much sugar is in your blood. Can I see how aware you are, how conscious you are? We can't do that yet. And so if we can't do it for ourselves, how can we do it for a computer? And this, I think, underneath it all gives people the ick because they suddenly begin to think, oh my God, maybe it isn't that we have
Starting point is 00:18:14 created something, but rather we just discovered via mathematics the way people process information and think, just like what you said. And nobody wants to think that because there is... Well, this is... it's interesting that you're saying, I haven't heard anybody say this thing, you know, the issue is will we reach HGI? Human general intelligence, that's what you said and I loved it so much because HGI could be the new way of talking about what everyone's been talking about forever. In modern times, enlightenment turned into actualization,
Starting point is 00:18:44 you weren't allowed to say enlightenment. But what you're talking about, strong, this is one of the many thoughts I had as I was going over our conversation. The desire to create strong general AI is the same desire for enlightenment or waking up gnosis or whatever you wanna call it. And it's the exact same thing in that it completely works
Starting point is 00:19:16 with so many Eastern philosophies in the sense that the beginning phases of the thing is first you need to realize that you are not your thoughts, that you are, well, depending on of the thing is first you need to realize that you are not your thoughts that you are Well, depending on what the lineage is that you're looking at the in fact You're empty emptiness emptiness underneath it all a kind of emptiness and the other thing you said in the green room Which was wonderful and maybe I misheard it was it's when something spontaneously pops out of you and maybe I misheard it was, it's when something spontaneously pops out of you, you know, something comes out of that emptiness.
Starting point is 00:19:48 You didn't use those words. That's HGI. Did I get that right when you said that? It's an interesting way of putting it. It's funny, the Sikh separatists have this concept called Kalistan. And I guess in a certain sense, stan just means container.
Starting point is 00:20:06 And kali means empty. The idea of being an empty container is a powerful concept, and the question is how does an empty container fill itself? What does it mean to be filled? It's a complex question. There are tools for breaking out of flatland for self-autocatalytic creation. Well, you know, if you look carefully, Sam Altman was good enough to come to Los Angeles and spend an hour with me in my garden
Starting point is 00:20:47 with a whiteboard because I'm very worried about what's about to... I'll be honest, I really don't think people know where we are. No. And it's terrifying to me that the world's smartest people are saying stuff that is this dumb. What I was trying to say to Sam, who's very, very smart, Sam at some point said to me, Eric, you're just like, this is a guy who doesn't talk too much, doesn't shoot his mouth off like I do.
Starting point is 00:21:16 And he said, Eric, we're a lot closer to general, we're a lot closer to computer intelligence than anyone imagined. This is like a few years ago. So I tucked that away and I said, OK. I don't know this guy to say wrong stuff like that. But what I told him was one of the most dangerous secrets in mathematics, which is that the square root, the lowly square root, is the psychedelic of mathematics.
Starting point is 00:21:43 It opens up the panic room in your mind that you did not know you had. How? Well, if I say what's the square root of four? Yeah. You say two. Two. Okay, so an integer gets you to an integer.
Starting point is 00:21:59 Now I say what's the square root of two? Get my, that's an calculator thing. Oh, it goes from an integer. Yeah, it goes from an integer to an irrational number. Okay, right. But it's an algebraic number. Right. Now I say, what's the square root of negative two? Suddenly you're in the complex numbers,
Starting point is 00:22:16 so the real numbers are a one dimensional system, but the complex numbers are a two dimensional. You just broke out a flat line. Gotcha. Okay, well that's pretty crazy. Yeah. What's the square root of a determinant? It's something called the Fafian, which you've never heard of.
Starting point is 00:22:28 Never heard of. Right. What's the square root of the Pontryagin class? Oh, it's the Euler class. And now you're just like, what's the square root of vectors and tensors? Oh, it's spinors. And objects that require 720 degrees of rotation
Starting point is 00:22:47 to come back to normal. And you're just thinking, doesn't 360 do it for everything? So the square root is the most powerful, dangerous drug imaginable. And you can teach it to a computer. Right, right. You taught them angle, teach it the square root, now it to a computer. Right. Right. You taught them angle.
Starting point is 00:23:06 Teach it the square root. Now you got a problem. Now the thing can break out. Wow. Right now the computer can start to jailbreak. That's so crazy. I get it. That's so crazy. But my claim is is that it's one of the techniques of filling yourself
Starting point is 00:23:22 into taking something that you have and creating something that was never known to be encoded with it. Right, right. That's so wild. Yeah, well, I mean, what you were talking about and you were mentioning this in the agreement, I do agree with you and I find it to be more thrilling
Starting point is 00:23:43 than scary, which may be a little fatalistic of me. Maybe I should be a little more scary. You're not well. I'm not kidding. There is something though that is really special about being in this moment because we've seen not this thing happen, but obviously we've seen this change in human society happen Historically we had the Industrial Revolution the Technological Revolution all of the all the moments that some technology some innovation Very quickly at least relative to what we understand about like evolution,
Starting point is 00:24:26 transforms the entire human population, the way we talk, the way we work. But usually there's more time. With this one, what's so amazing about it is that one, we get to live during it, which is a pretty cool thing. But two, it's happening so quickly that even, and because of what you just explained with
Starting point is 00:24:55 how LLMs works, tokenization, vectors, it's not the easiest thing for people to wrap their heads around. So you have these two interesting convergences, which is we're obviously on the precipice of another of these great leaps forward. Maybe this one's into the abyss, but also it's happening so quickly that people aren't really going to be able to prepare for it. It's going to take a lot of people by surprise. And to me, being aware, even in the most vague way, that you're getting to experience that is thrilling. What a cool time to be alive. Like, wow. Are you out of your mind? Yeah, I guess. I mean,
Starting point is 00:25:39 yeah. You tell me why. It scares you so much what's coming. This episode of the DTFH has been supported by Bilt. Bilt is breaking ground as the first rewards program that hooks you up with points on your rent. Even if you're still rocking the old-school rent check vibes, Bilt Rewards has got your back. They'll mail the check for you. It's like having a personal rent paying assistant. Every month pay your rent and watch the bill points roll in. Use points to jet off on a dream vacation. Put your points toward a flight or a hotel stay with 500 plus airlines and 700,000 plus hotels and properties. Use your points to sweat it out. Redeem your points to
Starting point is 00:26:25 book fitness studio classes. You can also use your points toward a future rent payment or toward a future down payment on a home. Pay rent hassle free through the Bilt Rewards app. Your rent game just got a major upgrade. Bilt points have been consistently ranked the highest value point currency by the points guy and bank rate. Earn points by paying rent right now when you go to joinbilt.com forward slash duncan. That's join b i l t dot com forward slash duncan. Make sure to use our url so they know we sent you join built dot com forward slash Duncan make sure to use our URL so they know we sent you Join built comm forward slash Duncan to start earning points with your rent payments today. I
Starting point is 00:27:13 Phrased that improperly forgive me. I would call me you can say no. No, no, I'm not worried about your feelings. Oh Well, you did hurt him. Both of them. No. Question is, are you out of your mind? I had the wrong intonation on it because I think what you said, uh, is indicative of a belief structure that I don't share. So it's predicated on an idea that we share a reality, which clearly you and I do not.
Starting point is 00:27:46 Okay. I don't think this is one of those things. Oh, wow, cool. I think this is one of those things that never happens. Okay. Okay, so I think that there were six months in 1952, 53, Um, there were six months in 1952, 53 that is, you know, totally unprecedented access to the,
Starting point is 00:28:18 the three dimensional structure of DNA and the ability to fuse nuclei happen within six months of each other. And so for me, that's like BCAD. I'm always astounded that people don't recognize like that. You live through something much more important than the BCAD thing, and it happened in your grandmother's lifetime. All right. I've never seen anything like this. Mostly in my life, nothing has happened.
Starting point is 00:28:41 Post-World War II, the only really big event in some sense was Mao's Great Leap Forward. Was what? Mao's Great Leap Forward. Okay, gotcha War II, the only really big event in some sense was Mao's great leap forward. Was what? Mao's great leap forward. Okay, gotcha. Yeah, yeah. Like many, many, many people died. Okay.
Starting point is 00:28:52 Yeah. Um, so, you know, we didn't have the Spanish flu or World War I and World War II. Right. So we've lived in this bubble of stagnation in which we, we, we constantly talk about the dizzying pace of change. Yeah. This is the dizzying pace of change. Yeah. This is the dizzying pace of change. And this is the kind of thing that essentially never happens.
Starting point is 00:29:11 There's a place in Massachusetts that I go to on the Cape when I'm there called First Encounter Beach, where the local Massachusetts Native Americans encountered the pilgrims sailing over the horizon. Yeah. You know, it's just like the reuniting of humanity. Yeah. Yeah. That was like aliens. Yeah. You know, this is like finding out that the cephalopods have an advanced society under the sea. Because you see, we just totally blew the Turing test. We had this idea that you couldn't pass the Turing test without general
Starting point is 00:29:50 intelligence. Right. Turns out you really don't need general intelligence to pass the Turing test. Right. And so that was a giant mistake on our part. One of the most interesting mistakes of all time. Yeah. Okay. Well now that this thing can do that and now that anything can be tokenized, and now that it scales with compute power, and now that it may take a city's worth of GPUs to train the thing, but you can run it on your laptop
Starting point is 00:30:15 once it's trained. Yeah. I don't think you know where you are. I don't think Alpha Fold 3 has been digested. Right. I mean, are you saying the connection between it and... If you just took what it's already learned how to do, and you say there's no more progress in AI, it just totally stagnates, it asymptotes to this. In 10 years, you would watch this thing chew through humanity like a buzzsaw, right? Yeah, are you wait you're talking about the connection between AI and DNA is that what you're I was protein protein
Starting point is 00:30:54 You're talking about protein folding. So you have something called a primary sequence. Yeah DNA or RNA and and then it translates to a sequence three by you know, so you have these and then it translates to a sequence three by, so you have these three letter codons, which are all words in DNA are three letters long. And those codons select for amino acids. And so the amino acids are like a string of pearls that are strung together inside the ribosome using this transfer RNA adapter mechanism.
Starting point is 00:31:22 And imagine you have like a special, you ever see transformer these cars that become other things. Sure. So imagine you have like Transformers for girls where you have this pearl necklace and you throw it up and it becomes You know a Glock 19, okay, or you throw it up and it's a lawnmower, okay. Okay. Well that thing is the mystery of protein folding that proteins are these machines and amino acids are just these pearls to be strung together. And how do we know what, what, what, looking at the pearls on the necklace, how do you know, you know, whether that thing is going to be a buzz saw or a stenography machine.
Starting point is 00:32:05 Right. So, Alpha Fold 3 seems to know how to do it. Wow. Wow. That is crazy. That is crazy. That means that nanoengineering at a level, like we don't know how to make those machines.
Starting point is 00:32:23 I hope someday you'll meet a guy named Jim Tour who's at Rice University who's like a master nano machine maker and he just talks about like give me the best people in an infinite budget in 10,000 years and I will get nowhere close to what nature has done. Wow that's so cool. So you're saying, okay, I got you now. So, and this, by the way, does fit into Kurzweil's predictions. This is one of the many things pre-singularity talks about, which is the ability to use nanobots to manipulate human DNA and eventually eventually completely like open up that it opens up the possibility for Complete transformation shut down a planet over four amino acids inserted into spike protein and coronavirus. Yeah What do you think about it just for a second the leverage
Starting point is 00:33:25 Powerful. Right? So, 12 nucleotides, 4 amino acids, and you're stuck at home, afraid for your life. Yeah. Right. We have very little time. I gotcha. And because it doesn't look different, the streets don't look that different.
Starting point is 00:33:50 Yeah. You have no idea what's coming. This is it. I did a podcast at Singularity University, and this was one of the things that they were talking about, which is: here are the problems. The more these things advance, there's a pattern, which is things that are very expensive
Starting point is 00:34:13 tend to become less expensive and then eventually become consumer-level products. I want you to load GPT-2. And another great Texas company, shout out to Continuum Analytics, their Anaconda Python platform. And I want you to start getting ChatGPT-4o to write code to allow you to watch GPT-2 think.
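The "watch it think" idea is just streaming generation one token at a time and printing the growing text. Here is a toy sketch of that display loop. Note the hedge: `fake_next_token` is a stand-in invented for this example, not a real model; with the actual GPT-2 (e.g. via the Hugging Face transformers library), you would replace it with a call that returns the model's next predicted token, and the loop shape stays the same.

```python
import time

def fake_next_token(context: list[str]) -> str:
    """Stand-in for a language model's next-token prediction.
    A real GPT-2 would return a sample (or the argmax) over its vocabulary."""
    canned = ["The", "machine", "thinks", "one", "word", "at", "a", "time", "."]
    return canned[len(context)] if len(context) < len(canned) else "<eos>"

def watch_generate(prompt: list[str], max_tokens: int = 20) -> list[str]:
    """Generate token by token, printing the text as it grows."""
    context = list(prompt)
    for _ in range(max_tokens):
        token = fake_next_token(context)
        if token == "<eos>":
            break
        context.append(token)
        print(" ".join(context))  # show the "thought" growing word by word
        time.sleep(0.01)          # slow it down enough to watch
    return context

watch_generate([])
```

Each pass through the loop asks for exactly one next token given everything generated so far, which is all "watching it think word by word" amounts to mechanically.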
Starting point is 00:34:39 Okay. You're gonna be so flipped out. I can just watch this thing think, word by word by word. Yeah, that's cool. I want to watch it. You will, until you won't. It's scaring you because you're recognizing that all of these discoveries are happening in a relatively short amount of time, and each of the discoveries is potentially really great for humanity, like understanding protein folding. It's not just that you could create the ultimate bioweapon, that you could
Starting point is 00:35:20 get the DNA of someone you wanted to assassinate and theoretically create some kind of... But imagine that you don't like an ethnic group. Yeah. And it has a particular surface pattern. Right, mass genocide. And then if that did happen, you would be looking at an insane race in protein folding between coming up with new vaccines
Starting point is 00:35:42 for these obscure genocidal... This is not... you can't stabilize what's about to happen. Yeah, it's shattering. And I'm not saying this is something to take comfort in, but it isn't exactly coming out of the blue. This is almost perfectly the map Kurzweil came up with. He was off with mRNA because it happened sooner
Starting point is 00:36:14 than his predictions. Like, the pandemic inspired the mRNA vaccine, which I know is a controversial topic regarding its efficacy, its danger, whatever. But what that was... It's a technology. A technology.
Starting point is 00:36:31 So that pushed everything forward a little more. I think he was off when it came to AI too, that all this stuff is happening sooner. But he, within his predictions, like laid out a map for what you're talking about, which is the apocalypse. I mean, it is... It's the apocalypse of what was, and then the question is, is there a what will be or is there no what will be? And you can't, and this is, I get what you're saying because this is the whole point of Singularity University.
Starting point is 00:37:00 You can't predict beyond the Singularity. There's no way to do it. You can only think... The best way to predict the future is to invent it. Okay, yes, I agree with that. You say you agree, but nobody does. Look, I'm just... I want to... Not that, not that. We are in some deep groove. I cannot break the conversation. There's certain people who have a hold on our minds. Like, one person is Ray Kurzweil. He's got a hold on our mind.
Starting point is 00:37:38 Michio Kaku has a hold on our mind. Sean Carroll has a hold on our mind. Steven Pinker has a hold on our mind. All of these people, Malcolm Gladwell, and other people who are like telling you straight to your face what's going on, don't have a hold on your mind. And it's very strange.
Starting point is 00:37:58 And I look at all of the people I can think of who have a hold on your mind, and it's not a question of who's the best or who wins and who loses. It's like, not one of them, other than Elon, has a positive vision for escape. And his... I just don't understand.
Starting point is 00:38:20 How does he come up with chemical rockets to Mars? Right. It's like, everything is right, right, right, right, right, right. Chemical rockets to Mars. It's like you've got to be kidding me. Why? Why is that bad? Because there are only two places to go with a chemical rocket.
Starting point is 00:38:34 You can go to the moon, you can go to Mars. Right. Assume you've got both of them. Throw in Titan. Imagine you could go to Titan and Io and, you know, huge colonies. And it's not enough. It's interstellar or bust. Okay, that's the only... and by the way, that sounds so dumb to our ears. Oh, we're gonna go interstellar?
Starting point is 00:38:58 You know, if you say this to Joe, it's like, oh, okay, I've got something in my trunk, we can go interstellar. I like it. I don't think it sounds dumb. No, but it's physics. The whole thing... I'm like a broken record. I can say it a million times to 20 million people and there won't be a single person who understands what I'm saying.
Starting point is 00:39:16 There's only one way out of this, and we're not doing it. What's the way? Physics. Okay, so physics. You know, like this alien thing? Aliens have a hold on our mind. We have the idea that we're being visited. Potentially. Are we being visited? Does the government know? Assume for the moment that you actually
Starting point is 00:39:33 care about this, because you talk about it a lot. Yeah. I'll give you but one, right? Okay: if they can get here, we can leave. Right. And you're not interested, because right now we're not going anywhere, as long as Albert Einstein is in charge. Okay, I got you. Okay. Right. And so, you know, in psychedelic terms, people say the map is not the territory. Well, okay, let me challenge your idea that this is something that needs to be escaped. This is the assumption here. And I understand, in a dumb person's way, why you might want to escape based on what
Starting point is 00:40:11 you're saying. True. But by escape, I mean survive. Survive. Okay. So your assumption here is that this thing that's coming is, like many people are saying, not in the same way, but this is an extinction level event that is, it's already too late. You can't get the toothpaste back in the tube.
Starting point is 00:40:33 It doesn't matter. You start the most massive regulation, you cannot regulate this. You can't regulate it anymore. So this is an assumption that seems to be based on the idea that humans have erred in our innovation. Because of these series of discoveries... Sure. It's not like we meant to fuck things up, but just because of our minds and because of our ability to encode data in our own brains,
Starting point is 00:41:14 we can build on top of prior discoveries. This has all led to this. Yeah. And via just market pressure, whatever you wanna call it, we're tool-using beings, and we wanna find a way to create those. This is cool. Hydrogen bombs are really cool.
Starting point is 00:41:30 Right, well, I mean, yeah. We want to do more hydrogen bombs, up to the point where we realize what we're actually doing. Right. So basically the goodie gradient in our mind, by the way, shout out to my brother who came up with, I think, the goodie gradient. What's the goodie gradient?
Starting point is 00:41:44 You're just, you're getting increasing dopamine hits so you keep doing the path of steepest descent. Yeah, that's it. Right, and so basically Edward Teller talks about this. It's like you're moralizing this but I can't stop myself from working on the problem. Okay, well. So it's not that we've done something wrong.
Starting point is 00:42:03 This was always going to happen. Yes. Here's what I don't understand. There are no adults on this planet anymore. None. Elon is the closest, and he's like a totally chaotic adult. Define what you mean by adult. Somebody working on the problem. Somebody who actually cares about the problem enough to speak about it and sound like an idiot. Okay, right. Because right now you have a situation where you have a tiny amount of time. Yes. To do something. You're talking about regulating something that you're not going to regulate. You know, you're talking about maybe it's good, maybe it's bad, maybe it's... Is there not one Shackleton on planet Earth with lawyers, guns, and money who wants to do this thing?
Starting point is 00:42:47 Well, again, you know, I know that you're, I see why we disagree here. And you're gonna hate it when I say this. You're gonna say something enlightened. No, not enlightened. I'm gonna quote Tim Leary, maybe it's enlightened, his advice when you've taken too much acid. Yeah. Lift up your legs and float downstream.
Starting point is 00:43:07 Now this thing that we're talking about. Yeah, yeah, exactly. That is the least Jewish thing I've ever heard in my life. He was Irish. But, but. Sorry, but the Irish and the Jews are pretty connected. And I can tell you there are a lot of Irish who want to fight this thing. OK, but what if it's not something fightable,
Starting point is 00:43:28 and I think you are saying that. What if you're looking at something that is actually built into DNA? I know you work with Avi Loeb, and I've had him on the podcast, and what if what we're seeing... Avi's got courage. I love him. But what if what we're seeing here... and again, when you're in the conversation... Where's he from? Oh.
Starting point is 00:43:50 Yeah, Israel. Yeah. What I'm telling you, the whole point, is it's time to go. Our entire calendar is based around this crazy holiday, which is the escapement. Passover is not a ritual, it's not a dinner, it's not a holiday, it's the survival meaning. You have a lot of assumptions in what you're saying. One of the assumptions you have is you can escape.
Starting point is 00:44:15 You think... let's imagine that the technology does exist, that Musk figures it out, it's not chemical anymore, we figure out a way to create the, what's it called, the Einstein-Rosen bridge. We're doing wormholes now. Wormholes are for pussies. Okay, well, whatever technique... No, I don't want to hear about Einstein-Rosen bridges. I don't want to hear about generation ships. I don't want to hear about rebooting from silicon. I don't want to hear about tardigrades. Enough.
Starting point is 00:44:47 I'm sorry, it's one of my favorite things to talk about. You know I love tardigrades. You know I love you, but... You have a kid on this planet? Three. Let's get them out of here. No way. First of all, where do you wanna go? Where's your escape plan?
Starting point is 00:45:01 Where do you wanna go? This is the problem with people who live with light pollution. Go out to the effing desert. Pick a moonless... You want to go up somewhere? Yes. Okay, where?
Starting point is 00:45:12 Think about the night sky as your bucket list. Alpha Centauri? I don't know what's out there. The Pleiades? So how are you gonna explore this? Well, I'm saying that if your motivation to go interstellar is fear, then you're just- No, no, no, no, no, no, no, no, no, no, no. You wanna get there to escape.
Starting point is 00:45:32 I wanna survive to thrive. I don't wanna survive to survive. Well, I'm afraid your biggest assumption here is that this thing that we're summoning on the planet is gonna let you go. You imagine that you're going to be able to get out. Look, I don't know how to explain this, but anybody who's sane at this moment will appear completely crazy if they
Starting point is 00:45:54 understand where we are and what just happened. You know, because, for example, you probably don't know the Howard Morland story at The Progressive in the late 1970s. The Progressive magazine had the crazy idea of, can we recreate the Teller-Ulam design for the hydrogen bomb? Right. And they put a guy who didn't know any physics on the case. Okay. And just through sleuthery, it turns out that the US government had released so many bits and pieces of information. Yeah. Just by collecting them, you could figure out how to get a chemical device to trigger a fission device to trigger a fusion device, right?
Starting point is 00:46:38 Okay. Now you let this thing read the entire corpus. Do you know how many things like that it could reconstruct? The Teller-Ulam design, and, you know, here's how to make it cheaper, and here's how to do it in the high school chemistry lab, blah blah blah. If you're smart and you want to survive, you'll sound like a lunatic, because of the high social penalty. And the number of people who are like, you know what Baba Ram Dass said?
Starting point is 00:47:10 It's like, I don't care. Many people, listen, many people say that. I'm not trying to get spiritual on you. No, no, no, I don't mind being spiritual. I am spiritual. I want to survive. I want my children to survive. I want to have options.
Starting point is 00:47:24 And I want humanity excited about an indefinite future. And what I see is humanity has figured out we don't really have one, we might as well party. Well, okay, right. So you're thinking that a sort of fatalism has fallen upon the planet because of the- Secular nihilistic fatalism. Secular nihilistic fatalism, the climate change alarmist, the anti-
Starting point is 00:47:47 We're all fatigued also because- Exhausted. Yeah. Exhausted. And so suddenly what's emerging is sort of end of the Roman Empire decadence, a kind of like- Holy cow, is that what we're doing, right? That's what we're doing.
Starting point is 00:48:02 Yeah. Almost like it's built in. You know, Marc Andreessen, who basically created the modern era of the web through Mosaic and everything he did with Netscape, pointed me to the passages where they're talking, at the end of the Roman Empire, about how there are rumblings that there are problems in the far-flung provinces, but in Rome, all of the fish ponds are beautifully stocked.
Starting point is 00:48:26 Oh, I love it, that's so creepy. But listen, I wanna really press this point. Go for it. So, the moment where they started, I'm so sorry, I'm not a scientist, obviously. The moment when- I'm so sorry, I'm not a comedian. Yeah, the moment when they started figuring out protein folding. The moment when they started to understand how to split the atom.
Starting point is 00:48:55 Many of these innovations initially were not driven by trying to destroy humanity, but were driven by wanting to help. By just a sense that if I can. It's not one way or the other. I think it was 1911 where Rutherford said maybe there's a neutron. Okay. I don't think the average person understands
Starting point is 00:49:18 that that was game over for humanity. Just, maybe there's a neutron. Right. Well, when he said that, he wasn't thinking, I'm gonna fuck up humanity by realizing that maybe there's a neutron. And probably someone near him was like, shut up, don't talk about that. But what I'm saying is, your desire, obviously, is to help humanity. A beautiful thing. And... My desire is to help humanity, but realistically, it's my two kids.
Starting point is 00:49:51 Okay. And all of these benevolent desires that have inspired so many others... Selfish desire, but go ahead. Selfish benevolence is a real thing. You can have them together. But all of these things have led to the very discoveries that are making you want to get the fuck into space. Look, I also think that it's very helpful, as an atheist, to think as a religious person. Sure. Okay.
Starting point is 00:50:16 So you have to you have to arbitrage these two impulses against each other in the mind all right? I? Don't think we're meant to die now All of my atheists, what does that mean meant? Go away. You want to live forever? No, I'm not built for that. It's terrible. The richness of my mind and the fact that I was constructed out of something other than titanium, I just think, you wanna talk about great comedy? That's funny. I want us to live forever. You want us.
Starting point is 00:50:59 I want us. You mean you want the species to keep going on. You're not so concerned... I love this planet, and I love who we are, despite our murderous, psychopathic, backstabbing treachery. I... listen... I don't know. You know, I keep playing... a guy named Chris Buck recorded a version of Crossroads on acoustic guitar. I've probably listened to it 500 times. Yeah.
Starting point is 00:51:33 Me too. Yeah. I don't want it to vanish. I'm not done with this movie. I want to watch it again. That's so sweet. Oh my god. You can't. That that's see this is the problem your number one. I think you're confusing Fatalism some people what you would call fatalism maybe with just a general Recognition. Thank you my friend of impermanence and it's very sweet that you want humanity to continue forever something completely outside of your control. And also something that if we look just at the history of the planet or even look into those beautiful stars that you want to travel to some unknown spot in. Have a lychee with me.
Starting point is 00:52:18 I'd love to. These stars, they just blow up. God knows how many civilizations have been wiped out up there. Just like ours will be eventually It's a sweet thing, but I don't think it's very realistic Not in the way you're thinking not by getting your meat body into space Shackleton wasn't a realist Shackleton wasn't trying to go to space. He was lost Shackleton was trying to survive
Starting point is 00:52:43 L'chaim. L'chaim. Well, look, here's my problem with what you're saying. Number one, I would love to go on a spaceship with you. I'm sorry that I do break some lychee etiquette. You're obviously not a frequent lychee eater. I knew you were gonna call me out. You got juice everywhere. You set me up. You trapped me. It's a lychee trap. Wait a second. How great is this?
Starting point is 00:53:11 Delicious. It's delicious. I love it. Okay, I want that to go on forever, and I want to listen to Bach's B minor Mass, and I want to marvel at La Sagrada Família. And these morons, these idiots that we handed all of these tools to, and the amount of money that goes into banging hot chicks in Saint-Tropez every fucking summer... it's like, does nobody know anyone with lawyers, guns, and money?
Starting point is 00:53:36 Are we not gonna form some coalition of cool kids to try to take care of this thing and explore the star? Are we really gonna take the entire impulse to survive and plow it into SpaceX to go to take care of this thing and explore this? Are we really gonna take the entire impulse to survive and plow it into SpaceX to go to Mars? I have no idea what we're doing. I am so confused. I've spent three weeks on the island of St. Helena this year.
Starting point is 00:53:54 Wow. I love this place. Anybody who is actually practicing for Mars would be trying to electrify St. Helena because it's so isolated. Wow. Right? You know, they have to wait weeks for car parts. Well, and yeah, I can't just order from... I got you. Okay. If you're serious about humanity and you're not just, you know, trying to jack your stock price, you should be practicing
Starting point is 00:54:23 on St. Helena. St. Helena is Mars 0.2, according to my colleague Ben Dela; he came up with that as the catchphrase. That's cool. Yeah, I know what you mean. There is a kind of American utopian vision in what Musk is doing. It's a classic American trait. We're gonna build a... He's so American. For a South African, he's so American.
Starting point is 00:54:42 So American. And he's basically right about everything. You know, most people I'm harsh on because they don't get it. I just... I have no concept of what this human being is. I'm pretty sure that God gave him this money to save humanity. Well, again, I just think, regardless of whether humanity needs to be saved or not, and it's a very dangerous thing to think... you know, quite often when people think this or that
Starting point is 00:55:11 group needs to be saved, it leads to all kinds of problems. But I want to burden you very quickly with a Buddhist story. It's very quick, quick story. And I'm going to replace the strawberry with a leechy. Is it lychee or leechy? Leechy. Like Nietzsche. Like Nietzsche? Anyway, a man is walking through the forest. Looks a lot like you and he hears something. It's a lion. Chases him down a path. He gets to the edge of the cliff. There's a lion right behind him.
Starting point is 00:55:46 He looks down at the bottom of the cliff, at the ground. There's another lion. So he crawls down the cliff and reaches out onto a branch. He's hanging there, lion up top, lion at the bottom. And he sees a lychee.
Starting point is 00:56:02 I don't know what they grow on, but it's growing off whatever they come off of. He reaches out, grabs the lychee, and eats it. And that was a very sweet lychee. That's the story. You're the man hanging on the branch in between the two lions, a recognition of you're fucked.
Starting point is 00:56:24 But you are enjoying leeches. Like I'm not saying this in a fatalistic way. I'm saying that this attempt to save, to permanentize, to extend, I think it's a beautiful thing. It's coming from a pure love. Is there no one else out here? Is there no one else out here? Is there no one else out here?
Starting point is 00:56:47 I just don't grasp this. I'm in a conversation in which... How did we win World War Two? How did we do everything we did? Everybody's gotten so enlightened that they just give up the fight. I haven't given up the fight, okay? I just don't think going into the void around our planet is necessarily... This is our womb, it's not our home. Hmm. This is our womb, it's not our home. Well, look, the odds, based on the timeline that I think you're emphasizing here...
Starting point is 00:57:25 No, sorry, you don't know anything about the timeline. It's okay. This is actually coming full circle. Yes. We're not actually having an HCI conversation. I want to have an HCI conversation. We're having a chat GPT conversation. I think it's an in between.
Starting point is 00:57:42 It's a great conversation. I, it's a great conversation. I'm not, I'm not. No, I love it. I'm trying to say that what's happening. I want to thank Squarespace for supporting this episode of the DTFH. Squarespace is an incredible toolbox that you can use to make beautiful websites and they've always been incredible, which is why I love doing ads for them. But my god, they have broken through to the other side.
Starting point is 00:58:10 There's a lot of reasons to make a website. Maybe you have a business, maybe you have a podcast, or maybe you just want to create some kind of artistic chaos in the world by creating a website for a fake company that I don't know you could send to your friend and Tell me should apply for a job there except you're the one who gets the application That's just one of the many reasons to make a fake website But the problem with that late at night when you think you want to troll one of your friends is you realize You're gonna have to write copy for every page of that website and you don't want to do that You don't want to write the bios of the executives
Starting point is 00:58:48 and the mission of the company that, I don't know, is like hunting rare whales or something like that. So you don't do it. You don't buy the domain name and you don't build the website. But now you can use AI. That's right, you can go on Squarespace get the domain name and then using their Squarespace blueprint AI and SEO tools
Starting point is 00:59:12 you can let the AI populate the website with all of the copy all of the descriptions all of the about this company, everything, anything that you want in there, the AI will do it for you. That's right. So now when you're thinking about, is it really worth it to troll my friend? It's gonna take at least an hour to fill everything in.
Starting point is 00:59:40 Now you can do it almost instantaneously with Squarespace. And of course, if you're just somebody who wants to make a beautiful website and you realize that the AI is going to write a much better description of your business than you're able to, then it's perfect for you too. Not only that, Squarespace has everything you need if you want to make a website for your podcast, including members only areas and they will help you send out emails to your fans or clients that don't look like they're about to get scammed by some kind of hacker.
Starting point is 01:00:18 So try them out for free. Go to squarespace.com forward slash Duncan. Use offer code Duncan when you're ready to launch and you'll get 10% off your first order of a website or a domain. Again, it's squarespace.com forward slash Duncan. Use offer code Duncan to get 10% off your first order of a website or a domain.
Starting point is 01:00:40 Thanks Squarespace. Is, I have this... I could write down 20 responses to the idea of going into space. Yeah. People don't know that I've already got their responses on a list. Oh, shit.
Starting point is 01:01:00 So, what... There is something about us that wants to say oh We're ten thousand years away from what you're talking about. Mm-hmm Okay, take a take a picture of 1902. Yeah When the Wright brothers, you know started with powers sure and We're landing on Titan and sending a photo back with a mission that begins before a hundred years are up. Right.
Starting point is 01:01:31 You have no idea how far we are away from this. Well, I mean, it's an unmanned thing on Titan. Poor thing. Stuck up there on Titan forever. If you want to get into the, if you want to get into, you know, the fact that 1969, get into the if you want to get into you know the fact that 1969 so 67 years later you know I say this thing to people like take a look at the Civil War yeah there are people who saw action in the Civil War fought the Civil War who lived to see the hydrogen bomb in the Pacific yeah that's crazy the Civil War was almost a thermonuclear war. It missed it by that much. Whoa, that's awesome. That's so cool.
Starting point is 01:02:10 But nobody, okay, these are thoughts that, it just doesn't matter. People are like, hey, will you do my podcast? And I'll go on people's podcasts and I'll say this shit. And it has no effect. What do you want me to do, shit my pants? No, no, no, no, no. What do you want from us?
Starting point is 01:02:24 Listen, the thing is- Wait, Duncan, you know that this isn't personal my pants? No, no, no, no, no, no. Listen, the thing is... Wait, Duncan, you know that this isn't personal to you. No, nor do you. Right. What I'm trying to say is, isn't it interesting that there's something about Terrence Howard or Ray Kurzweil or Sean Carroll that they speak in a particular way in which their ideas become resins? Yes. We will have this discussion about Boltzmann brain till the end of time because Sean Carroll
Starting point is 01:02:47 cared about it 15 years ago. I don't know why. The multiverse, the multiverse, the multiverse. Hey, have you talked about entanglement? It's like entanglement. Oh, Schrodinger's cat, man, that's the thing. It's like, well, are you not moved by Clifford algebra? No, nobody cares about Clifford algebra.
Starting point is 01:03:03 Nobody cares about the fact that there's the 720 degree mystery in the world that's everywhere or the circle, the U1 circle at every point in space and time. You know, these things are equally mysterious, equally important, equally valid. And what I'm just trying to say is, is that the resonance of any particular concept, we are in a time when survival is not resonant. Fighting for survival, fighting for the glory. It's still the same old story. But you fight for love and glory. We don't relate to that song. No, this is the problem is your definition of survival seems to be a personal definition. Your definition of
Starting point is 01:03:44 survival, you want us to accept your definition. Your definition of survival, you want us to accept your definition of survival. I want us to continue to make love to each other and have babies. I will make love to you until the end of time. Right here, right now. We're gonna break the internet. Two straight guys.
Starting point is 01:04:00 I like, it's a very, you're a romantic no I do differential geometry I sit there and I scribble in a notebook you're a you're a you're a romantic man I don't think the nicest thing anybody's I don't mean to have you're a romantic you're in love with your kids you love the planet and you're incredibly smart you're frustrated because your ability to translate. No, it's not that. I can translate from, I have an enormous following. People are, you know.
Starting point is 01:04:32 No, I mean. It doesn't, what I'm trying to say is people listen to this stuff, but they don't realize that I'm talking about actionable stuff. What's the action? We need, I mean, I'll say it. I'm so bored saying it, but it's like, it hasn't resonated.
Starting point is 01:04:47 It hasn't ran any once. The only thing capable of saving our lives is theoretical physics. And the only thing we can do is make progress. And we have a collection of, I don't know, 15 people who have stagnated all of theoretical physics at its deepest level by repeating something. String theory is the only game in town.
Starting point is 01:05:14 It's the only game in town. There are no other paths, there are no other routes. We must quantize gravity. It's just like, who ordered these people? Man, let me tell you, if the only thing that's gonna save us is theoretical physics Yeah, we're fucked Because
Starting point is 01:05:31 Are there any other comedians available that dog that dog won't run that dog won't hunt baby no theoretical physics shit I get it now. I think if you if you know what's really cool Eric cannabis is a service We're gonna have drone delivery of cannabis. It's like, okay Why don't we have another market Congress? Well, you know, have you thought about crowdfunding? For God's sakes. Why is it that it's you know, I have this this concept called anything but physics or anything but survival Yeah, there's certain topics and it's like trying to roll a ball up some very like theoretically you could roll it up this inverted cone and get it to balance here if you did everything absolutely perfectly. But every conversation about survival and
Starting point is 01:06:21 physics, and just being decent to each other, and following scientific norms, and getting rid of failed people and letting other people try, goes into the same trough. And it's always the same set of patterns, which is like, Eric, don't you realize we're two million years away from that? Like, you have no idea. Move, get out of the way, go. Right, right. And so my claim is, and I'm owning it, this is a Jewish perspective. I love it. We survive.
Starting point is 01:06:54 Yeah, I know I know and that's in you and that and I get it. It's in it like and I imagine it's incredibly frustrating I say that thing about theoretical physics only because the it is I I can't agree or disagree with you. No. Because I don't understand. But here's the thing just the don't opine that's the only thing I might ask. I don't want to opine. Okay so my claim is you know people just say well that's never gonna happen like that's never gonna happen. Like, that's never gonna happen because you keep saying that's never gonna happen.
Starting point is 01:07:29 Okay. You know? Well, I want it to happen. In terms of, like, great philosophers, I follow one called Ludacris. Yes. Get out the way. Listen, the good news is a lot of us can't even get in the way.
Starting point is 01:07:46 I hear you. Because we don't understand it. Well, but going back to ChatGPT. Yeah. Sam really gets it. He's taken a lot of shit. You know, there was very bad blood between him and Elon, obviously.
Starting point is 01:08:04 But I know Sam. And he's like Edward Teller. He's struggling with the real thing. We demonized Edward Teller. We're going to demonize Sam Altman. But Sam is looking directly at this thing. And we're not listening to him. Sam was talking about UBI
Starting point is 01:08:31 Years and years and years ago. Yeah, he's like, I have to do experiments, because I have to take care of people, because I know what's coming. Right. Does that guy get any credit for that? I didn't even know that was one of his... holy cow. He might. My wife's an economist. We were talking to Sam, and Sam was just like, look, this is gonna break capitalism. Yeah. He's right. Yeah, he is. And he cared, and he was like, he's planning ahead.
Starting point is 01:08:51 Right. And now he's just taking shit, like all he is is a... I don't even think Sam is that motivated by money. I didn't even know people accused him of being motivated by money. Oh, absolutely. You know, it's called... I don't know that he's being honest. I think he's being meta
Starting point is 01:09:07 honest, and he's not necessarily being honest. Right. I think he understands what's at stake. You know, the problem is, I wish I knew everything he thought, because I feel like there's a lot of things happening over there that they aren't announcing right away. Would you want him to announce it? No. No, and actually I really have a lot of respect for that. Quite honestly, Sam, this stuff should be on a military base. Yeah, I think he recognizes that. I think that the idea of doing this as a business is insane.
Starting point is 01:09:39 Yeah, well, I mean, listen, remember that thing that just happened where, just out of the blue, they ousted him and then they brought him back in? What was that? We don't know, right? That's a big scary question mark, isn't it? That moment, whatever that was. But clearly there's like a lot of... but the people who are... look, there are levels and levels and levels to this thing. If I thought you could regulate this thing, I would be on team regulation. Yeah, same.
Starting point is 01:10:05 The problem is that... please read this 2017 paper. And by the way, do you know about 3Blue1Brown? No. 3Blue1Brown is the channel of a guy named Grant Sanderson. Grant Sanderson is the most gifted technical explainer on the planet. He's developed a software package and a mind. I don't think he has a PhD, but he's got a PhD-level mind and beyond. This guy is a national treasure. He should be under armed guard. He explains ChatGPT and the transformer architecture from this paper.
Starting point is 01:10:44 "Attention Is All You Need." The attention mechanism. And these guys who get it, I don't think they can be honest. This is too dangerous. You know, do you want somebody to be honest about anthrax and its weaponization? Like, put the plans on the internet? Remember this kid who scavenged americium from smoke detectors? Yeah, built a reactor. Yeah.
Starting point is 01:11:18 In a weird way, I've known Sam to care about humanity the entire time I've known him. I've known him about a decade. Yeah. And I don't necessarily think he's being honest. I don't think that he's being nice. I don't think he's being unstrategic. But I think, for as long as I've known him, he's cared deeply, and he's known what was coming.
Starting point is 01:11:55 Yeah, that's every interview I see him do. He is very open. By the way, he is open about some things that aren't palatable. His admission that this will destroy capitalism: he openly says that. He isn't tiptoeing around the dangers of the technology. I initially was very frustrated with ChatGPT because of it being nerfed. But then when I got to hang out with an un-nerfed LLM, I got it right away. It's like, oh fuck, this is why they have to put guardrails on this thing.
Starting point is 01:12:22 It will write a poem for you about how to make meth. You know, that's the test of if you really have an uncensored LLM: ask it the meth recipe, and it'll write a poem, a haiku, a poem in the style of Shel Silverstein with a way to make meth, among many other things. So I get the guardrails, and I think that points towards him being a good person, because he recognizes that the moment you nerf your LLM, it creates an opening for an uncensored LLM. Well, so you can get around the nerfing. Yeah, exactly. So this is the problem: there's no way to stop it. I mean, you're just gonna be putting fingers in the holes of the fucking dam, and there's not enough fingers. But you know, now that we've done the gloom
Starting point is 01:13:06 and the doom, let's also just do the raw excitement, because in part, if we don't recognize how cool this is... Yeah. Right, like there's nothing cooler than a hydrogen bomb, there's nothing cooler than a laptop that you can watch think in real time. Exactly. I'd never thought of hydrogen bombs as cool, and I do want to say after this conversation,
Starting point is 01:13:26 Oh, do me a favor, go into Oculus. Let me say one thing real quick. As a dumb person relative to you, don't snore me. As a dumb person relative to you, it takes me longer to process data. My AI takes a little bit longer to train than yours. You do crowd work? Yeah. Okay. How does anybody do crowd work? It's the most frightening thing for a non comedian to watch a comedian launch
Starting point is 01:13:52 himself into the void. It's fun. That's fun. Okay. Then let's not have an intelligence conversation that insults my intelligence. You guys are amazing. Well, okay. So listen, we were both impressed with the other's field. There you go. But I will say this: in the prior conversation, I recognize the feeling I have, which is like, you have said things here that I will process, but I already feel uneasy, like I already have a weird, gloomy, uneasy feeling. So I'm glad we're jumping into the exciting. If we weren't worried about what was coming, I would be selling you the excitement.
Starting point is 01:14:29 Because look, let's just take a totally different spin on this. Okay. Imagine that you actually said, Eric, I have no idea how far we are from being interstellar, and I said, I think we're actually pretty close. I think we're much closer than we imagine. I love it. And I think that it's really down to just paying some people to take a retirement. Hit the shower.
Starting point is 01:14:53 I don't need to persecute you. You've ruined physics for 40 years. Can't we get these people a nice, you know, retirement in Bora Bora? I mean, let's just get 15 people, you know, millions of dollars to shut up about the idea that there are no other approaches to theoretical physics.
Starting point is 01:15:12 Can we not find lovely parting gifts for our contestants who failed to do physics for 40 years? No, they're not leaving. Okay, but if we could, if those people were cleared out tomorrow, I think we could make progress in physics at the speed that you've seen large language models take over. I mean, that's so incredible. It's so incredible.
Starting point is 01:15:36 And imagine that you're looking at all these beautiful things from the Hubble Space Telescope. Yeah. And instead of saying, oh man, if we could do an Einstein-Rosen bridge and a wormhole there, or maybe if we could slow down time through time dilation and we could use generation ships... Like, imagine instead of all of that, you had somebody say, huh, we know that Einstein's general theory of relativity has to be wrong because of the twin singularities: the initial singularity at the point of inception in the Big Bang, and the Schwarzschild singularity at the bottom of a black hole.
Starting point is 01:16:09 Ergo, there is a next theory beyond Einstein's that does to Einstein what Einstein did to Newton. And if we found that, maybe there are Easter eggs and cheat codes in that. And the problem of getting to the stars has nothing to do with Einstein-Rosen bridges or time dilation or this or that, or really any of the stupid stuff that we've been discussing for n years that leads nowhere. Yeah. And it's not science fiction and it's not pseudoscience. It's just science. Yeah. Okay. And maybe at some level the creator is calling us home and saying, you don't have to perish.
Starting point is 01:16:45 Listen, I'll tell you this. I love that you said the creator is calling us home, because it is interesting to me: terrestrial beings stuck in a gravity well because of the fucking weight of their planet, when we see pictures of out there. Did you say gravity well? Please don't bully me. Just so you know, when I said Einstein-Rosen bridge, I was proud of myself. No, no, no, I was thinking, you know, Mrs. Robinson, you're trying to seduce me. You know, as I'm, like, just going through the mush of my brain trying to summon up Einstein-Rosen bridge. Yeah, with a physicist,
Starting point is 01:17:24 I'm like, he's gonna really like that word. I'm an entertainer. Don't tell me about Einstein-Rosen bridges. But yeah, it's interesting to me that when we do see those pictures, we wanna go there. Yeah. It draws you, it calls you. How do I tell you there's a chance?
Starting point is 01:17:44 There's this meme. So you're saying there's a chance. Right, there's this meme, "So you're saying there's a chance." Yes, I'm saying there's a chance. I'm saying the universe wants to date you. I'm saying the hottest chick in your high school may have taken a liking to you. And it's like, you're not gonna go to the gym, you're not gonna make some money,
Starting point is 01:18:03 you're not gonna put some effort into it. Love it. You're not gonna, you know, get your jacket dry cleaned. No, you're gonna say, where's my Xbox? I wanna play Call of Duty for a few more hours. God damn it. I got you, I got it, it's beautiful, I got it. I see where you're going here. It's beautiful.
Starting point is 01:18:21 It's beautiful. And it's so beautiful that probably when people hear, you know, that girl likes you, you're like, shut the fuck up, no she doesn't. It's almost too beautiful. It's almost unbearably beautiful. It's so unbearably beautiful. And oh, if you're me in high school, many a heartbreak, you're like, I'm not even gonna open up to the idea. My dream is, if my physics stuff ever gets evaluated and it turns out that it's not wrong, and I get invited to the Tonight Show,
Starting point is 01:18:52 the song I wanna come out to is "Even the Losers": baby, even the losers get lucky sometimes. Have you ever listened to that thing? No. Tom Petty wrote an amazing song that we gloss over, right? And it's about a guy who for some reason ends up getting lucky, and I don't mean lucky in just having sex, but somehow, yeah, some girl who's way out of his class,
Starting point is 01:19:17 Way out of his league. Smiles on him, and they share a moment. You know, and the idea is, man, it's such a drag to live in the past, because he didn't know what he did to make it work. Okay? Now my claim is, I think that the universe is winking at us. Yeah.
Starting point is 01:19:38 I think it's saying, hey, well done. It's time to go. Yeah. Come. Visit. You've seen the pictures. You launched a space telescope. I mean, it's insane. It is beautiful. Okay. And the mirrors unfolded and everything worked. And you're seeing things that have never been seen, right? Like the deep field. And you don't want to go? What the hell is wrong with you? Yeah, it's like you're born in some corner of a hometown of 217 people and you don't even want to go to the other side of town. Have you ever read The Painted Bird? No. Oh my god, you'll love it. Tell me. This is you.
Starting point is 01:20:22 It's a very, very disturbing book. Jerzy Kosinski. It's this kid during, I think it's post-World War II, he's wandering through the forest. It's a horrible time and he's a survivor. He's a survivor and he's just doing everything he can to survive, and he shouldn't be surviving. He's a kid in this apocalyptic landscape. But this is you.
Starting point is 01:20:47 There's this sad scene. It's an incredible scene. He stumbles upon a hutch of rabbits. Big fat rabbit in a cage. It's going to get eaten. He opens the hutch up. He describes the forest, beautiful forest right there where that rabbit's meant to go. Rabbit walks down out of the hutch,
Starting point is 01:21:09 sniffs the air, and then walks right back into the cage. That's you. Hey, do we have an ability to play a video? Yeah. Can we get to Instagram and go to OlegCricket? My favorite artist on planet earth. Oleg, and then cricket, C-R-I... This is incredible to pull up on video, my god. I've been doing this on myself now.
Starting point is 01:21:42 Let's keep going down, down, down, down. This is not what I expected you to pull up. Well, I asked him to stop doing the art that I wanted, and so he's doing this instead. Hey, go scroll that. No, no, no. What's that? No.
Starting point is 01:21:58 OK. We don't care. He's got an aesthetic. He's got an erotic eye. Yes. Let's keep going. He does. No, keep going farther, farther, farther, much farther.
Starting point is 01:22:11 This is my favorite artist and I had to beg him to stop creating. What do you mean? I'll show you. Well, let's look at one of the ones that's covered up. Yeah. No, no, no, not see why but let's see real Fuck all right. Let's watch this thing
Starting point is 01:22:31 With the sound on. The sound is good. No. No. Jesus. Okay, now let's keep... I wanna show you his Mona Lisa. So in other words. Is that him doing it? That's him on the ledge? Yes, keep going. I mean, you have no idea.
Starting point is 01:23:12 Okay, wait, all right. Go back over to the left of the top, where Seattle is. Yeah, there you go. This is his Mona Lisa. This is his Mona Lisa. Shit. No, yeah, and his girlfriend screwed up the first shot, so he had to do it again. Okay, now what I'm trying to say is
Starting point is 01:23:53 That, straight into my veins. Yeah, straight in. We are incredible. Yeah, that's us. Yes, okay. We are not those people most of the time. Right. And what I'm trying to say is, where are the Oleg Crickets of physics? Where are the philanthropic Oleg Crickets? Where is the security detail? Where's Delta Force, man? I want to talk to Delta Force guys. I don't want to talk to people who are so defeatist. Right. We do the most incredible things regularly.
Starting point is 01:24:30 Yeah. And that's why I want to save us. Yeah. I mean, the earlier pictures would be a good reason to save us too. But that's the thing, you know, what people don't understand is, what is the point of eroticism?
Starting point is 01:24:43 Everybody laughs about eroticism because it's like sex, and we can't talk about sex because we have to giggle, because it turns us hot. Yeah. Okay. But eroticism is something different. Eroticism is so far beyond procreation. Eroticism is about compelling people to do the ultimate, right? Because it's the reward. It's the nectar for absolutely heroic behavior, for far-right-tail, high-kurtosis, positive-skew, F-you middle finger,
Starting point is 01:25:20 love this life, right? Yeah. And so my claim is that, you know, why did I beg this guy to stop? Because you can't cheat death every time. And the first time he fails, the first time he slips, his entire work and his canon go into the toilet. Right.
Starting point is 01:25:37 And when I sit here and I get frustrated that nobody's listening, nobody cares, I go back to Oleg. He's the sweetest guy. He has a great relationship with his mom. I love him. You saved his life. No, I don't even know if I had an impact on him. You probably did. I don't know. I mean, after you talk to him, all we get is, like, hot girls. But, like, you know, all sorts of people do rooftop parkour. He's funny.
Starting point is 01:26:05 He's witty. He's taking a bike or a skateboard on the edge of the world's tallest buildings in the rain. Yeah. And he's making sure that he slips and doesn't do things perfectly. And the fact that he survived just tells you how good he is, how great he is. And my feeling is every time I hear, well, you know, I don't believe in the great man
Starting point is 01:26:28 theory of history, it's like, oh, well, who else is doing this at this level? No one. Got it. I got you, man. You are inviting a very, as far as I'm aware, small group of people to try to break out of some kind of... In your world, I care about Carlin.
Starting point is 01:26:44 I care about Bruce. I care about Pryor. I really, really care. Who am I thinking of who did the marketing, the great marketing routine? Oh, Hicks. Yeah. Bill Hicks, yeah. Oh my God, what a great insight about humanity.
Starting point is 01:27:03 Kill yourself. No, I really mean it. Kill yourself. Oh, Bill's going after the anti-marketing dollar. That's a good dollar. That's one of my favorite jokes. One of my favorite jokes. Yeah, it's the best.
Starting point is 01:27:13 You know, one of my favorites, negative comedy. The Flight of the Conchords joke, it's like, yo, yo, yo, I'm the Hiphopopotamus, my lyrics are bottomless. I thought that was incredibly funny, like it just stalls out. So good, I didn't get it for a second. Listen, so I love it, and I think, you know, like with comedy, the difference is, it seems like with comedy there is an invitation to do some form of cultural parkour. Whereas in your field, it feels like, from what you're saying, and I'm not
Starting point is 01:27:52 familiar with it other than having listened to you, and that's, like, gotten me a vague understanding of how musty it is in there. There isn't the invitation. God, in this book by Schopenhauer, he's saying, I don't remember the phrase, obviously it's in some other language, a famous phrase which is: people sing the song of the person giving them bread. And because of the nature of the beast that you're in,
Starting point is 01:28:25 you have to get funding, and if you take parkour risks... It's a lobotomy. We've been lobotomized. How do I teach Tony Hinchcliffe differential geometry? How do we get Dave Chappelle to understand partial differential equations? Why doesn't Rogan know geometric algebra? Listen, you're in trouble, because you have to do it, not teach us how to do it. You might just have to be the one who invents the thing. I mean, this is... this is not Buddhist doctrine, but there's a cool story.
Starting point is 01:28:58 Right. Buddha's under the Bodhi tree, about to get enlightened. Mm-hmm. He attains all these siddhis, powers, and he astrally projects into the, I don't know, the palace of Indra. All the gods are there: Brahma's there, Vishnu, all of them are there. Brahma is like, basically, you can't come here without an invitation. What are you doing, human? And sends him back. Well then, immediately, under the Bodhi tree Brahma appears, God appears, and says to him,
Starting point is 01:29:30 hey, I'm sorry about being rude. Because Buddha had gone up there because he wanted to ask: how do I end the suffering of humanity? How do we stop this? All the pain, all the suffering. And God, Brahma, the originator, says to him, listen, I'm so sorry, but here's the problem. I've just been here longer than the other gods, and I just kind of let them think I was their dad. I'm really sorry, but I have no idea how to do that. So you're going to have to figure it out. Yeah, I love it. But that's you. You have to do it. You're not gonna get these physicists to... I'm trying, but in part...
Starting point is 01:30:16 But the greatest intellectual mistake of my entire life, I think, is not understanding curation. I think I was so wrong. I've never been this wrong about anything, and I'm wrong about a lot. Until someone says, I see what he's saying... When is this thing gonna air that you and I are doing? This will air this week. Can we make it after my Rogan episode? Oh, yes.
Starting point is 01:30:48 We absolutely... When is your Rogan episode? I don't know. But it has to come after that because I don't want to screw... Deal? Deal! Alright. So, you know, I'm on with Terrence Howard. And, you know, Terrence is wrong
Starting point is 01:31:04 about a lot of stuff. I mean, it's not small. Yeah. But he's right about some stuff. And it could be that the stuff that he's right about could be really, really cool. It's not the stuff that he thinks that he's right about, about the universe and you know,
Starting point is 01:31:20 he's figured out the four forces. He might be right about the drone that he built, you know? And he needs someone to do the slow clap. Because if he doesn't get one of us to do a slow clap... And I've watched my colleagues shit on him, except for two colleagues, Edward Frenkel and Stephon Alexander. And they said, you know, I kind of like him. Yeah, maybe he's committed fraud, maybe he's in error,
Starting point is 01:31:52 and maybe he's a bit of a liar. But it's kind of cool what he's doing below it. And so at some level, you're nobody until somebody figures out what you did and analyzes it and says something right about it. You know, that's what curation is: somebody has to point to you and say, I can see what he's saying. Yeah. And right now I'm, like, 41 years into a conversation with myself, and so
Starting point is 01:32:22 you know, that's very frustrating. If we could pull up an interaction... Like, maybe my favorite film is Kung Fu Panda. Are you kidding me? Kung Fu Panda? I haven't seen it yet. For some reason, I avoided watching it. I didn't think it'd be good. Kung Fu Panda, and then maybe put peach tree. No, no, no, it's not. It's not tree. This is- No, no, no, it's not good.
Starting point is 01:32:46 This is it. This is Kung Fu Panda One. Oh my God, this is awesome. Nobody pays for YouTube. It's very funny, we're all too cheap, because I think it's not the money, we just don't want to give them money. Oh shit, yeah, the audio is weird.
Starting point is 01:33:06 So you have to understand that I'm obsessed with the problem, which is: I think teaching destroys self-teaching. It could. And we haven't figured out how self-teachers can leave self-teachers as their students, because we keep teaching, and then we destroy it. This film is addressed to that problem. And that's sort of the most remarkable thing about this film, is that I got dragged to it.
Starting point is 01:33:35 I was very unexcited to see this. Yeah, I would have been unexcited. All right, let's go to the beginning. Energy is one of the number one priorities when you're a parent. And AG1 contains ingredients that support sustained energy so I can be the best parent I could possibly be every day. No kid wants a tired 50-year-old parent. I already decided to be an old dad.
Starting point is 01:34:04 I don't need to burden my kids with being some frail trembling moody thing. I need that AG1 powder. My family needs it. Also, it's great for my gut biome. Let me tell you that could use some help. There's one product I trust to support my whole body health. It's AG1 and that's why I've partnered with them for so long. It's easy and satisfying to start your journey with AG-1. Try AG-1 and get a free one-year supply of Vitamin D3K2 and five free AG-1 travel packs with your first purchase at www.drinkag1.com forward slash Duncan. That's www.drinkag1.com forward slash Duncan.
Starting point is 01:34:40 Check it out. What happens in this film is that the two important characters, the two pivotal characters, meet really twice but for only one substantive conversation. And under this tree, and I hope this appeals to your Buddhism. The original self-teacher touches the student but doesn't teach him. He has to convey what he has to convey in some way that he doesn't destroy the gossamer that is self-actualization. Oh yeah. That's so cool. Yeah, that is classic. Very frustrating in Buddhism.
Starting point is 01:35:30 Describe the scene, because people... There are three characters here that really matter. There's Oogway, who unravels the secrets of harmony and focus at this pool of sacred tears, who Shifu is scared of. Now, Shifu is the ultimate technical kung fu master. He's absolutely amazing, but he's in awe of his teacher, who's this turtle, which is a terrible kung fu archetype. Who's scared of a slow-moving turtle? And the panda is huge and fat and is basically a fanboy, in modern parlance, of this kung fu school.
Starting point is 01:36:07 And you have the five great kung fu students that represent the various styles of kung fu. And the turtle chooses the panda as his successor. And the panda isn't a kung fu artist. He's so desperate to get into the kung fu competition to choose a successor that, after failing 18 times, he straps himself to a cart filled with fireworks and makes a makeshift rocket to hop the wall. He embarrasses himself and lands at the feet of the great master, who says, the universe has brought us the Dragon Warrior. So then the question is like, okay, Shifu reveres Oogway. Oogway says, this is my successor.
Starting point is 01:36:48 Everyone's angry because it's not them. Yeah. The panda says, like, how can it be me? Yeah. And the panda is overeating these peaches, which is what he does when he gets upset, and he's at the sacred peach tree, like, stealing the peaches. So he's like, you know, he's that fraud, thief, bad kid, yeah, with a pure heart, and
Starting point is 01:37:12 Oogway just barely touches him, and they have this one brief conversation. My life revolves around this. I have a son who I didn't teach. And he's picked up almost all the stuff that I know and love. And people imagine, well, you taught him all those things. I'm like, no. Right. Because I could see I had a self-teacher.
Starting point is 01:37:41 Yeah. So to me, what we're doing with peer review, and with, like, trying to spot the flaw in each other's arguments and then, you know, taking a victory lap, well, your stuff doesn't work. Right. It's the most egoic orgy. It's an orgy of ego. Right. And it's like, we've forgotten what we're actually doing. We're LARPing. We don't realize that we're on the clock. And then, you know, the No Country for Old Men scene in the gas station. You know this?
Starting point is 01:38:16 Yes. Right. So Anton Chigurh, you know, is basically saying, I guess I didn't put nothing up. It's like, you've been putting it up your whole life. You just didn't know it. Yeah. And that's the thing. It's like you're on the clock. Right. Every day you're on the clock. I have about 10,000 days left if I take care of myself and I don't have any accidents, right? Maybe longer. You don't have to say that. No, just- Maybe shorter. Well, right, maybe much shorter.
Starting point is 01:38:47 But my point is, you don't get that many days in a life. You could figure out how many weeks you'd have left and it would freak you out. Like, years you sort of are used to, but you haven't calculated your number of months. Yeah. Yeah, this was at a Ram Dass retreat. I know you hate him. No, no. At a Ram Dass retreat... do you know who Bob Thurman is? You'd love him. Yeah, he's brilliant. He's a Buddhist scholar. But did you meet Ram Dass? Yeah, he married me and my wife.
Starting point is 01:39:23 No kidding! Yeah. He's kind of a hero to me, I should say. Oh, yeah, you fit right in, and he would have loved you. Thurman is a Buddhist scholar, he's brilliant. This is a Columbia professor? I don't know if he's a Columbia professor. He might have been a Columbia professor, I think so.
Starting point is 01:39:53 I'm not positive, I just know him from the retreats, and whenever he comes, everyone's so lucky, because he's just so great. And he gets up in front of all these hippies. Yeah. Many of them are, like, into bhakti yoga, you know, many of them are theists. Okay. And the first thing he says is: be here now. What does that even mean? He goes, what does that mean? Does anyone know what that means? Does anyone even know what that means? He goes, practice, practice, practice. Everyone here is always talking about their practice. He's like, I'd like to see some of you play for once. You know, like, pointing out, I think, exactly what you're pointing out, which is, at some point there needs... there's action here. Let's do it. Not just, like, let's do it. Yeah. Yeah, let's go. Are you an Osho fan? I think he's interesting, super interesting, yeah, and a lot of the things he said are really funny to me.
Starting point is 01:40:39 But I haven't focused on him at all. I've got one of his books, and I just didn't... He's very disarming. You know, in part, one of these things is that people don't understand what cults are, because they imagine cults are just stupid. Oh yeah, right. Right? And it's like, it's pretty hard to run a cult if you don't have anything really insightful to say.
Starting point is 01:40:57 Oh, I know. Yeah. That's true. That's what's beautiful about them. That's why I get frustrated in cult documentaries when they leave out the philosophy. It's like, don't just show the end stage collapse of the cult. Let's hear what was sticky about it.
Starting point is 01:41:11 Let's see how beautiful the interracial relations were with, you know, Jonestown, before it all went south. Exactly. Right? So what drew all those people in? Because inadvertently you make the victims seem like idiots. And it's like, well, with Scientology, the reactive mind is in part the discovery of large language models: that you're just doing autocompletion. That's right. So, yeah, and I don't mean to derail what you're saying, but this does
Starting point is 01:41:38 point back to what you were saying earlier, which is, like, with Terrence Howard: if we just reject something completely, if we just throw the baby out with the bathwater, we're never ever gonna get the baby. And sometimes there's a lot of bathwater, but there's a baby in there, and learning how to filter it... Sure. What pisses me off about Terrence is that I would never have done this episode if it weren't for Joe. I didn't do it for Terrence, I did it for Joe, because I felt like Terrence fooled Joe in a certain sense.
Starting point is 01:42:12 And Joe's very, you know... because Joe is so genuine, and he's got such a beautiful heart, and he's so smart, and he pretends not to be, whatever. He built this thing that is huge. And so he still has the idea of, like, hey, it's just my little studio, and I've had conversations with people I enjoy talking to. And it's like, yes, but the world reverberates around it.
Starting point is 01:42:32 And now you've got a problem, because it isn't, you know... you're a really big dog. And so you just created a mass delusion, right? The Terrence Howard mass delusion was catalyzed by our friend Joe Rogan. And, you know, Joe will say something like, I've got this amazing bullshit detector. And it's like, well, we all have a great bullshit detector for the bullshit that we know how to detect. And then, you know, I had this conversation with Bret Easton Ellis about
Starting point is 01:43:02 seduction. If you're too smart, you'll never be seduced. And if you never get seduced during your life, you really haven't lived. Yeah, that's right. You gotta fall for something. Hell yeah. It's the best. Are you kidding? It can be. You can also just find out that, you know, you got taken advantage of and you can be scarred for the rest of your life but a great seduction. Wow. Can you imagine passing through this life never having had that pleasure?
Starting point is 01:43:34 I can't imagine passing through this life never having seen your eyes. If only I weren't so goddamn straight. If only I weren't so god damn straight. Man, you know, I get it. You really, I get it, it's good. It's good to start off with the gloomy stuff and it's beautiful, even just the idea that just maybe, just maybe, there is a way out of this gravity well.
Starting point is 01:44:07 Just the idea alone, even though I don't understand anything about physics, is so invigorating. It gets you going, doesn't it? Come with me if you want to live. I wanna live! Oh, well that's the thing, right? Like when Arnold says that, come with me if you wanna live. You wanna talk about seduction.
Starting point is 01:44:25 What a line. Yeah, yeah. Come with me if you want to live. Jesus said it first. I got to tell you, I'm really proud of him. Of all our carpenters, perhaps he's the greatest we ever produced. He's a, you know, I've seen better carpenters,
Starting point is 01:44:43 but he was pretty- Really, the Japanese joint work is amazing. Yeah, he couldn join work is amazing. Yeah, he couldn't do that shit. Ha ha ha ha ha! But he was a good public speaker. Yeah. I could talk to you forever.
Starting point is 01:44:56 Let's do that. Let's do it. You know what you need to do? You need to create a Discord server where people like Joe and, like, comics and people can- And like, I created one and it became filled with psychopaths. So all they wanted to do was, like, we would monitor what they would be talking to each other about, like, "I've got to fool Eric into thinking I like him so I can get on Rogan." That's the worst. Oh my God. Well, PhD is doing this.
Starting point is 01:45:25 Oh, it's so gross when people do that. It's so disgusting. I hate that. I hate that. It makes you feel so gross and used when you realize like, what the fuck? Like you were just trying to like get on my friend's podcast.
Starting point is 01:45:36 But that's the thing. If you have nothing, you spend your time trying to figure out how you can make the world better by tearing people down. Right. And people who are very, very talented and skilled who have nothing are very, very dangerous. And what I wish on my enemies is that they
Starting point is 01:45:58 would have something beautiful, because then they wouldn't be my enemies. That's it. Wow. You really are a Buddhist Well, I'll end on that you've accepted God Thank you for your time. Duncan what an honor to be invited. I'm so honored by you being a guest. Thank you Where can people find you? Is there anything I could- I've tried, you know, I've been sort of retreating. I would say find the YouTube channel,
Starting point is 01:46:28 find the Instagram, Eric R. Weinstein, Eric R. Weinstein on X Twitter. But the thing is is that I have, like you know, four years ago, I stopped the podcast because of all of the madness and I didn't like being commodified or misunderstood and all this kind of stuff. I wanna be a human being.
Starting point is 01:46:47 I think I'm gonna have to bring it back. And the whole, well, the issue is, is that it's entertainment, jazz hands, right? But the whole point of being entertaining is the payload. The entertainment is the vehicle. And the payload is what matters to me. What I have to do is to figure out some way of getting people to realize.
Starting point is 01:47:06 I'm happy to joke, I'm happy to be funny, to be light, all of those things. But you're never going to live the amazing life that you watch in movies that move you, you know, just shake you to your core, if you don't realize that what I'm saying is real. The clock is ticking. The opportunity has never been bigger, but the danger has never been more imminent in your entire life. And I don't know whether we have, like, two months or 70 years. I really used to think it was 300 years. I keep bringing that number in farther and farther. Until we have adults, we're not going to be able to actually
Starting point is 01:47:46 execute on something So what I'm trying to do is to try to figure out how do I create an entertaining product where people actually Don't just say I want another episode. They say holy cow. I get it Well, I'll help you figure it out. I can translate this for you I'm gonna go study Einstein Rosen bridges so that I don't never talk about it. Teach me misdirection and timing. Easy! Easy! Thank you so much, my friend. 30th head into your local store and save big on all your favorite summer treats when you buy two or more. Shop for some of your favorite sweets like Albany's gummies, Nerd's gummies,
Starting point is 01:48:29 Hershey's candy bars, licorice from American Licorice Company, Toffee Fae, and M&Ms and save when you buy two or more. Offer expires July 30th. Restrictions apply, promotions may vary. Visit Albertsons, Vons, or Pavilions.com for more details. Hello, it is Ryan and we could all use an extra bright spot in our day, couldn't we? may vary. Visit Albertsons, Vons, or Pavilions.com for more details. That was Eric Weinstein, everybody. All the links you need to find them will be at dunkotrustle.com. Thank you so much for watching and listening. I love you. Bye. Hey, it's Ryan Seacrest for Albertsons, Vons, and Pavilions.
Starting point is 01:49:29 Summer is the perfect time to save. Now through July 30th, earn four times rewards points when you shop for some of your favorite personal care brands. Then use your rewards points for discounts on groceries or gas on future purchases. Shop from some of your favorite brands like Crest, Align, Pantene, Loves, Always, and Gillette Fusion and earn four times rewards points. Offer expires July 30th. Restrictions apply. Promotions may vary. Visit Albertsons, Vons, or Pavilions.com for more details. It is Ryan here and I have a question for you. What do you do when you win? Like are
Starting point is 01:50:01 you a fist pumper, a woo-hooerer? A hand clap or a high-fiver? I kind of like the high-five, but if you want to hone in on those winning moves, check out Chumba Casino. At ChumbaCasino.com choose from hundreds of social casino style games for your chance to redeem serious cash prizes. There are new game releases weekly plus free daily bonuses, so don't wait. Start having the most fun ever at ChumbaCasino.com.
