Lex Fridman Podcast - #314 – Liv Boeree: Poker, Game Theory, AI, Simulation, Aliens & Existential Risk

Episode Date: August 24, 2022

Liv Boeree is a poker champion and science educator on topics of game theory, physics, complexity, and life.

Please support this podcast by checking out our sponsors:
- Audible: https://audible.com/le...x to get 30-day free trial
- GiveWell: https://www.givewell.org and use code Lex Fridman Podcast
- Linode: https://linode.com/lex to get $100 free credit
- Indeed: https://indeed.com/lex to get $75 credit
- ExpressVPN: https://expressvpn.com/lexpod to get 3 months free

EPISODE LINKS:
Liv's Twitter: https://twitter.com/liv_boeree
Liv's Instagram: https://instagram.com/liv_boeree
Liv's Facebook: https://facebook.com/livboeree
Liv's YouTube: https://youtube.com/user/LivBoeree
Books and resources mentioned:
Novacene: https://amzn.to/3wcVqEo
POSITIVITY: https://amzn.to/3K2pfxj
Meditations on Moloch: https://slatestarcodex.com/2014/07/30/meditations-on-moloch

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
- Check out the sponsors above, it's the best way to support this podcast
- Support on Patreon: https://www.patreon.com/lexfridman
- Twitter: https://twitter.com/lexfridman
- Instagram: https://www.instagram.com/lexfridman
- LinkedIn: https://www.linkedin.com/in/lexfridman
- Facebook: https://www.facebook.com/lexfridman
- Medium: https://medium.com/@lexfridman

OUTLINE:
Here's the timestamps for the episode. On some podcast players you should be able to click the timestamp to jump to that time.
(00:00) - Introduction
(06:23) - Poker and game theory
(13:26) - Dating optimally
(18:18) - Learning
(26:30) - Daniel Negreanu
(31:38) - Phil Hellmuth
(34:11) - Greatest poker player ever
(38:29) - Bluffing
(48:50) - Losing
(58:05) - Mutually assured destruction
(1:03:00) - Simulation hypothesis
(1:19:38) - Moloch
(1:48:50) - Beauty
(2:00:58) - Quantifying life
(2:21:08) - Existential risks
(2:39:31) - AI
(2:49:22) - Energy healing
(2:56:32) - Astrophysics
(2:59:26) - Aliens
(3:25:08) - Advice for young people
(3:27:13) - Music
(3:35:02) - Meaning of life

Transcript
Starting point is 00:00:00 The following is a conversation with Liv Boeree, formerly one of the best poker players in the world, trained as an astrophysicist, and now a philanthropist and an educator on topics of game theory, physics, complexity, and life. And now, on to the full ad reads. As always, no ads in the middle. I try to make this interesting, but if you skip them, please still check out our sponsors. I enjoy this stuff. Maybe you will too.
Starting point is 00:00:27 First we've got Audible. It's an audiobook service that gives me thousands of hours, I don't think I'm exaggerating when I say that, of educational material. I've been running every single day, almost with no exception, at least eight miles. That's often anywhere from one to two and a half hours, depending on the distance I'm doing. And my trusted companion through that is either brown noise, which I just listen to YouTube videos of, or audiobooks. If I'm not listening to brown noise when I'm running and focusing on my thoughts, I am listening to audiobooks. And that's usually the source of fun, the source of intense
Starting point is 00:01:05 thinking for me, exploration, it's a journey into another world, into another time, and an audio book reading of a book can really take you to that place, it can really transport you there. New members can try the thing free for 30 days at audible.com slash Lex or text Lex to 500 500. This show is brought to you by GiveWell. The research charitable organizations and only recommend the highest impact evidence-backed charities. It may be counterintuitive but giving money or helping others is actually a really difficult
Starting point is 00:01:43 process. It's not as simple as throwing cash up in the air and hoping to win catches it and redistribute it somehow optimally to the people who need it most. It really is a complicated process. Obviously, you have to avoid corruption, you have to avoid bureaucracy, all the overhead that has to do with the giving process. You know, if you build too big of an organization, you're going to spend more money on running the organization than you are and actually giving money directly to people or helping people directly in some way. And that's what GiveWell optimizes and really understands and helps you figure out which are the good charities to give money to. Donors have used GiveWell to donate more than $750 million. Go to givewell.org, pick Podcast, and select Lex Friedman Podcast, head check out.
Starting point is 00:02:34 This episode is also brought to you by Linode. Linux Virtual Machines. I literally smile every time I say Linux. It must be genetic. I think the first time I interacted the linux terminal, it could have been a unix terminal, maybe a son. So basically the command line where you could do have access to bash and purl. Anyway, all of that was a linux and it opened up to me the world of power, the fingertips of a programmer. I mean, Linux really makes that clear to you, much, much, much more than those windows.
Starting point is 00:03:10 Anyway, I'm a huge fan of computer infrastructures, like Linux, my favorite one, that makes all that power available to you in the cloud, has a million things you could do. Small projects, huge systems, all of it. I use Linux for all kinds of stuff, and stuff and it really really really is a great computer infrastructure and made super simple and easy to access and monitor and all that kind of stuff. Visit Linode.com slash Lex for free credit. This show is also brought to you by Indeed, a hiring website. I've used them for many hiring efforts I've done for the teams I've led in the past. It really is the most important thing you
Starting point is 00:03:51 could do in your life, which is optimize the inner circle, the folks you surround yourself with. Of course, you could do that with family, you could do that with friends, but because a lot of us spend such a large percentage of our time at work, we should also do that with the people who work with. And if you're in the position of a manager or a hiring manager, and you get to optimize the inner circle, this isn't just about productivity. It's not just about kind of filling a gap in the team. It's about bringing meaning and happiness to everybody involved. Companies come and go, but a deep, meaningful life journey is something bigger than all of that. So yeah, use the best tools for the job. I like indeed. They do a really great job. They
Starting point is 00:04:39 have a special offer only available for a limited time. so go check it out at indeed.com slash Lex. This shows also about Dubai ExpressVPN. I use them to protect my privacy on the interwebs. It protects you a bit more from ISPs, Internet service providers from being able to collect your data, even when you navigate to shady websites, which I know you do in incognito mode and Chrome. You can watch shows that are geographically restricted, especially PM as a VPN. So it should do the VPN thing well.
Starting point is 00:05:15 It should be fast, it should work anywhere, like on any device, again, anywhere, everywhere, at least anywhere I can think of, it works, it works fast, it's very intuitive, it's very simple, it does the thing it's supposed to do and does it well. In terms of execution, I mean, there's very few things that admire then system, software systems that do the job, they're supposed to do and do it well. And all the improvements, all the version updates that they do, just push further the ability to do the thing well. They don't shove all the kind of weird features into it. It just does the thing it's supposed to do and does a well.
Starting point is 00:05:53 Go to expressweepen.com slash lexpod for an extra three months free. This is the Lex Friedman podcast. To support it, please check out our sponsors in the description. And now, dear friends, friends here's live Bore. What role do you think luck plays in poker and in life? You can pick whichever one you want, poker or life and or life. The longer you play, the less influenced luck has, you know, like with all things, the bigger your sample size, the more the quality of your decisions or your strategies matter.
Starting point is 00:06:44 So to answer that question, yeah, in poker, it really depends. If you and I sat and played 10 hands right now, I might only win 52% of the time, 53%, maybe, but if we played 10,000 hands, then I'll probably win like over 98, 99% of the time. So it's a question of sample sizes. And what are you figuring out over time? The betting strategy that this individual does, or it literally doesn't matter against any individual over time. Against any individual over time, the better player, because they're making better decisions. So what does that mean to make a better decision?
Starting point is 00:07:16 Well, to get into the realness of Christie already, basically poker is the game of math. There are these strategies familiar with like Nash Equilibria, that's true. So there are these game theory optimal strategies that you can adopt. And the closer you play to them, the less exploitable you are. So because I've studied the game a bunch, although admittedly not for a few years, but back in, you know, when I was playing all the time, I would study these Game Theory Optimal Solutions and try and then adopt those strategies when I go
Starting point is 00:07:50 and play. So I'd play against you and I would do that. And because the objective when you're playing Game Theory Optimal, it's actually, it's a loss minimization thing that you're trying to do. Your best bet is to try and play a similar style. You also need to try and adopt this lost minimization. But because I've been playing much longer than you, I'll be better at that. So, first of all, you're not taking advantage of my mistakes. But then on top of that, I'll be better at recognizing
Starting point is 00:08:21 when you are playing suboptimally and then deviating from this Game Theory Optimal strategy to exploit your bad plays. Can you define Game Theory and national equilibrium? Can we try to sneak up to it in a bunch of ways? What's the Game Theory framework of analyzing poker, analyzing any kind of situation? So Game Theory is just basically the study of decisions within a competitive
Starting point is 00:08:48 situation. I mean, it's technically a branch of economics, but it also applies to like wider decision theory. And usually when you see it, it's these little payoff matrices and so on, that's how it's depicted. But it's essentially just like study of strategies under different competitive situations. And as it happens, certain games, in fact, many, many games have these things called Nash Equilibria. And what that means is when you're in a Nash Equilibrium, basically, it is not, there is no strategy
Starting point is 00:09:22 that you can take that would be more beneficial than the one you're currently taking assuming your opponent is also doing the same thing. So it would be a bad idea, if we're both playing a game three optimal strategy, if either of us deviate from that, now we're putting ourselves at a disadvantage. RockPapuses is actually a really great example of this. Like, if we were to start playing rockpapers, you know, nothing about me and we're going to play for all our money, let's play 10 rounds of it. What would your sort of optimal strategy be, do you think? What would
Starting point is 00:09:55 you do? Let's see, I would probably try to be as random as possible. Exactly. You want to, because you don't know anything about me, you don't want to give anything about a way about yourself. So ideally, you'd have a little dice or somewhat perfect randomizer that makes you randomize 33% of the time each of the three different things. And in response to that, well, actually, I can kind of do anything, but I would probably just randomise back too, but actually it wouldn't matter because I know that you're playing randomly.
Starting point is 00:10:29 So that would be us in a Nash equilibrium where we're both playing this like unexploidable strategy. However, if after a while you then notice that I'm playing rock a little bit more often than I should. Yeah, you're the kind of person that would do that, wouldn't you? Sure, yes, yes, yes, I'm more of a scissors girl. But anyway, no, I'm a randomizer. So you notice I'm throwing rock too much for someone like that. Right. Now you'd be making a mistake by continuing playing this game theory optimal strategy because, well, the previous one, because you are now, I'm making a mistake and you're
Starting point is 00:11:03 not deviating and exploiting my mistake. So you'd want to start throwing paper a bit more often. In whatever you figure is the right sort of percentage of the time that I'm throwing rock too often. So that's basically an example of where, you know, what game three optimal strategy is in terms of loss minimization. But it's not always the maximally profitable thing if your opponent is doing stupid stuff, which in that example. So that's kind of then how it works in poker, but it's a lot more complex. And the way poker players typically nowadays they study, the games change so much, and I think we
Starting point is 00:11:38 should talk about how it's sort of evolved. But nowadays, the top pros basically spend all their time in between sessions running these simulators using software where they do basic Monte Carlo simulations, sort of doing billions of fictitious self-play hands. You input a fictitious hands scenario like, what do I do with Jack 9 suited on a King 10 for two spade board and against this bet size. So you'd input that press play, it'll run, it's billions of fake hands, and then it'll converge upon what the Game Theory optimal strategies are. And then you want to try and memorize what these are, basically they're like ratios of how often, what types of hands you want to bluff and what percentage of the time.
Starting point is 00:12:26 So then there's this additional layer of inbuilt randomization built in. Yeah, those kinds of simulations incorporate all the betting strategies and everything else like that. So they're supposed to some kind of very crude mathematical model of what's the probability you win just based on the quality of the card. It's including everything else too. The game theory of it. Yes, essentially.
Starting point is 00:12:46 And what's interesting is that nowadays, if you want to be a top pro and you go and play and he's really like the Super High stakes tournaments or tough cash games, if you don't know this stuff, you're gonna get eaten alive in the long run. Yeah. But of course, you could get lucky over the short run and that's where this like luck factor comes in
Starting point is 00:13:01 because luck is both a blessing in a curse. If luck didn't, you know, if there wasn't this random element and there wasn't the ability for worse players to win sometimes, then poker would fall apart. You know, the same reason people don't play chess, professionally for money against, you know, you don't see people going and hustling, chess like not knowing, trying to make a living from it,
Starting point is 00:13:22 because you know there's very little luck in chess, but there's quite a lot of luck in poker. Have you seen a beautiful mind that movie? Years ago. Well, what do you think about the game, theoretic formulation of what is it, the hot blonde at the bar? Do you remember the way they illustrated it, they're trying to pick up a girl at a bar and there's multiple girls, they're like a friend, it's like a friend group and you're trying
Starting point is 00:13:43 to approach. I don't remember the details, but I remember. Don't you like then speak to her friends? Yeah, yeah, yeah. Just like that, fame, disinterest. I mean, it's classic pickup artists stuff, right? You wanna? And they were trying to correlate that somehow
Starting point is 00:13:55 that being an optimal strategy game theoretically. Why? What, what, like, I don't think, I remember. I wanna imagine that they were, I mean, there's probably an optimal strategy. Is it, does that mean that there's an actual national equilibrium of like picking up girls? Do you know the, the marriage problem?
Starting point is 00:14:14 It's optimal stopping. Yes. So where it's an optimal dating strategy where you, do you remember- Yeah, I think it's like something like you, you know, you've got like a set of 100 people you're going to look through and after how many do you now after that after going on this many dates out of a hundred, at what point do you then go, okay, the next best person I see is that the right one? And I think it's like something like 37%.
Starting point is 00:14:38 Oh, one over E, whatever that is. Right, which I think is. Yeah, yeah. I'm going to seven. Yeah. I'm going to fact check that. Yeah, so, but it's funny under those strict constraints, then yes, after that many people, as long as you have a fixed size pool, then you just picked the next person that is better than anyone else before.
Starting point is 00:15:02 Anyone else see him? Yeah. Have you tried this? Have you incorporated it? I wanted those people. I'm a, and we're going to discuss this. I, and what do you mean those people? I try not to optimize stuff.
Starting point is 00:15:17 I try to listen to the heart. I don't think I like my mind immediately is attracted to optimizing everything and I think that if you really give into that kind of addiction that you lose the the joy of the small things the minutiae of life, I think. I don't know. I'm concerned about the addictive nature of my personality in that regard. In some ways, while I think on average, people under try and quantify things or under optimize, there are some people who, you know, it's like with all these things, it's a balancing act. I've been a dating house, but I've never used them. I'm sure they have data on this, because they probably have the optimal stopping control problem.
Starting point is 00:16:10 Because there aren't a lot of people that use social dating apps are on there for a long time. So the interesting aspect is like, all right, how long before you stop looking, before it actually starts affecting your mind negatively such that you see dating as a kind of game. A kind of game versus an actual process of finding somebody that's gonna make you happy for the rest of your life.
Starting point is 00:16:40 That's really interesting. They have the data, I wish they would be able to release that data. And I do want to help. It's not a cupid, right? I think they ran a huge study on all of that. Yeah, there are more data driven, I think, okay, keep it folks out. I think there's a lot of opportunity for dating apps and you know, even bigger than dating apps, people connecting on the internet. I just hope they're more data drivendriven and it doesn't seem that way. I think like I've always thought that good reads
Starting point is 00:17:10 should be a dating app. Like the- I've never used it. The good reads is just lists, like books that you've read and allows you to comment on the books you read and what the books you're currently reading. But it's a giant social networks of people reading books. And that seems to be a much better database of interest. Of course, it constrains you to the books you're reading, but that really reveals
Starting point is 00:17:33 so much more about the person. It allows you to discover shared interests because books are kind of windowed into the way you see the world. Also, like the kind of places people you're curious about, the kind of ideas you're curious about, the kind of ideas you're curious about, are you romantic, are you cold calculating rationalists, are you into iron rand, or are you into Bernie Sanders, are you into whatever. Right. And I feel like that reveals so much more than like a person trying to look hot from a certain angle and a tender profile.
Starting point is 00:18:02 Well, and it would also be a really great filter in the first place for people, it's the likes of people who read books and are willing to go and write them and give feedback on them and so on. So that's already a really strong filter of probably the type of people you'd be looking for. Well, at least be able to fake reading books. I mean, the thing about books,
Starting point is 00:18:18 you don't really need to read it. You can just look at the books. Yeah, game the dating app by feigning intellectualism. Can I admit something very horrible about myself go on? The things that you know, I don't know how many things in my closet, but this is one of them I've never actually reached a red Shakespeare. I've only read cliff notes and I got a five in the AP English exam and I
Starting point is 00:18:43 Which books have I read? Well, yeah, which was the exam on which oh no, they they include a lot of them But hamlet I don't even know if you read Romeo and Juliet Macbeth I don't I don't remember but I don't understand it. It's like really cryptic It's really I don't and it's not that pleasant to read. It's like ancient speak. I don't understand it And you know, maybe I was too speak. I don't understand it. Anyway, maybe I was too dumb. I'm still too dumb, but I did. But you go to five, which is, I don't know how the US grading system is. Oh, no. So AP English is a, there's kind of this advanced versions of courses in high school.
Starting point is 00:19:19 And you take a test that is like a broad test for that subject and includes a lot. It wasn't obviously just Shakespeare I think a lot of it was also writing written you have like AP physics AP computer science AP biology AP chemistry and then AP English or AP literature I forget what it was But I think Shakespeare was a part of that But I and you and you game of the point is you game a fight it game a fight. Well, entirety, I was into getting a's, I saw it as a game. I don't think any, I don't think all
Starting point is 00:19:52 the learning I've done has been outside of the outside of school, the deepest learning I've done has been outside of school with a few exceptions, especially in grad school, like deep computer science courses, but that was still outside of school with a few exceptions, especially in grad school, like deep computer science courses. But that was still outside of school because it was outside of getting it, it was outside of getting the A for the course. The best stuff I've ever done is when you read the chapter and you do many of the problems at the end of the chapter, which is usually not what's required for the course, like the
Starting point is 00:20:20 hardest stuff. In fact, textbooks are freaking incredible. If you go back now and you look at like biology textbook or any of the computer science textbooks on algorithms and data structures, those things are incredibly, they have the best summary of a subject, plus they have practice problems of increasing difficulty that allows you to truly master the basic, the fundamental idea behind that. I got to remind you, physics degree, with one textbook that was just this really comprehensive one that they told us at the beginning of the first year, by this, but you're going to have to buy 15 other books for all your supplementary courses.
Starting point is 00:20:58 And I was like, every time I was just checked to see whether this book covered it, and it did. And I think I only bought like two or three extra, and thank God, because they're so super expensive textbooks. It's a whole racket they've got going on. Yeah, they are. They could just, you get the right one, it's just like a manual for, but what's interesting though is,
Starting point is 00:21:17 this is the tyranny of having exams and metrics. The tyranny of exams and metrics, yes. I loved them because I loved, I'm very competitive and I loved, I'm very competitive and I liked. Yes. I liked finding ways to game-ify things and then like sort of dust off my shoulders
Starting point is 00:21:31 after as when I get a good grade or be annoyed at myself when I didn't. But yeah, you're absolutely right in that the actual, you know, how much of that physics knowledge I've retained. Like, I've, I learned how to cram and study and please an examiner. But did that give me the deep lasting knowledge
Starting point is 00:21:48 that I needed? Yes and no. But really, nothing makes you learn a topic better than when you actually didn't have to teach it yourself. I'm trying to wrap my teeth around this game theory molloc stuff right now. And there's no exam at the end of it that I can gamify. There's no way to gamify and sort of like shortcut my way through it.
Starting point is 00:22:10 I have to understand it so deeply from like deep foundational levels to them to build upon it and then try and explain it to other people. And like, you know, you're about to go and do some lectures, right? You can't sort of just like, you probably presumably can't rely on the knowledge that you got through when you were studying for an exam to re-teach that. Yeah, and especially high-level lectures, especially the kind of stuff you do on YouTube, you're not just regurgitating material, you have to think through what is the core idea here.
Starting point is 00:22:43 And when you do the lectures live, especially live especially you have to there's no second takes That is a luxury you get if you're recording a video for YouTube or something like that but It definitely is a luxury shouldn't lean on I've gotten to interact with a few YouTubers that lean on that too much. And you realize, oh, you're, you've gamified this system because you're not really thinking deeply about stuff. You're through the edit, both written and spoken. You're crafting an amazing video, but you yourself as a human being have not really deeply
Starting point is 00:23:21 understood it. So live teaching or least recording video with very few takes is a different beast. And I think it's the most honest way of doing it. Like as few takes as possible. That's what I'm nervous about this. Don't go back to like Alex do that. Don't fuck this up live. The tyranny of exams. I do think, you know, people talk about, you know, high school and college as a time to do drugs and drink and have fun and all this kind of stuff, but, you know, looking back,
Starting point is 00:23:56 of course, I did a lot of those things. No, yes, but it's also a time when you get to, Yes, but it's also a time when you get to like read textbooks or read books or learn with all the time in the world. Like you don't have these responsibilities of like you know laundry and having to sort of pay for mortgage, all that kind of stuff, pay taxes, all this kind of stuff, in most cases, there's just so much time in the day for learning. And you don't realize at the time, because at the time it seems like a chore, like, why the hell does there's so much homework? But you never get a chance to do this kind of learning, this kind of Ever again in life unless later in life you really make a big effort out of it
Starting point is 00:24:49 You get like you basically your knowledge gets solidified. You don't get you don't get to have fun and learn Learning is really is really fulfilling and really fun if you're that kind of person like some people Like to you know like knowledge is not something that they think is fun, but if that's the kind of thing that you think is fun, that's the time to have fun, and do the drugs and drink and all that kind of stuff. But the learning, just going back to those textbooks, the hour spent with the textbooks
Starting point is 00:25:18 is really, really rewarding. Do people even use textbooks anymore? Yeah. Do you think? Is these days with their TikTok and then? Well, I'm not even that, but it's just like so much information, really high quality information. It's now in digital format online.
Starting point is 00:25:34 Yeah, but they're not, they're using that, but college is still very, there's a curriculum. I mean, so much of school is about rigorous study of a subject and still on YouTube that's not there. Right. YouTube has Grand Sanderson talks about this. He's this math teacher. That's what you want, Brown. Yeah, three people want Brown.
Starting point is 00:25:56 He says like, I'm not a math teacher. I just take really cool concepts and I inspire people. But if you want to really learn calculus, if you want to really learn linear algebra, you should do the textbook. You should do that. And there's still the textbook industrial complex that charges like $200 for textbook in somehow. I don't know.
Starting point is 00:26:18 It's ridiculous. There's this. Well, I'm sorry, new edition, edition 14.6. Sorry, you can't use 14.5 anymore. It's like, oh, sorry, new addition, addition 14.6. Sorry, you can't use 14.5 anymore. It's like, what's different? We've got one paragraph different. So we mentioned offline, Daniel Nagarano.
Starting point is 00:26:33 I'm going to get a chance to talk to him on this podcast. And he's somebody that I was a fan fascinating in terms of the way he thinks about poker, verbalizes the way he thinks about poker, the way he plays poker. So and he's still pretty damn good. He's been good for a long time. So you mentioned that people are running these kinds of simulations and the game of poker has changed.
Starting point is 00:26:55 Do you think he's adapting in this way? Do you think like the top pros, do they have to adopt this way, or is there still like over the years you basically developed this gut feeling about? Like you get to be like good the way like Alpha Zero is good. You look at the board and somehow from the fog comes out the right answer. Like this is likely what they have. This is likely the best way to move. And you don't really, you can't really put a finger on exactly why, but it just comes from your gut feeling or no. Yes and no. So gut feeling is definitely very important. You know, the we've got our two-mo, or you can distill it down to two modes of decision making, right? You've got your sort of logical linear voice in your head system to, as it's often called,
Starting point is 00:27:49 and your system on your gut intuition. And historically, in poker, the very best players were playing almost entirely by their gut. You know, you often they do some kind of inspired play and you'd ask them why they do it and they wouldn't really be able to explain it. And that's not so much because their process was unintelligible, but it was more just because no one had the language with which to describe what optimal strategies were, because no one really understood how poker worked. This was before, you know, we had analysis software, you know, no one was writing, you know, if I guess some people would write down their hands in a little notebook, but there was no way to
Starting point is 00:28:28 assimilate all this data and analyze it. But then, you know, when computers became cheaper and software started emerging and then obviously online poker where it would like automatically save your hand histories. Now all of a sudden you can't, I had this, this body of data that you could run analysis on. And so that's when people started to see, you know, these mathematical solutions. And so what that meant is the role of intuition essentially became smaller. And it went more into as we talked before before, about this Game Theory Optimal style. But also, as I said, like Game Theory Optimal is about loss, minimization, and being unexploitable.
Starting point is 00:29:12 But if you're playing against people who aren't, because no one person, no human being, can play perfectly Game Theory Optimal in poker. Not even the best AIs, they're still like, they're 99.99% of the way there or whatever, but it's kind of like the speed of light you can't reach it perfectly. So there's still a role for intuition?
Starting point is 00:29:28 Yes. So you can play an unexploitable style, but when your opponents start doing something suboptimal that you want to exploit, well, now that's where not only will your logical brain need to be thinking, okay, I know I'm in the sort of top end of my range here with this hand, so that means I need to be calling X% of the time,
Starting point is 00:29:51 and I put them on this range, et cetera. But then sometimes you'll have this gut feeling that tells you: you know what, this time, I know mathematically I'm meant to call. I'm in the top end of my range, and these are the odds I'm getting, so the math says I should call — but there's something in your gut saying they've got it this time, they're beating you, your hand is worse. So then the real art — this is the last remaining art in poker, the fuzziness —
Starting point is 00:30:26 is: do you listen to your gut? How do you quantify the strength of it? Can you even quantify the strength of it? And I think that's what Daniel has. I mean, I can't speak for how much he's studying with the simulators and that kind of thing — he must be, to still be keeping up. But he has an incredible intuition. He's seen so many hands of poker in the flesh, he's seen so many people and the way they behave when the money's on the line
Starting point is 00:30:58 and he's staring you down, eye to eye. You know, he's intimidating. He's got this kind of X-factor vibe that he gives out. And he talks a lot, which is an interactive element — he's getting stuff from other people. Yes. Yeah. And just, like, the subtleties.
Starting point is 00:31:15 So he's probing constantly. Yeah, he's probing, and he's getting this extra layer of information that others can't. Now, that said, he's good online as well. Again, would he be beating the top cash game players online? Probably not. No. But when he's in person and he's got that additional layer of information, he can not only extract it, but he knows what to do with it, still so well. There's one player who I would say is the exception to all of this, and he's one of my favorite people to talk about, in terms of — I think he might have cracked the simulation. It's Phil Hellmuth.
Starting point is 00:31:51 In more ways than one, he's cracked the simulation, I think. Yeah. He somehow, to this day, is still — and I love you, Phil, I'm not in any way knocking you — he is still winning so much at the World Series of Poker specifically. He's now on 16 bracelets; the next nearest person, I think, is on 10. And he is consistently, year in, year out, going deep in or winning these huge-field tournaments with, like, 2,000 people, which statistically he should not be doing.
Starting point is 00:32:23 And yet you watch some of the plays he makes, and they make no sense. Mathematically, they are so far from Game Theory Optimal. And the thing is, if you went and stuck him in one of these high-stakes cash games with a bunch of, like, GTO people, he's going to get ripped apart. But there's something he has where, when he's in the halls of the World Series of Poker specifically,
Starting point is 00:32:52 amongst sort of amateurish players, he gets them to do crazy shit like that. But my little theory is that also he's like a wizard, and he gets the cards to do what he needs them to. Because he just expects to win, and he expects to flop a set with a frequency far beyond what the real percentages are. And I don't even know if he knows what the real percentages are — he doesn't need to, because he gets it. I think he has found the cheat code. Because when I've seen him play, he seems to be annoyed that the
Starting point is 00:33:25 long-shot thing didn't happen. Yes. He's, like, annoyed, and it's almost like everybody else is stupid — because he was obviously meant to win, if that silly thing hadn't happened. And it's like, you don't understand: the silly thing happens 99% of the time, and it's 1% the other way around. But genuinely, for his lived experience at the World Series of Poker, it is like that. So I don't blame him for feeling that way. But he does have this X factor, and the poker community has tried for years to tear him down, saying,
Starting point is 00:33:55 like, he's no good. But he's clearly good, because he's still winning, so there's something going on. Whether he's figured out how to mess with the fabric of reality and how a randomly shuffled deck of cards comes out — I don't know what it is, but he's doing it right, still. Who do you think is the greatest of all time? Would you put Hellmuth—
Starting point is 00:34:15 No — Hellmuth, definitely not. He seems like the kind of person who, when mentioned, would actually watch this, so you might want to be careful. No, as I said, I love Phil, and I would say this to his face: he truly is one of the greatest. Yeah. I don't know if he's the greatest. He's certainly the greatest at the World Series of Poker.
Starting point is 00:34:39 And despite the game switching into almost an entirely pure game of math, he has managed to keep the magic alive, just through sheer force of will making the game work for him. And that is incredible, and I think it's something that should be studied, because it's an example. Yeah, there might be some actual game-theoretic wisdom there. There might be something to be said about optimality from studying him. Right. Maybe. What do you mean by optimality? Meaning — or rather, game design, perhaps. Meaning, if what he is doing is working, maybe poker is more complicated than what we're
Starting point is 00:35:16 currently modeling it as. So, like, there's an extra layer — and I don't want to get too weird and woo-woo — but maybe there's an extra layer of ability to manipulate things the way you want them to go that we don't understand yet. Do you think Phil Hellmuth understands these things? Is he just generally— He's written a book on positivity. A book on positivity? He has? Not, like, a trolling book? No, seriously, straight up? Yeah.
Starting point is 00:35:48 Wait — he wrote a book about positivity? Yes. Okay. I haven't read it, but I think it's about sort of manifesting what you want and getting the outcomes that you want by believing so much in yourself and in your ability to win — eyes on the prize. And I mean, it's working. The man's delivered. Where do you put, like, Phil Ivey and all those kinds of people?
Starting point is 00:36:12 I mean, I've been, to be honest, too much out of the scene for the last few years to really say. I mean, Phil Ivey's clearly got — again, he's got that X factor. He's so incredibly intimidating to play against. I've only played against him a couple of times, but when he looks you in the eye and you're trying to run a bluff on him... oh, no one's made me sweat harder than Phil Ivey. My bluff got through, actually.
Starting point is 00:36:35 That was actually one of the most thrilling moments I've ever had in poker. It was in Monte Carlo, in a high roller. I can't remember exactly what the hand was, but I three-bet and then just barreled all the way through, and he just put his laser eyes into me, and I felt like he was scouring my soul. And I was just like, hold it together, Liv, hold it together
Starting point is 00:36:55 And your hand was weaker? Yeah, I mean, I was bluffing — I presume. You know, there's a chance I was bluffing with the best hand, but I'm pretty sure my hand was worse. And he folded. It was truly one of the deep highlights of my career. Did you show the cards after? What are you— You should never show in game.
Starting point is 00:37:17 Because — especially as I felt like I was one of the worst players at the table in that tournament — giving out that information, unless I had a really solid plan that I was now advertising, like, look, I'm capable of bluffing Phil Ivey — but why? It's much more valuable to take advantage of the impression they have of me, which is: I'm a scared girl playing a high roller for the first time. Keep that going.
Starting point is 00:37:50 Interesting. But aren't there layers to this — like, psychological warfare? The scared girl might be way smart and then, like, flip the tables. Do you think about that kind of stuff, or is it just: do not reveal information? I mean, generally speaking, you want to not reveal information. The goal of poker is to be as deceptive as possible about your own strategies while eliciting as much as possible out of your opponents about theirs. So giving them free information — particularly people who you consider very good players — any information I give them is going into their little database and being calculated and used well. So I have to be really confident that the meta-gaming
Starting point is 00:38:20 that I'm then going to do — they've seen this, so therefore that — I'm going to be on the right level. So it's better just to keep that little secret to myself in the moment. So how much is bluffing part of the game? A huge amount. So, yeah — maybe actually, let me ask: what did it feel like, with Phil Ivey or anyone else, when it's high stakes, when it's a big bluff? So a lot of money on the table. And maybe — I mean, what defines a big bluff?
Starting point is 00:38:49 Maybe a lot of money on the table, but also some uncertainty in your mind and heart — like, self-doubt: well, maybe I miscalculated what's going on here, what the best move is, all that kind of stuff. What does that feel like? I mean, I imagine it's comparable to running any kind of big bluff where you have a lot of something that you care about on the line.
Starting point is 00:39:19 So if you're bluffing in a courtroom — not that anyone should ever do that — or something equatable to that. In that scenario... I think it was the first time I'd ever played a 25k tournament; I'd won my way into it. So that was the buy-in: 25,000 euros. And I had satellited my way in, because it was much bigger than anything I would ever normally play, and I wasn't that experienced at the time.
Starting point is 00:39:43 And now I was sitting there against all the big boys — the Negreanus, the Phil Iveys, and so on. And each time you put another bet out... I was on what's called a semi-bluff. So there were some cards that could come that would make my hand very, very strong and therefore win, but most of the time those cards don't come. So it's a semi-bluff because — are you representing that you already have something? So, in this scenario, I had a flush draw. I had two clubs, and two clubs came out
Starting point is 00:40:20 on the flop, and then I'm hoping that on the turn or the river another one will come. So I have some future equity: I could hit a club, and then I'll have the best hand, in which case, great — I can keep betting and I want them to call. But I've also got the other way of winning the hand: if my card doesn't come, I can keep betting and get them to fold. And I'm pretty sure that's what the scenario was. So I had some future equity, but still,
Starting point is 00:40:45 most of the time I don't hit that club, so I'd rather he just fold, because the pot is now getting bigger and bigger. And in the end I jam all-in on the river. That's my entire tournament on the line. As far as I'm aware, this might be the one time I ever get to play a big 25k — this was the first time I'd played one.
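The semi-bluff math Liv is describing — future equity from the flush draw plus fold equity from the bet — can be sketched with a small calculation. The pot, bet, and fold-probability numbers below are editor's illustrations, not the actual hand:

```python
# Sketch of semi-bluff EV: a flush draw after the flop has 9 outs
# among the 47 unseen cards. You can win by hitting the draw OR by
# making your opponent fold. Toy model: ignores later betting rounds.

def flush_draw_equity():
    """P(hitting at least one of 9 outs on the turn or river)."""
    miss_turn = 38 / 47    # 47 unseen cards, 38 are not clubs
    miss_river = 37 / 46   # 46 unseen cards left, 37 are not clubs
    return 1 - miss_turn * miss_river

def semi_bluff_ev(pot, bet, fold_prob, equity):
    """EV of betting: opponent folds (win the pot now) or calls
    (showdown decided by the draw's equity)."""
    win_if_called = equity * (pot + bet) - (1 - equity) * bet
    return fold_prob * pot + (1 - fold_prob) * win_if_called

equity = flush_draw_equity()   # roughly 0.35
ev = semi_bluff_ev(pot=100, bet=75, fold_prob=0.4, equity=equity)
print(f"draw equity: {equity:.1%}, bet EV: {ev:.1f}")
```

The point of the sketch is the combination: with about 35% equity when called plus even modest fold equity, the aggressive line can show a profit despite usually missing the club.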
Starting point is 00:41:03 So it felt like the most momentous thing. And this is also when I was trying to build myself up, build a name for myself in poker. I wanted to get respect. So this could destroy everything for you. It felt like it in the moment. I mean, it literally does feel like a form of life and death — your body physiologically is having that fight-or-flight response. What are you doing with your body? What are you doing with your face? What are you thinking about?
Starting point is 00:41:27 More like a mixture of: okay, what are the cards? So in theory I'm thinking, which cards make my hand look stronger? Which cards hit my perceived range from his perspective, and which don't? What's the right bet size to maximize my fold equity in this situation? That's the logical stuff I should be thinking about. But I think in reality, because I was so scared — at least for me, there's a certain threshold of nervousness or stress beyond which the logical brain shuts off.
Starting point is 00:41:58 Now it just feels like a game of wits, basically — a game of nerve. Can you hold your resolve? And it certainly got to that by the river. I think by that point I was like, I don't even know if this is a good bluff anymore, but fuck it, let's do it.
Starting point is 00:42:14 Your mind is almost numb from the intensity of that feeling. I call it the white noise. And it happens in all kinds of decision making — anything that's really, really stressful. I can imagine someone in an important job interview, if it's a job they've always wanted, and they get grilled, Bridgewater-style, where they ask these really hard mathematical questions.
Starting point is 00:42:36 It's a really learned skill to be able to subdue your fight-or-flight response — to get from the sympathetic into the parasympathetic so you can actually engage that voice in your head and do those slow, logical calculations. Because evolutionarily, if we saw a lion running at us, we didn't have time to calculate the lion's kinetic energy and whether it's optimal to go this way or that way. You just reacted.
Starting point is 00:43:01 And physically, our bodies are well attuned to make the right decisions in those situations. But when you're playing a game like poker, this is not something you ever evolved to do, and yet you're in that same fight-or-flight response. So that's a really important skill to develop: to basically learn how to meditate in the moment and calm yourself so that you can think clearly.
Starting point is 00:43:22 But as you were searching for a comparable thing — it's interesting, because you just made me realize that bluffing is an incredibly high-stakes form of lying. You're lying. And I don't think you can— Telling a story. It's straight-up lying.
Starting point is 00:43:40 In the context of a game, it's not a negative kind of lying. But it is. Yeah, exactly — you're representing something that you don't have. And I was thinking: how often in life do we have such high-stakes lying? Certainly in high-level military strategy. I was thinking of when Hitler was lying to Stalin about his plans to invade the Soviet Union. So you're talking to a person like you're friends, and you're fighting against the enemy, whatever the formulation of that enemy is — but meanwhile, the whole time,
Starting point is 00:44:20 you're building up troops on the border. That's extreme. Wait, so Hitler and Stalin were, like, pretending to be friends? My history is terrible. That's crazy. Yeah, they were. And it worked, because Stalin — up until the troops crossed the border and invaded in Operation Barbarossa, when this storm of Nazi troops invaded large parts of the Soviet Union and one of the biggest wars in human history began — Stalin was sure this was never going to happen, that Hitler was not crazy enough to invade the Soviet Union. And geopolitically it made total sense to be collaborators.
Starting point is 00:45:08 And ideologically, even though there was a tension between communism and fascism — or national socialism, however you formulate it — it still felt like this was the right way to battle the West. Right. They were more ideologically aligned; in theory they had a common enemy, which was the West. So it made total sense. And in terms of negotiations and the way things were communicated, it seemed to Stalin certain that they would remain, at least for a while, peaceful collaborators. And because of that, people in the Soviet Union believed it, and it was a huge shock when Kiev was invaded. And you hear echoes of that.
Starting point is 00:45:52 When I traveled to Ukraine — the shock of the invasion. It's not just the invasion on one particular border, but the invasion of the capital city, and just, like, holy shit. Especially at that time, when you thought World War I was the war to end all wars — you would never have this kind of war again. And holy shit, this person is mad enough to try to take on this monstrous Soviet Union. So it's no longer going to be a war of hundreds of thousands dead; it'll be a war of tens of millions dead. And yeah — that's a very large-scale kind of lie. But I'm sure in politics and geopolitics that kind of lying is happening all the time.
Starting point is 00:46:38 And a lot of people pay financially, and with their lives, for that kind of lying. But in our personal lives, I don't know how often we— I think people do. I mean, think of spouses cheating on their partners, right? And then having to lie: "Where were you last night?" Stuff like that. Oh shit, that's tough, yeah. I mean, unfortunately that stuff happens all the time, right?
Starting point is 00:47:00 Yeah. Or having, like, multiple families — that one is great, when each family doesn't know about the other one, and maintaining that life. There's probably a sense of excitement about that too. It seems unnecessary. Yeah. But why? Well, just lying — like, the truth finds a way of coming out, you know? Yes, but hence the thrill. Yeah, perhaps.
Starting point is 00:47:26 You know, that's why I think, actually, what's interesting about poker is that most of the best players I know — there are always exceptions, there are always bad eggs — but actually, poker players are very honest people. I would say they are more honest than the average, if you just took a random population sample. Because, I think, most humans like to have some kind of opportunity to do something a little edgy. So we get to scratch that itch of being edgy at the poker table, where it's part of the game.
Starting point is 00:48:05 Everyone knows what they're in for, and that's allowed, and you get to really get that out of your system. And then also poker players learn that — you know, I would play in a huge game against some of my friends, even my partner, Igor, where we will be absolutely going at each other's throats, trying to draw blood in terms of winning money off each other, getting under each other's skin, winding each other up, doing the craftiest moves we can. But then once the game's done, the winners and the losers will go off and get a drink together and have a fun time, and talk about it in this weird
Starting point is 00:48:40 academic way afterwards, because that's why games are so great: you get to live out this competitive urge that most people have. What does it feel like to lose? We talked about bluffing when it worked out. What about when you go broke? So, like, in a game? I've never gone broke — I mean, like, for life. No, I know plenty of people who have. And I don't think he would mind me saying — he went broke once in poker, you know, early on when we were together.
Starting point is 00:49:16 I feel like you haven't lived unless you've gone broke. Yeah, in some sense, right? I mean, I'm happy I sort of lived through it vicariously through him when he did it at the time. But yeah, what is it like to lose? Well, it depends. It depends on the amount; it depends what percentage of your net worth you've just lost. It depends on your brain chemistry.
Starting point is 00:49:36 It really varies from person to person. Well, you have a very cold, calculating way of thinking about this: "it depends what percentage..." It really does, right? Yeah, it's very true. But that's another thing poker trains you to do: you see everything in percentages, or in ROI, or expected hourly, or cost-benefit, et cetera.
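The "everything in percentages" framing — ROI, expected hourly — can be sketched with two one-line formulas. The figures below are an editor's illustration, not Liv's actual results:

```python
# Sketch of the percentage thinking described here: tournament ROI
# (profit as a fraction of buy-ins) and expected hourly rate.
# All numbers are illustrative assumptions.

def roi(total_cashes, total_buyins):
    """Return on investment, as a fraction of money put in."""
    return (total_cashes - total_buyins) / total_buyins

def expected_hourly(avg_buyin, roi_frac, hours_per_tournament):
    """Expected profit per hour of tournament play."""
    return avg_buyin * roi_frac / hours_per_tournament

r = roi(total_cashes=130_000, total_buyins=100_000)        # 0.30 -> 30% ROI
print(f"ROI: {r:.0%}")
print(f"expected hourly: ${expected_hourly(1_000, r, 8):.2f}")
```

Framing results this way is exactly the calibration Liv goes on to describe: a loss is a percentage of a long-run expectation, not a one-off catastrophe.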
Starting point is 00:49:57 So one of the things I've tried to do is calibrate the strength of my emotional response to the win or loss that I've received. Because it's no good if you have a huge, dramatic emotional response to a tiny loss — or, on the flip side, you have a huge win and you're dead inside and don't even feel it. Well, that's a shame. I want my emotions to calibrate with reality as much as possible. So yeah, what's it like to lose? I mean, I've had times where I've lost — busted out of a tournament that I thought I was going to win, especially if I got really unlucky or made a dumb play — where I've gone away and kicked the wall,
Starting point is 00:50:42 punched the wall — I nearly broke my hand one time. But I'm a lot less competitive than I used to be. I was pathologically competitive in my late teens and early twenties; I just had to win everything. And I think that's slowly waned as I've gotten older. According to you. According to me. I don't know if others would say the same, right?
Starting point is 00:51:03 I feel like ultra-competitive people... I've heard Joe Rogan say this to me — I think he's a lot less competitive than he used to be. Oh, I believe it. No, I totally believe it. Because you can still be like, I care about winning when I play a game with my buddies online, or whatever it is — Polytopia is my current obsession.
Starting point is 00:51:20 Thank you for passing on your obsession to me. Are you playing now? Yeah, I'm playing now. We've got to have a game. But I'm terrible, and I enjoy playing terribly. I don't want to have a game, because that's going to pull me into your monster of competitive play.
Starting point is 00:51:39 It's an important skill. I'm enjoying playing on the— I can't, I— You just do the points thing, you know, against the bots? Yeah, against the bots. And I can't even do the— there's, like, a hard one, and there's crazy. Yeah, it's crazy. I don't even enjoy the hard one. The crazy ones I really don't enjoy, because it's intense. You have to constantly try to win, as opposed to enjoying building a little world. Yeah, there's no time for exploration in Polytopia. You've got to go.
Starting point is 00:52:05 Well, once you graduate from the crazies, then you can come play the— Graduate from the crazies? Yeah, in order to be able to play a decent game against, like, our group, you'll need to be consistently winning, like, 90% of games against 15 crazy bots. Yeah. And you'll be able to—
Starting point is 00:52:25 there'll be— I could teach you it within a day, honestly. How to beat the crazies? How to beat the crazies. And then you'll be ready for the big leagues. Generalizes to more than just Polytopia, but okay. Why were we talking about Polytopia? Losing hurts. Losing hurts. Oh, yeah. Yes — competitiveness over time. Oh, yeah.
Starting point is 00:52:41 I think it's more that — at least for me — I still care about winning when I choose to play something. It's just that I don't see the world as zero-sum as I used to. I think as one gets older and wiser, you start to see the world more as positive-sum, or at least you're more aware of the externalities of competitive interactions. And so, yeah, I just—
Starting point is 00:53:11 I'm more aware of my own reactions. If I have a really strong emotional response to losing, and that makes me feel shitty for the rest of the day, and then I beat myself up mentally for it, I'm now more aware that that's an unnecessary negative externality.
Starting point is 00:53:26 So I'm like, okay, I need to find a way to dial this down a bit. Was poker the thing that — if you think back at your life, at some of the lower points, the darker places you've gone in your mind — did it have something to do with poker? Did losing spark the descent into darkness,
Starting point is 00:53:48 or was it something else? Um, I think my darkest points in poker were when I was wanting to quit and move on to other things, but I felt like I hadn't ticked all the boxes I wanted to tick. I wanted to be the winningest female player — which is by itself a bad goal.
Starting point is 00:54:11 That was one of my initial goals. And I was like, well, I haven't — I wanted to win a WPT event. I've won one of these and one of these, but I haven't won one of those. And that, again, is a drive of over-optimization to random metrics that I decided were important
Starting point is 00:54:27 without much wisdom at the time, but then carried on. That made me continue chasing it longer than I actually still had the passion to chase it for. And I don't have any regrets that I played for as long as I did, because I wouldn't be sitting here; I wouldn't be living this incredible life that I'm living now. This is the height of your life, right now. This is it. Peak experience. Absolute pinnacle, here in your robot land. It's a crazy life. No, it is — I mean, I wouldn't change a thing about my life right now, and I feel very blessed to say that.
Starting point is 00:55:10 But the dark times were in the sort of 2016-to-'18 era, where I'd stopped loving the game and I was going through the motions, and I would take the losses harder than I needed to — because I'm like, ugh, it's another one. And I was aware that I felt like my life was ticking away, and I was like, is this going to be what's on my tombstone? "Oh yeah, she played this zero-sum game of poker slightly more optimally than her next opponent."
Starting point is 00:55:38 Cool. Great legacy, you know? So there was something in me that knew I needed to be doing something more directly impactful and meaningful. It was a search for meaning, and I think it's a thing a lot of poker players — even, I imagine, any games players who love intellectual pursuits — go through. I think you should ask Magnus Carlsen this question.
Starting point is 00:56:01 Yeah, walking away from chess, right? Yeah — it must be so hard for him. He's been at the top for so long, and it's like, well, now what? He's got this incredible brain — what to put it to? And yeah, it's—
Starting point is 00:56:17 It's this weird moment. I've just spoken with people who won multiple gold medals at the Olympics, and the depression hits hard after you win. It's kind of a goodbye — saying goodbye to that person, to all the dreams you had that you thought would give meaning to your life. But in fact, life is full of constant pursuits of meaning. You don't just arrive and figure it all out, and then there's endless bliss. It continues going on and on.
Starting point is 00:56:48 You constantly have to figure it out, to rediscover yourself. And so for you, with that struggle to say goodbye to poker, you had to find the next— There's always a bigger game. That's the thing. That's my motto: what's the next game? And more importantly,
Starting point is 00:57:06 because obviously "game" usually implies zero-sum — what's the game which is, like, omni-win? What would it be? Omni-win? Omni-win is so, so important. Because if everyone plays zero-sum games, that's a fast track to either completely stagnating as a civilization or, actually far more likely, extinguishing ourselves. The playing field is finite. Nuclear powers are playing a game of poker with their chips of nuclear weapons, right? And the stakes have gotten so large that if anyone makes a single bet — you know, fires some weapons — the playing field breaks. I made a video on this.
Starting point is 00:57:48 The playing field is finite, and if we keep playing these adversarial zero-sum games — thinking that in order for us to win, someone else has to lose, or that if we lose, someone else wins — that will extinguish us. It's just a matter of when. What do you think about mutually assured destruction — that very simple, almost caricature-like idea in game theory — that does seem to be at the core of why we haven't blown each other up yet with nuclear weapons?
Starting point is 00:58:20 Do you think there's some truth to that, this kind of stabilizing force of mutually assured destruction? And do you think it's going to hold up through the 21st century? I mean, it has held. Yes, there's definitely truth to it — it's a Nash equilibrium. Yeah. Are you surprised it has held this long? Isn't it crazy?
Starting point is 00:58:43 It is crazy when you factor in all the near-miss accidental firings. Yes. That makes me wonder — are you familiar with the quantum suicide thought experiment? It's basically a Russian-roulette-type scenario hooked up to some kind of quantum event — a particle splitting or not splitting, say. And if it goes A, then the gun doesn't go off, and if it goes B, then it does go off and it kills you. Because you can only ever be in the universe—
Starting point is 00:59:19 assuming, like, the Everett branching, many-worlds theory — you'll always only end up in the branch where option A keeps coming up. But you run that experiment enough times and the tree gets huge; there are a million different scenarios in it, but you'll always find yourself in the one where the gun didn't go off. And so from that perspective, you are essentially immortal, because you will only find yourself
Starting point is 00:59:47 in the set of observers that make it down that path. So it's kind of a... But that doesn't mean you're still not gonna be fucked at some point in your life. No, of course not. I'm not advocating that we're all immortal because of this. It's just a fun thought experiment.
Starting point is 01:00:03 And the point is, it raises this thing of these things called observer selection effects, which Nick Bostrom talks about a lot, and which I think people should go read about. It's really powerful, but I think that logic could be overextended. I'm not sure exactly how. I just feel like you can,
Starting point is 01:00:20 you know, overgeneralize that logic somehow. Well, no, I mean, it leads you into solipsism, which is a very dangerous mindset. Again, if everyone falls into solipsism of, like, well, I'll be fine, that's a great way of creating a very, you know, self-terminating environment. But my point is that with the nuclear weapons thing,
Starting point is 01:00:39 there have been at least, I think it's 12 or 11, near misses of just stupid things. There was a moonrise over Norway, and it made weird reflections off some glaciers in the mountains, which set off, I think, the alarms of NORAD radar. And that put them on high alert, nearly ready to shoot. And it was only because the head of the Russian military happened to be at the UN in New York at the time that they go, well, wait a second, why would they fire now when their guy is there? And it was only that lucky happenstance, which doesn't
Starting point is 01:01:14 happen very often, where they didn't then escalate it into firing. And there's a bunch of these different ones. Stanislav Petrov, like, the person who should be the most famous person on Earth, because he's probably, on expectation, saved the most human lives of anyone, like billions of people, by ignoring Russian orders to fire, because he felt in his gut that actually this was a false alarm. And it turned out to be. You know, a very hard thing to do. And there are so many of those scenarios
Starting point is 01:01:37 that I can't help but wonder, at this point, whether we have this kind of selection effect thing going on. Because you look back and you're like, geez, there were a lot of near misses. But of course, we don't know the actual probabilities that each one would have ended up in nuclear war. Maybe they were not that likely.
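The observer selection effect described here can be sketched with a toy Monte Carlo simulation. This is illustrative only: the 50/50 event probability and the number of rounds are made-up parameters, not anything from the conversation.

```python
import random

def survives(rounds: int, p_fire: float = 0.5, rng=random) -> bool:
    """One 'branch' of the thought experiment: the observer survives
    only if the dangerous event fails to fire on every round."""
    return all(rng.random() > p_fire for _ in range(rounds))

def survivor_fraction(trials: int, rounds: int) -> float:
    """Fraction of branches whose observer is still alive at the end."""
    return sum(survives(rounds) for _ in range(trials)) / trials

random.seed(0)
# With a 50/50 event repeated 10 times, only ~1 in 2^10 = 1024 branches
# survives, yet every surviving observer has seen a perfect 'safe' streak.
# Looking back from inside a surviving branch, a long run of near misses
# that all resolved safely is exactly what you'd expect to see.
print(survivor_fraction(200_000, 10))
```

The point of the sketch is the conditioning: unconditionally, survival is rare, but conditioned on there being an observer at all, the observed history is always a streak of safe outcomes.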
Starting point is 01:01:51 But still, the point is, it's a very dark, stupid game that we're playing. And it is an absolute moral imperative, if you ask me, to get as many people as possible thinking about ways to make this very precarious situation more stable. Because we're in a Nash equilibrium, but it's not a stable one. If you were to map it topographically, it's not like a ball sitting stably at the bottom of a pit. We're on the top of a hill with a ball balanced on top, and any little nudge could send it flying down, and nuclear war pops off, and hellfire and bad times.
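The claim that mutual restraint is a Nash equilibrium can be made concrete with a toy two-player payoff matrix. The payoff numbers below are invented purely for illustration; a strategy profile is a Nash equilibrium when no player can strictly gain by deviating alone.

```python
from itertools import product

# Toy deterrence game. Strategies: 0 = hold, 1 = fire.
# Payoffs are made up; the structure is what matters:
# mutual restraint is tolerable, any launch is catastrophic for both.
PAYOFFS = {
    (0, 0): (0, 0),       # both hold: status quo
    (0, 1): (-95, -90),   # player 1 fires first, player 0 is hit and retaliates
    (1, 0): (-90, -95),   # player 0 fires first, player 1 is hit and retaliates
    (1, 1): (-100, -100), # mutual launch
}

def is_nash(profile: tuple) -> bool:
    """True if no player can strictly improve by unilaterally deviating."""
    for player in (0, 1):
        for alt in (0, 1):
            deviant = list(profile)
            deviant[player] = alt
            if PAYOFFS[tuple(deviant)][player] > PAYOFFS[profile][player]:
                return False
    return True

equilibria = [p for p in product((0, 1), repeat=2) if is_nash(p)]
print(equilibria)  # [(0, 0)]: mutual restraint is the only pure equilibrium
```

Note the limit of the sketch: a static payoff matrix shows why nobody deviates deliberately, but it cannot capture the "ball balanced on a hill" point, that the real-world equilibrium is fragile to noise like accidental firings and false alarms.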
Starting point is 01:02:25 On the positive side, life on Earth would probably still continue. And another intelligent civilization might still pop up. Maybe. Well, it would depend. Pick your x-risk; it depends on the x-risk.
Starting point is 01:02:35 Nuclear war, sure, that's one of the perhaps less bad ones. Green goo through synthetic biology, very bad. That would, you know, destroy all organic matter; it's basically like a biological paperclip maximizer. Also bad. Or an AI-type mass extinction thing would also be bad. Shhh, they're listening. There's a robot right behind you. Okay, wait, so let me ask you about this from a game theory perspective.
Starting point is 01:03:05 Do you think we're living in a simulation? Do you think we're living inside a video game created by somebody else? Well, so what was the second part of the question? Do I think we're living in a simulation, and a simulation that is observed by somebody for the purpose of entertainment? So, like a video game? Because it's like a Phil Hellmuth type of situation, right? There's a creepy level of, like, this is kind of fun and interesting. Like, there's a lot
Starting point is 01:03:39 of interesting stuff going on. Maybe that could be somehow integrated into the evolutionary process, where the way we perceive and... Are you asking me if I believe in God? Sounds like it. Kind of, but God seems to be not optimizing, in the different formulations of God that we conceive of. He doesn't seem to be, or she,
Starting point is 01:04:04 optimizing for, like, personal entertainment. Maybe the older gods did. But, you know, basically a teenager in their mom's basement created a fun universe to observe, so kind of crazy shit might happen. Okay, so to try and answer this: do I think there is some kind of extraneous intelligence to, like, our classic measurable universe that we can measure with, you know, all current physics and instruments? I think so, yes. Partly because I've had just small little bits of evidence in my own life which have made me question. Like, so, I was a diehard atheist even five years ago. You know, I got into the rationality community, big fan of LessWrong, which continues to be an incredible resource. But I've just started to have too many little snippets of experience which don't make sense with the current, sort of, purely materialistic explanation of
Starting point is 01:05:31 how reality works. Isn't that just a humbling, practical realization that we don't know how reality works? Isn't that just a reminder to yourself? Yeah, no, it's a reminder of epistemic humility, because I fell too hard, same as, I think, many people who are just like, my religion is the way, this is the correct way, this is the law, you are immoral if you don't follow this, blah, blah. I think they are lacking epistemic humility; there's a little too much hubris there.
Starting point is 01:06:02 But similarly, I think the sort of Richard Dawkins brand of atheism is too rigid as well. And there's a way to try and navigate these questions which still honors the scientific method, which I still think is our best method of inquiry. So, an example. I have two kind of notable examples that really rattled my cage. The first one was actually in 2010, early on in my poker
Starting point is 01:06:37 career. And I remember the Icelandic volcano that erupted, that shut down all Atlantic airspace. It meant I got stuck down in the south of France. I was there for something else, and I couldn't get home. Someone said, well, there's a big poker tournament happening in Italy. Maybe you want to go.
Starting point is 01:06:57 I was like, all right, sure. Let's go. Took a train across, found a way to get there. The buy-in was 5,000 euros, which was much bigger than my bankroll would normally allow. And so I played a feeder tournament and won my way in, kind of like I did with the Monte Carlo big one. So I won my way, you know, from 500 euros into the 5,000-euro seat to play this thing.
Starting point is 01:07:19 And on day one of the big tournament, which turned out to be the biggest tournament ever held in Europe at the time, it got over like 1,200 people, absolutely huge. I remember they dimmed the lights before the normal shuffle-up-and-deal that tells everyone to start playing. They played Chemical Brothers, Hey Boy Hey Girl,
Starting point is 01:07:40 which I don't know why it's notable, but it was a song I always liked. It was one of these pump-me-up songs. And I was sitting there thinking, yeah, it's exciting, I'm playing this really big tournament. And out of nowhere, just suddenly, there's a voice in my head. It sounded like my own, sort of, you know, when you think in your mind, you hear a voice, kind of, right? I do.
Starting point is 01:08:01 And so it sounded like my own voice, and it said, you are going to win this tournament. And it was so powerful that I got this wave of, like, goosebumps down my body. And I even remember looking around, being like, did anyone else hear that? And obviously people were on their phones, like, no, no one else heard it. And I was like, okay. Six days later, I win the fucking tournament out of 1,200 people. And I don't know how to explain it. Okay, yes, but maybe I have that feeling before every time I play, and it's just that, because I won the tournament, I retroactively remembered it. Or the feeling gave you a kind of,
Starting point is 01:08:47 like, Phil Hellmuth thing. Well, exactly. Like, it gave you a confidence, a deep confidence. And it did, it definitely did. Like, I remember then feeling this, like, sort of... well, although I remember on day one, I then went and lost half my stack quite early on, and I remember thinking, like,
Starting point is 01:09:01 oh, that was bullshit, you know, what kind of premonition is this? Yes. Thinking, oh, I'm out. But, you know, I managed to keep it together and recover, and then just went pretty perfectly from then on. And either way, it definitely instilled me with this confidence. And I can't put an explanation on it. Like, you know, was it some huge external supernatural thing driving me, or was it just my own self-confidence that then made me make the right decisions? I don't know, and I'm not going to put a frame on it. I think I know a good explanation. So we're
Starting point is 01:09:37 a bunch of NPCs living in this world created in the simulation. And then creatures from outside of the simulation sort of tune in and play your character, and that feeling you got is somebody, just like, they got to play a poker tournament through you. Honestly, it did actually feel a little bit like that. But it's been 12 years now; I've retold the story many times. I don't even know how much I can trust my memory. You're just an NPC retelling the same story, because they just played the tournament and left. Yeah, they're like, oh, that was fun, cool. Next time. And now, for the rest of your life, you're left as a boring NPC retelling this greatness.
Starting point is 01:10:16 And what was interesting was that after that, I didn't win a major tournament for quite a long time, and that was actually another dark period. Because I had these incredible highs from winning that; just on a material level, they were insane, winning the money. I was on the front page of newspapers, because it was this girl that came out of nowhere and won this big thing. And so, again, chasing that feeling was difficult.
Starting point is 01:10:41 But then on top of that, there was this feeling of almost being touched by something bigger. Did you have a sense that, I might be somebody special? Like, I think that's the confidence thing, that maybe you could do something special in this world after all, kind of feeling? Definitely. I mean, this is a thing I think everybody wrestles with to an extent, right?
Starting point is 01:11:12 We are truly the protagonists of our own lives. And so it's a natural human bias to feel special. And I think, in some ways, we are special. Every single person is special, because the world literally does revolve around you, in some respect. But of course, if you then zoom out and take the amalgam of everyone's experiences, then no, it doesn't. So there is this objective reality that is shared, but then there's also this subjective reality, which is truly unique to you.
Starting point is 01:11:45 And I think both of those things coexist, and it's not like one is correct and one isn't. And again, anyone who's like, oh no, your lived experience is everything, versus, your lived experience is nothing: no, it's a blend between these two things; they can exist concurrently. But there's a certain kind of sense, that at least I've had my whole life, and I think a lot of people have this: well, I'm just this little person. Surely I can't be one of those people that do the big thing.
Starting point is 01:12:11 There are all these big people doing big things. There are big actors and actresses, big musicians. There are big business owners and all that kind of stuff, scientists and so on. I have my own subjective experience that I enjoy and so on, but it's like there's a different layer; surely I can't do those great things. I mean, one of the things, just having interacted with a lot of great people, I realized, no, they're just the same humans as me.
Starting point is 01:12:45 And that realization, I think, is really empowering. And to remind yourself... For all their flaws? Well, in terms of, yeah. They're like a bag of insecurities and peculiar, sort of, their own little weirdnesses and so on. I should also say, they have the capacity for brilliance, but they're not generically brilliant. Like, you know, we tend to say this person or that person is brilliant. But really, no, they're just sitting there and thinking
Starting point is 01:13:26 through stuff, just like the rest of us. I think they're in the habit of thinking through stuff seriously. They've built up a habit of not allowing their mind to get trapped in a bunch of bullshit and minutiae of day-to-day life. They really think big ideas. It's like allowing yourself the freedom to think big, to realize that you can be the one that actually solves this particular big problem. First, identify a big problem that you care about, and then, like, I can actually be the one that solves this problem. And allowing yourself to believe that. And I think sometimes you do need to have, like, that shock go through your body, and a voice tells you,
Starting point is 01:14:06 you're gonna win this tournament. Well, exactly. And whether it was... it's this idea of useful fictions. So again, like going through the classic rationalist training of LessWrong, where it's like, you want your map, you know, the image you have of the world in your head, to match up as accurately as possible with how the world actually is. You want the map
Starting point is 01:14:29 and the territory to perfectly align; you want it to be as accurate a representation as possible. I don't know if I fully subscribe to that anymore, having now had these moments of feeling something either bigger, or just actually being overconfident. There is value in overconfidence sometimes. If you take Magnus Carlsen: I'm sure from a young age he knew he was very talented, but I wouldn't be surprised if he also had something in him to... well, actually, maybe he's a bad example, because he truly is the world's greatest. But someone who, exactly, whether they were going
Starting point is 01:15:09 to be the world's greatest or not, but ended up doing extremely well because they had this innate, deep self-confidence, this, like, even overblown idea of how good their relative skill level is. That gave them the confidence to then pursue this thing with the kind of focus and dedication that it requires to excel in whatever it is you're trying to do, you know? And so there are these useful fictions, and that's where I think I diverge slightly from the classic sort of rationalist community. Because that's a field that is worth studying: how the stories we tell to ourselves, even if they are actually false, and even if we suspect they might be false, how it's better to have that little bit of faith. There's value in faith, I think,
Starting point is 01:15:58 actually. And that's partly another thing that's now led me to explore, you know, the concept of God, whether you wanna call it a simulator, or the classic theological thing. I think we're all, like, alluding to the same thing. And I don't know, I'm not saying... because obviously the Christian God is, you know, all-benevolent, endless love.
Starting point is 01:16:18 The simulation, at least one version of the simulation hypothesis, is, as you said, like a teenager in their bedroom who doesn't really care, doesn't give a shit about the individuals in there, just wants to see how the thing plays out, because it's curious, and it could turn it off like that. You know, where on the sort of psychopathy-to-benevolence spectrum God is, I don't know. But just having a little bit of faith that there is something else out there that might be interested in our outcome is, I think, an essential thing, actually, for people to find. A, because it creates commonality between us.
Starting point is 01:16:54 It's something we can all share, and it is uniquely humbling for all of us. To an extent, it's like a common objective. But B, it gives people that little bit of reserve when things get really dark. And I do think things are gonna get pretty dark over the next few years. But it gives you that, to think that there's something out there
Starting point is 01:17:15 that actually wants our game to keep going. I keep calling it the game, you know. It's a thing C and I call it, the game. You and C, C as in, AKA, Grimes? We call everything, the whole thing, the game. Yeah, we talk about, like... So everything's a game? Not everything. But the universe, like, what if it's a game,
Starting point is 01:17:37 and the goal of the game is to figure out, well, either how to beat it, or how to get out of it. You know, maybe this universe is an escape room, like a giant escape room. And the goal is to put all the pieces of the puzzle together, figure out how it works, in order to unlock this hyper-dimensional key and get out beyond what it is. But then, so, you're saying it's like different levels, and it's like a cage within a cage within a cage, and you escape one cage at a time? You figure out how to escape that.
Starting point is 01:18:01 Like a new level up. You know, like, us becoming multiplanetary would be a level up. Or us, you know, figuring out how to upload our consciousnesses, that would probably be a leveling up. Or, spiritually, humanity becoming more combined and less adversarial and bloodthirsty, and us becoming a little bit more enlightened,
Starting point is 01:18:23 that would be a leveling up. You know, there are many different frames to it, whether it's physical, digital, or metaphysical. I think level one for Earth is probably the biological evolutionary process, going from single-cell organisms to early humans. Then maybe level two is whatever's happening inside our minds, creating ideas and creating technologies. That's like an evolutionary process of ideas. Mm-hmm. And then, multiplanetary is interesting.
Starting point is 01:18:59 Is that fundamentally different from what we're doing here on Earth? Probably, because it allows us to, like, exponentially scale. And it delays the Malthusian trap, right? It's a way to keep the playing field getting larger, so that we can accommodate more of our stuff, more of us. And that's a good thing, but I don't know if it fully solves this issue of,
Starting point is 01:19:30 well, this thing called Moloch, which we haven't talked about yet, but which is basically, I call it the god of unhealthy competition. Yeah, let's go to Moloch. What's Moloch? You did a great video on Moloch, one aspect of it, the application of it to one aspect of our culture: Instagram beauty filters. True. Very niche. I wanted to start off small. So, Moloch was originally coined,
Starting point is 01:19:59 well, so it's apparently back in the, like, Canaanite times, it was this ancient Carthaginian... I can never say it, Carthaginian... somewhere around, like, 300 BC or 200 AD, I don't know. There was supposedly this death cult who would sacrifice their children to this awful demon god thing they called Moloch, in order to get power to win wars.
Starting point is 01:20:24 So, really dark, horrible things, and it was literally about child sacrifice. Whether they actually existed or not, we don't know, but in mythology they did. And this god that they worshiped was this thing called Moloch. And then, I don't know, it seemed like it was kind of quiet throughout history in terms of mythology beyond that,
Starting point is 01:20:40 until this movie Metropolis, in 1927, talked about this... You see, there was this incredible futuristic city that everyone was living great in. But then the protagonist goes underground, into the sewers, and sees that the city is run by this machine. And this machine basically would just, like, kill the workers all the time, because it was just so hard to keep it running. They were always dying, so there was all this suffering that was required in order to keep the city going.
Starting point is 01:21:09 And then the protagonist has this vision that this machine is actually this demon, Moloch. So again, it's this sort of mechanistic consumption of humans in order to get more power. And then Allen Ginsberg wrote a poem in the 50s, the incredible poem called Howl, about this thing, Moloch. And a lot of people, sort of quite understandably,
Starting point is 01:21:31 take the interpretation of that as him talking about capitalism. But then the pièce de résistance, that moved Moloch into this idea of game theory, was when Scott Alexander of Slate Star Codex wrote this piece, which, literally, I think might be my favorite piece of writing of all time. It's called Meditations on Moloch. Everyone must go read it. Slate Star Codex is a blog. It's a blog, yes. We can link to it in the show notes or something, right?
Starting point is 01:22:01 No? Yes, yes. You assume I have a professional operation going on. I shall try to remember. You're giving the impression of it. Yeah, I'll look into it.
Starting point is 01:22:17 If I don't, please, somebody in the comments remind me. Yes, if you don't know this blog, it's one of the best blogs ever, and you should probably be following it. Are blogs still a thing? I think they are still a thing. Yeah, he's migrated onto Substack, but yeah, it's still a blog. Substack better not fuck things up. I hope not. Yeah, I hope they don't turn Moloch-y. Which will mean something to people once we explain it. Once we stop interrupting ourselves, yes. So anyway, he writes this piece, Meditations on Moloch.
Starting point is 01:22:50 And basically, he analyzes the poem and is like, okay, it seems to be something relating to where competition goes wrong. And Moloch was historically this thing where people would sacrifice something they care about, in this case their own children, in order to gain power, a competitive advantage. And if you look at almost everything that goes wrong in our society, it's that same process. So with the Instagram beauty filters thing: if you're trying to become a famous Instagram model, you are incentivized to post the hottest pictures of yourself that you can. You're trying to play that game. There's a lot
Starting point is 01:23:32 of hot women on Instagram. How do you compete against them? You post really hot pictures, and that's how you get more likes. As technology gets better, makeup techniques come along, and more recently these beauty filters, where at the touch of a button it makes your face look absolutely incredible compared to your natural face. These technologies come along, and everyone is incentivized to use that short-term strategy. But on net, it's bad for everyone, because now everyone kind of feels like they have to use these things.
Starting point is 01:24:06 And the reason why I talked about them in this video is because I noticed it myself. You know, I was trying to grow my Instagram for a while, I've given up on it now, but I noticed these filters, and how good they made me look. And I'm like, well, I know that everyone else is kind of doing it.
Starting point is 01:24:21 You all should subscribe to Liv's Instagram. Please, so I don't have to use the filters. I'll post a bunch of, yeah, make it blow up. Well, that just adds to the pressure, actually. Exactly. These short-term incentives to do this thing that either sacrifices your integrity, or something else, in order to stay competitive, which in aggregate
Starting point is 01:24:46 creates this race-to-the-bottom spiral, where everyone ends up in a situation which is worse than before they started. Kind of like in a football stadium. The system is so badly designed, a competitive system of everyone sitting and having a view, that if someone at the very front
Starting point is 01:25:05 stands up to get an even better view, it forces everyone else behind to adopt that same strategy, just to get back to where they were before. But now everyone's stuck standing up. So you need this, like, top-down, God's-eye coordination to make it go back to the better state. But from within the system, you can't actually do that. So that's kind of what this Moloch thing is. It's this thing that makes people sacrifice values in order to optimize for winning the game in question, the short-term game.
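The stadium example is simple enough to simulate. A minimal sketch, with an invented utility function: standing always improves your own payoff slightly given what everyone else is doing, but everyone standing imposes a cost on all, so myopic best responses converge to an all-standing state that is worse for everyone than all-seated.

```python
def view(standing_self: int, others_standing: int, n: int) -> float:
    """Toy 'view quality' utility (made-up numbers): +2 if you stand,
    minus an obstruction/effort cost that grows with how many stand."""
    return 2 * standing_self - 3 * (others_standing + standing_self) / n

def best_response_dynamics(n: int = 100, rounds: int = 5) -> list:
    """Each agent repeatedly switches to whichever action gives a better
    personal payoff, holding everyone else fixed (myopic best response)."""
    standing = [0] * n  # start in the cooperative state: everyone seated
    for _ in range(rounds):
        for i in range(n):
            others = sum(standing) - standing[i]
            standing[i] = int(view(1, others, n) > view(0, others, n))
    return standing

crowd = best_response_dynamics()
print(sum(crowd))        # 100: everyone ends up standing
print(view(1, 99, 100))  # -1.0: each agent's payoff in the trap
print(view(0, 0, 100))   # 0.0: payoff if everyone had stayed seated
```

From inside the dynamics, no individual can fix it: an agent who sits back down while the other 99 stand only lowers their own payoff, which is exactly the "top-down, God's-eye coordination" point.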
Starting point is 01:25:22 But this Moloch, can you attribute it to any one centralized source, or is it an emergent phenomenon from a large collection of people? Exactly that. It's an emergent phenomenon. It's a force of game theory. It's a force of bad incentives on a multi-agent system. You know, the prisoner's dilemma is technically a kind of Moloch system as well, but it's just a two-player thing. Another word for Moloch is a multipolar trap. Basically, you've just got a lot of different people all competing for some kind of prize. And it would be better if everyone didn't do this one shitty strategy, but because that strategy gives you a short-term advantage, everyone's incentivized to do it, and so everyone ends up doing it. So, there's responsibility for...
Starting point is 01:26:16 Social media is a really nice place for a large number of people to play game theory. And they also have the ability to design the rules of the game. Is it on them to try to anticipate, to do the thing that poker players do and run simulations? Ideally, that would have been great. If Mark Zuckerberg and Jack and all the Twitter founders and everyone,
Starting point is 01:26:42 if they had at least just run a few simulations of how different types of algorithms would turn out for society, that would have been great. That's really difficult, to do that kind of deep philosophical thinking about humanity, actually. So not this level of, how do we optimize engagement, or what brings people joy in the short term, but: how is this thing going to change the way people see the world? How is it going to get morphed, through iterated games, into something that will change society forever?
Starting point is 01:27:22 That requires some deep thinking. I hope there are meetings like that inside companies, but I haven't seen them. That's the problem. And it's difficult, because when you're starting up a social media company, you're aware that you've got investors to please, there are bills to pay, and there's only so much R&D you can afford to do. You've got all these incredible pressures, bad incentives, to just get on and build your thing as quickly as possible and start making money. And I don't think anyone intended,
Starting point is 01:27:54 when they built these social media platforms... And just to preface this: the reason why social media is relevant is because it's a very good example of how everyone these days is optimizing for clicks. Whether it's the social media platforms themselves, because every click gets more impressions, and impressions get advertising dollars; or whether it's individual influencers, or the New York Times, or whoever, trying to get their story to go viral. So everyone's got this bad incentive of using, as you called it,
Starting point is 01:28:25 the clickbait industrial complex. That's a very Moloch-y system, because everyone is now using worse and worse tactics in order to try and win this attention game. So, ideally, these companies would have had enough slack in the beginning to run these experiments to see, okay, what are the ways this could possibly go wrong for people?
Starting point is 01:28:50 They should be aware of this concept of Moloch, and realize that whenever you have a highly competitive multi-agent system, which social media is a classic example of, millions of agents all trying to compete for likes and so on, and you try and bring all this complexity down into very small metrics, such as number of likes, number of retweets, whatever the algorithm optimizes for, that is a guaranteed recipe for this stuff to go wrong and become a race to the bottom.
Starting point is 01:29:29 I think there should be an honesty among founders. I think there's a hunger for that kind of transparency, of, like: we don't know what the fuck we're doing. This is a fascinating experiment we're all running as a human civilization. Let's try this out. And, like, actually just be honest about this: that we're all like these weird rats in a maze, and none of us are controlling it. There's this kind of sense that the founders, the CEO of Instagram, or whatever, Mark Zuckerberg, has control, like he's playing people with strings. No, he's at the mercy of this, like everyone else. He's just trying to do his best. And, like, I think putting on a smile and doing overpolished videos about how Instagram and Facebook are good for you, I think, is not the right way to actually ask some of the deepest questions we get to ask as a society: how do we design the game such that we build a better world?
Starting point is 01:30:13 I think a big part of this as well is that there's this philosophy, particularly in Silicon Valley, of, well, techno-optimism: technology will solve all our issues. And there's a steelman argument to that, where, yes, technology has solved a lot of problems and can potentially solve a lot of future ones. But it's always a double-edged sword, and particularly as technology gets more and more powerful, and we've now got, like, big data, and we're able to do all kinds of psychological manipulation with it, and so on.
Starting point is 01:30:51 Technology is not a values-neutral thing. I used to always think this myself; it's this naive view that, oh, technology is completely neutral, it's just the humans that either make it good or bad. No. At the point we're at now, the technologies that we are creating are social technologies. They literally dictate how humans form social groups, and so on.
Starting point is 01:31:17 And if you have the stack that way around, where it's technology driving social interaction, which then drives memetic culture and which ideas become popular, that's Moloch. And we need it the other way around. We need to first figure out what are the good memes, what are the good values that we think make people happy and healthy and keep society as robust and safe as possible; then figure out what the social structures around those should be; and only then do we figure out the technology. But we're doing it the other way around.
Starting point is 01:31:51 And as much as I love, in many ways, the culture of Silicon Valley, and I do think that technology... I don't want to knock it; it's done so many wonderful things for us, same as capitalism. But we have to be honest with ourselves: we're getting to a point where we are losing control of this very powerful machine that we have created. Can you redesign the machine within the game? Can you just... can you understand the game enough? Okay, this is the game, and this is how we start to
Starting point is 01:32:26 Re-emphasize the memes that matter The the memes them bring out the best in us You know like the way I try to be in real life and the way I try to be online Is to be about kindness and love and I feel like I'm Sometimes get like criticized or being naive and all those kinds of things, but I feel like I'm sometimes get like criticized or being naive and all those kinds of things, but I feel like I'm just trying to live within this game. I'm trying to be authentic. Yeah, but also like, hey, it's kind of fun to do this. Like, you guys should try this too, you know, that and that's like trying to redesign some aspects of the game within the game.
Starting point is 01:33:05 Is that possible? I don't know, but I think we should try. I don't think we have an option but to try. Well, the other option is to create new companies, or to pressure companies, or anyone who has control of the rules of the game. I think we need to be doing all of the above. I think we need to be thinking hard about what the kinds of positive, healthy memes are. As Elon said, who controls the memes controls the universe. He said that? I think he did.
Starting point is 01:33:39 But there's truth to that. There is wisdom in that, because memes have driven history. We are a cultural species. That's what sets us apart from chimpanzees and everything else. We have the ability to learn and evolve through culture, as opposed to biology or, you know, classic physical constraints. And that means culture is incredibly powerful, and we can create and
Starting point is 01:34:08 become victim to very bad memes or very good ones. But we do have some agency over which memes, we, we, but not only put out there, but we also like subscribe to. So I think we need to take that approach. We also need to, you know, because I don't want the, the, the, the, you know, I'm making this video right now called the Attention Wars, which is about, like, how molluck, it's like the media machine is this molluck machine. Well, is this, is this kind of, like, blind dumb thing that where everyone is optimizing for engagement in order to win their share of the attention pie. And then if you zoom out, it's really like molluck that's pulling the strings because the only thing that benefits from this in the end.
Starting point is 01:34:46 You know, like, oh, our information ecosystem is breaking down. Look at the state of the US: we're in a civil war. It's just not a physical war, it's an information war. And people are becoming more fractured in terms of what their actual shared reality is. Like, truly, an extreme-left person and an extreme-right person literally live in different worlds in their minds at this point, and it's getting more and more amplified. And this force is like a razor blade
Starting point is 01:35:15 pushing through everything. It doesn't matter how innocuous a topic is, it'll find a way to split into this bifurcated culture war and it's fucking terrifying. Because that maximizes attention, and that's like an emergent malloc type force that takes anything, any topic and cuts through it so that you can split nicely into two groups. One that's... Well, it's whatever, yeah, it's...
Starting point is 01:35:41 All everyone is trying to do within the system is just maximize whatever gets them the most attention because they're just trying to make money so they can keep their thing going, right? And the way the the best emotion for getting attention in well because it's not just about attention on the internet, it's engagement, that's the key thing, right? In order for something to go viral, you need people to actually engage with it, they need to like comment or retweet or whatever. actually engage with it, they need to like comment or retweet or whatever. And of all the emotions, there's like seven classic shared emotions that studies have found that all humans even from like un, previously uncontacted tribes have. Some of those are negative, you know, like sadness, disgust, anger, etc., some positive happiness, excitement, and so on. The one that happens
Starting point is 01:36:28 to be the most useful for the internet is anger, because anger is such an active emotion. If you want people to engage, if someone's scared, and I'm not just talking out my ass here, there are studies here that have looked into this, where it's like if's like disgusted or fearful, they actually tend to then be like, I don't want to deal with this. So they're less likely to actually engage and share it and so on. They're just going to be like, whereas if they're enraged by a thing, or now that like that trick is all the like the old tribalism emotions. And so that's how then things get sort of spread much more easily. They outcompete all the other memes in the ecosystem. And so this, like, the attention economy, the wheels that make it go around is rage.
Starting point is 01:37:15 I did a tweet. The problem with raging against the machine is that the machine is learned to feed off rage because it is feeding off our rage. That's the thing that's now keeping it going. So the more we get angry, the worse it gets. So the mollic in this attention and the war of attention is constantly maximizing rage. What it is optimizing for is engagement
Starting point is 01:37:38 and it happens to be that engagement is well propaganda. I mean, it just sounds like everything is putting, more and more things being put through this propaganda lens of winning whatever the war is in question, like it's the culture war or the Ukraine war. Well, I think the silver lining of this, do you think it's possible that in the long arc of this process, you actually do arrive at greater wisdom and more progress. It just in the moment, it feels like people are turning each other to shreds over ideas, but if you think about it, one of the magic things about democracy and so on,
Starting point is 01:38:15 is you have the blue versus red constantly fighting. It's almost like they're in this course creating devil's advocate, making devil's out of each other and through that process, discussing ideas, like almost really embodying different ideas, just to yell at each other and through the yelling over the period of decades, maybe centuries figuring out a better system. Like in the moment, it feels fucked up, but in the long
Starting point is 01:38:43 arc, it actually is productive. I hope so. That said, we are now in the era of just as we have weapons of mass destruction with nuclear weapons, you know, that can break the whole playing field. We now are developing weapons of informational mass destruction, information white pins. WMDs that basically can be used for propaganda or just manipulating people, however, they, you know, is needed, whether that's through dumb TikTok videos or, you know, there are significant resources being put in. I don't mean to sound like, you know,
Starting point is 01:39:26 too doom and gloom, but there are bad actors out there. That's the thing. There are plenty of good actors within the system who are just trying to stay afloat in the game. So effectively doing monarchy things. But then on top of that, we have actual bad actors who are intentionally trying to like manipulate the other side into doing things.
Starting point is 01:39:42 And using, so because of the digital space, they're able to use artificial actors, meaning bots. Exactly, botnets, you know, and this is a whole new situation that we've never had before. Yeah, it's exciting. You know what I want to do? You know what I want to do that?
Starting point is 01:40:01 Because there is, you know, people are talking about bots manipulating and have like malicious bots that are basically spreading propaganda, I want to create like a bot army for like, that like fights that, yeah, exactly for love that fights though that, I mean, you know, there's, I mean, there's truth to fight fire with fire. It's like, but how you always have to be careful whenever you create, again, like, molluck is very tricky. Yeah, Hitler was trying to spread the love too. Well, yeah, sorry, thought.
Starting point is 01:40:30 But, you know, I agree with you that, like, that is a thing that should be considered, but there is, again, everyone, the road to hell is paved in good intentions. And this is, there's always unforeseen outcomes, externalities of you trying to adopt a thing, even if you do it in the very best of faith. But you can learn lessons of history. If you can run some sims on it first, absolutely. But also there's certain aspects of a system as we've learned through history that do better than others. For example, don't have a dictator.
Starting point is 01:41:01 So if I were to create this bot army, it's not good for me to have full control over it. Because in the beginning, I might have a good understanding of what's good and not. But over time, that starts to get deviated because I'll get annoyed at some assholes, and I'll think, okay, it wouldn't be nice to get rid of those assholes, but then that power starts getting to your head, you become corrupted. That's basic human nature. So distribute the power. We need a love botnet on a dow. A dow love botnet. Yeah, but and without a leader, like without was actually a distributed right without any kind of centralized. Yeah, without even, you know, basically is the more control, the more you can decentralize the control of a thing to people, you know, but the, the, the, the, the, the, the, the, the, the ability
Starting point is 01:41:50 to coordinate because that's the issue when you, if something, something is too, you know, that's really, to me, like the culture wars is the bigger war we're dealing with is actually between the pat, like the sort of the, I don't know what even the term is for it, but like centralization versus decentralization, that's the tension we're seeing, power in control by a few versus completely distributed. And the trouble is if you have a fully centralized thing, then you're at risk of tyranny, you know, Stalin type things can happen, or completely distributed.
Starting point is 01:42:21 Now you're at risk of complete anarchy and chaos where you can't even coordinate to like on, you know, when there's like a pandemic or anything like that. So it's like, what is the right balance to strike between these two structures? Camp Malik really take hold in an fully decentralized system. That's one of the dangers too. Very vulnerable to malik. So the dictator can commit huge atrocities, but they can also make sure the infrastructure works and train as well. They have that God's eye view, at least.
Starting point is 01:42:49 So they have the ability to create like laws and rules, like force coordination, which stops Molek. But then you're vulnerable to that dictator getting infected with like this, with some kind of psychopathy type thing. What's reverse Molek? So great question. So that's where I've been working on this series. It's been driving me insane for the last year and a half.
Starting point is 01:43:13 I did the first one a year ago. I can't believe it's nearly been a year. The second one, hopefully, we're coming out in like a month. And my goal at the end of the series is to like present, because basically I'm painting the picture of like what molecules and how it's affecting almost all these issues in our society and how it's, you know, driving. It's like kind of the generator function as people describe it of existential risk. And then at the end of that.
Starting point is 01:43:37 Wait, wait, wait, the generator function of existential risk. So you're saying, molecules sort of the engine that creates a bunch of exeriscs. Yes, not all of them. Like a, you know, a... It's a cool phrase, generator function function. It's not my phrase, it's Daniel Schmackt and Burger. Oh, sure. I got that from him, of course.
Starting point is 01:43:53 All things, it's like all roads lead back to Daniel Schmackt and Burger, I think. The dude is brilliant. He's really... After that, it's Mark Twain. Anyway, sorry. Totally rudelys for me. No, it's fine.
Starting point is 01:44:07 So not all likes risks. So an asteroid technically isn't because it's just like this one big external thing. It's not like a competition thing going on. But synthetic bio weapons, that's one because everyone's incentivized to build even for defense, bad viruses, just threaten someone else, etc. or AI, technically, the race to AGI is potentially a molecule situation. But yeah, so if molecule is this generator function that's driving all of these issues over
Starting point is 01:44:41 the coming century that might wipe us out. What's the inverse? And so far, what I've gotten to is this character that I want to put out there called win-win. Because Mollik is the God of Lose Lose, ultimately. It masquerades is the God of Win-Lose, but in reality, it's Lose Lose. Everyone ends up worse off. So I was like, well, what's the opposite of that?
Starting point is 01:45:00 It's win-win. And I was thinking for ages, like, what's a good name for this character? And then the morrow's like, okay, well, I don't want to try and think through it logically. What's the vibe of win-win? And to me, in my mind, Mollik is like, and I address as it in the video,
Starting point is 01:45:15 it's red and black, it's kind of like very, my hyper focused on it's one goal you must win. So win-win is kind of, actually, these colors. It's like purple turquoise. It's loves games too. It loves a little bit of healthy competition but constrained, like kind of like before, like knows how to ring fence zero some competition into just the right amount whereby its externalities can be controlled and kept positive. And then beyond that, it also loves cooperation, coordination, love, all these other things.
Starting point is 01:45:48 But it's also kind of mischievous. It will have a good time. It's not boring. Oh, god. It's had to have fun. It can get down. But ultimately, it's unbelievably wise, and it just wants the game to keep going.
Starting point is 01:46:06 And I call it win-win. That's a good pet name. Win-win. I think the win-win, right? And I think it's formal name when it has to do like official functions is omnia. Omnia. Yeah. From like omniscience, kind of what's a white omnia?
Starting point is 01:46:24 It's like omni-win. Omni-win. But I'm an missions kind of what's with why I'm yeah, you just like I'm yeah, I'm new when But I'm open to suggestions. I'm like, you know, and this is I'll come yeah. Yeah, yeah Like there's an angelic kind of sense to Omnia though. So win win is more fun. So it's more like a It embraces the The fun aspect I mean there is something about sort of There's some aspect to win, win interactions that requires embracing the chaos of the game and enjoying the game itself. I don't know. I don't know what that is. That's almost like a zen-like appreciation of the game itself, not optimizing for the consequences of the game.
Starting point is 01:47:06 Right. Well, it's recognizing the value of competition in of itself, about it's not like about winning. It's about you enjoying the process of having a competition and not knowing whether you're going to win or lose this little thing. But then also being aware that, you know, what's the boundary? How big do I want competition to be? Because where what's the boundary, how big do I want competition to be? Because one of the reason why Molek is doing so well now in our civilization is because we haven't been able to ring fence competition. And so it's just having all these negative externalities.
Starting point is 01:47:35 And we've completely lost control of it. It's, I think my guess is, and now we're getting really like metaphysical, technically. But I think we'll be in a more interesting universe if we have one that has both pure cooperation, you know, lots of cooperation and some pockets of competition than one that's purely competition, cooperation entirely. Like, it's good to have some little zero- zero sum this bits, but I don't know that fully and I'm not qualified as a philosopher to know that.
Starting point is 01:48:09 And that's what reverse mollux, so this kind of win-win creature is an system as an antidote to the mollux system. Yes. And I don't know how it's going to do that. But it's good to kind of try to start to formulate different ideas, different frameworks of how we think about that. Exactly. At the small scale of a collection of individuals or our scale of a society.
Starting point is 01:48:33 Exactly. It's a meme. I think it's an example of a good meme. And I'm open. I'd love to hear feedback from people if they think it's, you know, they have a better idea or it's not, you know, but it's the direction of memes that we need to spread this idea of like look for the win-wins in life. Well, on a topic of beauty filters, so in that particular context where Mollick creates negative consequences, what do you know, the St. Husky said, beauty will save the world.
Starting point is 01:49:02 What is beauty anyway? What It would be nice to just try to discuss what kind of thing we would like to converge towards in our understanding of what is beautiful. So to me, I think something is beautiful when it can't be reduced down to easy metrics. Like if you think of a tree, what is it about a tree, like a big ancient beautiful tree, right? What is it about it that we find so beautiful? It's not, you know, the, you know, what are the sweetness of its fruit or the value of its lumber. It's, it's this entirety of it that is, there's these immeasurable qualities. It's like almost like
Starting point is 01:49:56 a qualia of it. That's both, like it walks this fine line between pattern, what it's got, lots of patternicity, but it's not overly predictable. You know, again, it walks this fine line between pattern, it's got lots of patonicity, but it's not overly predictable. Again, it walks this fine line between order and chaos. It's a very highly complex system. And you can't, it's evolving over time. The definition of a complex versus, and this is another Schmackt and Burger thing, a complex versus a complicated system.
Starting point is 01:50:22 A complicated system can be sort of broken down into bits, understood, and then put that together. A complex system, it's kind of like a complicated system. A complicated system can be sort of broken down into bits, understood and then put that together. A complex system, it's kind of like a black box. It does all this crazy stuff, but if you take it apart, you can't put it back together again because there's all these intricacies. And also very importantly, there's some of the parts, sorry, there's some of the whole
Starting point is 01:50:40 is much greater than the some of the parts. And that's where the beauty lies, I think. And I think that extends to things like art as well. Like, there's something immeasurable about it. There's something we can't break down to, a narrow metric. Does that extend to humans, you think? Yeah, absolutely.
Starting point is 01:50:58 So how can Instagram reveal that kind of beauty, the complexity of a human being? Good question. And this takes us back to our dating sites and good reads, I think. Very good question. I mean, well, I know what it shouldn't do. It shouldn't try and like, right now, you know,
Starting point is 01:51:18 one of the, I was talking to like a social media expert recently because I was like, I hate that. It's such a things the social media expert. Oh, yeah. You, there are like, I hate that. There's such a things the social media expert. Oh, yeah. You, there are like agencies out there that you can like outsource, because I'm thinking about working with one. So like, I so I want to start a podcast.
Starting point is 01:51:33 Yeah. You should. You should have done it a long time ago. Working on it. It's going to be called win-win. Nice. It's going to be about this like positive stuff. And the thing that, you know, they all come back and say,
Starting point is 01:51:46 well, you need to like figure out what your thing is. You know, you need to narrow down what your thing is and then just follow that. Have a, like, a sort of a formula because that's what people want. They want to know that they're coming back to the same thing. And that's the advice on YouTube, Twitter, you name it. And that's why, and the trouble with that is that it's a complexity reduction.
Starting point is 01:52:07 And generally speaking complexity reduction is bad. It's making things more, it's an oversimplification. Not that simplification is always a bad thing. When you're trying to take, what is social media doing? It's trying to encapsulate the human experience and put it into digital form and commodify it to an extent. So you do that, you compress people down into these narrow things. And that's why I think it's kind of ultimately fundamentally incompatible with at least my definition of beauty. It's interesting because there is some sense in which a simplification, sort of in the
Starting point is 01:52:47 Einstein kind of sense, of a really complex idea, a simplification in a way that still captures some core power of an idea of a person is also beautiful. And so maybe it's possible for social media to do that. A presentation, a sort of a slither, a slice, a look into a person's life that reveals something real about them. But in a simple way, in a way that can be displayed graphically or through words, some way, I mean, in some way, Twitter can do that kind of thing. A very few set of words can reveal the intricacies of a person. Of course, the viral machine that spreads those words often results in people taking the thing out of context. Often don't read tweets in the context
Starting point is 01:53:43 of the human being that wrote them. The full history of the tweets that were in the education level, the humor level, the world view they're playing around with, all that context is forgotten and people just see the different words. So that can lead to trouble. But in a certain sense, if you do take it in context, it reveals some kind of quirky little beautiful idea or profound little idea from that particular person that shows something about that person. So in that sense, Twitter can be more successful. If we talk about Mollix,
Starting point is 01:54:17 is driving a better kind of incentive? Yeah, I mean, how they can like if we were to rewrite, it is there a way to rewrite the Twitter algorithm so that it stops being the like the fertile breeding ground of the culture wars because that's really what it is. It's, I mean, maybe I'm giving it, you know, Twitter too much power, you know, power, but just the more I looked into it and I had conversations with Tristan Harris from Central Fumane Technology and he explained it as like, Twitter is where you have this amalgam of human culture and then this terribly designed algorithm that amplifies the craziest people. And the angriest, the angriest, most divisive takes and amplifies them. And then the media, the mainstream media, because all the journalists are also on Twitter,
Starting point is 01:55:14 they then are informed by that. And so they draw out the stories they can from this already like very boiling lava of rage and then spread that, you know, to their millions and millions of people who aren't even on Twitter. And so I honestly, I think if I could press a button, turn them off, probably would at this point, because I just don't see a way at being compatible with healthiness, but that's not going to happen. And so at least one way to like stem the tide and make it less molecule would be to change, at least if it was on a subscription model, then it's now not optimizing for, you know, impressions,
Starting point is 01:56:01 because basically what it wants is for people to keep coming back as often as possible. That's how they get paid, right? Every time an ad gets shown to someone and the way is to get people constantly refreshing their feed. So you're trying to encourage addictive behaviors. Whereas if someone, if they moved on to at least a subscription model, then they're getting the money either way, whether someone comes back to the site once a month or 500 times a month. They get the same amount of money. So now that takes away that incentive, to use technology to design an algorithm that
Starting point is 01:56:33 is maximally addictive. That would be one way, for example. Yeah, but you still want people to... Yeah, I just feel like they're just slows, creates friction in the virality of things. But that's good. We need to slow down virality. It's good. It's one way. Virality is mollock to be clear.
Starting point is 01:56:55 So mollock is always negative then. Yes. By definition. Yes. But then I disagree with you. Competition is not always negative. Competition is neutral. I disagree with you. No, always negative competition is neutral I disagree with you that all virality is negative then as malloc them because I It's a good intuition because we have a lot of data on virality being negative But I happen to believe that the core of human beings so most human beings want to be good
Starting point is 01:57:26 More than they want to be bad to each other. And so I think it's possible, it might be just harder to engineer systems that enable virality, but it's possible to engineer systems that are viral, that enable virality, and the kind of stuff that rises to the top is things that are positive and positive not like Lala positive is more like win-win meaning a lot of people need to be challenged What why is things? Yeah, you grow from it to my challenging You might not like it, but you often the grow from it And ultimately bring people together as opposed to tell them apart. Yeah, I
Starting point is 01:58:04 Deeply want that to be true. And I very much agree with you that people at their core are on average good, as opposed, you know, care for each other as opposed to not. Like, you know, I think it's actually a very small percentage of people are truly like wanting to do just like destructive malicious things. Most people are just trying to win their own little game. And they don't mean to be, you know, they're just stuck in this badly designed system. That said, the current structure, yes,
Starting point is 01:58:29 is the current structure means that virality is optimized towards molloc. That doesn't mean that on exceptions, you know, sometimes positive stories do go viral. And I think we should study them. I think there should be a whole field of study into understanding, you know, identifying memes that, you know, above a certain threshold of the population, agree as a positive, happy, bringing people together meme, the kind of thing that, you know, brings families together
Starting point is 01:58:54 that would normally argue about cultural stuff at the table, at the dinner table. Identify those memes and figure out what it was, what was the ingredient that made them spread that day? And also, like, not just like happiness and connection between humans, but connection between humans in other ways that enables like productivity, like cooperation, solving difficult problems and all those kinds of stuff. You know, so it's not just about let's be happy
Starting point is 01:59:24 and have a fulfilling lives. It's also like, let's be happy and have a fulfilling lives. It's also like let's build cool shit. Yeah. Which is the spirit of collaboration, which is deeply anti-molic, right? That's, it's not using competition. It's like, you know, more like hates collaboration and coordination and people working together. And that's, you know, again, like the internet started out as that and it could have been that, but because of the way it was sort of structured in terms of, you know, very lofty ideal.
Starting point is 01:59:51 They wanted everything to be open source or open source and also free, but they needed to find a way to pay the bills anyway because they were still building this on top of our old economics system. And so the way they did that was through third party advertisement, but that meant the things were very decoupled. You know, the, you've got this third party interest, which means that you're then like people are having to optimize for that. And that is, you know, the actual consumer is actually the product, not the, not the, the, the, the person you're making the thing for. You're in, in the the end you stop making the thing for the advertiser. And so that's why it then breaks down.
Starting point is 02:00:28 Yeah, like it's, there's no clean solution to this. And I, it's a really good suggestion by you actually to like figure out how we can optimize virality for positive sum. Topics. I shall be the general of the love bot army. Distributed. Distributed. No, okay, yeah, the power just even insane at the power already went to my head. No, okay, you've talked about quantifying your thinking.
Starting point is 02:01:00 We've been talking about this sort of a game theoretic view on life and putting probabilities behind estimates Like if you think about different trajectories you can take through life Just actually analyzing life and game-threatic way like your own life like personal life You I think you've given an example that you had an honest conversation with Igor about like how long is this relationship? I'm gonna last So similar to our sort of marriage problem kind of discussion, having an honest conversation about the probability
Starting point is 02:01:30 of things that we sometimes are a little bit too shy or scared to think of in a probabilistic terms. Can you speak to that kind of way of reasoning, the good and the bad of that? Can you do this kind of thing with human relations? Yeah, so the scenario you're talking about, it was like. Yeah, tell me about that. Yeah. I think it was about a year into our relationship
Starting point is 02:01:55 and we were having a fairly heavy conversation because we were trying to figure out whether or not I was gonna sell my apartment, but he had already moved in, but I think we were just figuring out what like our long-term plans would be. Should be, should be by a place together, et cetera. When you guys are having that conversation,
Starting point is 02:02:12 are you like drunk out of your mind on wine or is you sober and you're actually having a serious, like how do you get to that conversation? Because most people are kind of afraid to have that kind of serious conversation. Well, so, you know, our relationship was very, well, first of all, we were good friends for a couple of years before we even got, you know, in romantic. And when we did get romantic, it was very clear that this was a big deal.
Starting point is 02:02:40 It wasn't just like another, you know, it wasn't a random thing. And so the probability of it being a big deal was already very high. And then we had been together for a year, and it had been pretty golden and wonderful. So, you know, there was a lot of foundation already, where we felt very comfortable having a lot of frank conversations. But Igor's MO has always been much more than mine. He was always, from the outset, like, in a relationship, radical transparency and honesty is the way, because the truth is the truth, whether you want to hide it or not,
Starting point is 02:03:12 you know, it will come out eventually. And if you aren't able to accept difficult things yourself, then how could you possibly expect to be the most integral version of yourself? You can't; the relationship needs this bedrock of honesty as a foundation more than anything. Yeah, it's really interesting. I would like to push against some of those ideas, but let's... I don't want to derail your story. Sorry, I just brutally interrupted. That's fine. And so, you know, we'd been together for about a year, things were good, and we were having this hard conversation.
Starting point is 02:03:48 And then he was like, well, okay, what's the likelihood that we're going to be together in three years then? Because I think it was roughly a three-year time horizon. And I was like, oh, interesting. And then we were like, actually, wait, before you say that, let's both write down our predictions formally. Because we were just getting into like effective altruism and rationality at the time, which is all about making
Starting point is 02:04:07 formal predictions as a means of measuring your own, well, your own foresight, essentially, in a quantified way. So we like both wrote down our percentages, and we also did a one year prediction and a 10 year one as well. So we got percentages for all three, and then we showed each other. And I remember having this moment of like, ooh, because I went for the 10-year one,
Starting point is 02:04:32 I was like, oh, well, I mean, I love him a lot, but like a lot can happen in 10 years, you know? And we've only been together for, you know, so I was like, I think it's over 50%, but it's definitely not 90%. And I remember like wrestling, I was like, oh, but I don't want him to be hurt. I don't want him to, you know,
Starting point is 02:04:46 I don't want to give a number lower than his. And I remember thinking I was like, ah, don't game it. This is an exercise in radical honesty. So just give your real percentage. And I think mine was like 75%. And then we showed each other and luckily we were fairly well aligned.
Starting point is 02:05:02 But honestly... what if you'd been off by, like, 20%? Huh? I definitely... it definitely would have... if his had been consistently lower than mine, that would have rattled me for sure. Whereas if it had been the other way around, I think he would... he's just kind of a water-off-a-duck's-back type of guy. He'd be like, okay, well, all right, we'll figure this out. Well, did you guys provide error bars on the estimate, like the level of uncertainty?
Starting point is 02:05:23 They came built in. We didn't give formal plus-or-minus error bars. We didn't draw any or anything like that. Well, I guess the question I have is, did you feel informed enough to make such decisions? Because I feel like if I were to do this kind of thing rigorously, I would want some data. I would want to say, one of the assumptions you have is that you're not that different from other relationships. Right. And so I want
Starting point is 02:05:52 to have some data about the base rates. Yeah. And also actual trajectories of relationships. I would love to have time-series data about the ways relationships fall apart or prosper, how they collide with different life events: losses, job changes, moving, both partners finding jobs or only one having a job. I want that kind of data, and how often the different trajectories change in life. Like, how informative is your past to your future? That's a whole thing. Like, can you look at my life and have a good prediction, in terms
Starting point is 02:06:35 of my characteristics and my relationships, about what that's going to look like in the future or not? I don't even know the answer to that question. I'd be very ill-informed in terms of making the probability. I would just be under-informed. I'd be over-biasing to my prior experiences, I think. Right, but as long as you're aware of that,
Starting point is 02:06:58 and you're honest with yourself, and you're honest with the other person, and say, look, I have really wide error bars on this for the following reasons, that's okay. I still think it's better than not trying to quantify it at all, if you're trying to make really major, irreversible life decisions. And I feel also the romantic nature of that question.
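The forecasting exercise described here, writing down a probability and later checking it against what actually happened, is commonly scored with something like a Brier score. A minimal sketch follows; the `brier_score` helper and every forecast in it are hypothetical illustrations, not numbers from the actual conversation:

```python
# Minimal sketch of scoring probabilistic predictions with the Brier score.
# All forecasts and outcomes below are hypothetical examples.

def brier_score(predictions):
    """Mean squared error between forecast probability and outcome (0 or 1).
    0.0 is a perfect score; always answering 50% scores 0.25."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# (forecast probability, what actually happened: 1 = yes, 0 = no)
my_forecasts = [
    (0.75, 1),  # "75% we're still together in 10 years" -> were
    (0.90, 1),  # "90% still together in 3 years" -> were
    (0.60, 0),  # "60% we'll buy a place this year" -> didn't
]

print(round(brier_score(my_forecasts), 4))
```

Lower is better: confident forecasts that turn out right drive the score toward zero, while confident forecasts that turn out wrong are penalized heavily, which is exactly what makes "don't game it, give your real percentage" the right strategy.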
Starting point is 02:07:14 For me personally, I try to live my life thinking it's very close to 100%. Like, allowing myself... actually, this is the difficulty of this: allowing myself to think differently, I feel like, has a psychological consequence. That's one of my pushbacks against radical honesty. It's one particular perspective on it. You're saying you would rather give a falsely high percentage to your partner? Going back to the wise sage Phil... Hellmuth.
Starting point is 02:07:50 Yes. Of fake it till you make it, the positive thing. Positive, yeah. Yeah, yeah. To the hashtag positivity. Well, so that, and this comes back
Starting point is 02:08:01 to this idea of useful fictions, right? And I agree, I don't think there's a clear answer to this, and I think it's actually quite subjective. It just works better for some people than for others. Yeah. You know, to be clear, Igor and I weren't taking this formal prediction too seriously. Like, we did it very much tongue in cheek. I don't think it even would have drastically changed
Starting point is 02:08:25 what we decided to do, even. We kind of just did it more as a fun exercise. But as a consequence of that fun exercise, there was actually a deep honesty to it too. Exactly, it was. And it was just this moment of reflection. I'm like, oh wow, I actually have to think
Starting point is 02:08:39 through this quite critically and so on. And what was also interesting was that I got to check in with what my desires were. So there was one thing of what my actual prediction was, but what were my desires? And could those desires be affecting my predictions and so on? And that's a method of rationality. And I personally don't think it loses anything in terms of,
Starting point is 02:09:02 it didn't take any of the magic away from our relationship, quite the opposite. It brought us closer together, because we did this weird, fun thing that, I appreciate, a lot of people would find quite strange. I think it was somewhat unique to our relationship: we both love numbers, we both love statistics, we're both poker players, so that was kind of our safe space anyway. For others, you know, one partner really might not like that kind of stuff at all, in which case it's not a good exercise to do. You know, I don't recommend it to everybody. But I do think, you know, it's interesting sometimes to poke holes in, to, you know,
Starting point is 02:09:41 probe at these things that we consider so sacred that we can't try to quantify them. Which is interesting, because that's in tension with the idea of what we just talked about with beauty, and what makes something beautiful: the fact that you can't measure everything about it. And perhaps some things shouldn't be measured. Maybe it's wrong to try and completely evaluate, in the utilitarian frame, the utility of a tree in its entirety. I don't know, maybe we should, maybe we shouldn't. I'm ambivalent on that. But overall, people are overly biased against trying to do a quantified cost-benefit analysis on really tough life decisions.
Starting point is 02:10:27 They're like, just go with your gut. It's like, well, sure, but our intuition is best suited for things that we've got tons of experience in; then we can really trust it, if it's a decision we've made many times. But if it's, should I marry this person, or should I buy this house over that house, you only make those decisions a couple of times in your life, maybe. Well, I would love to know... there's a balance, there's probably a personal balance to strike: the amount of rationality you apply to a question versus the useful fiction, the fake it till you make it. For example, just talking to soldiers in Ukraine, you ask them, what's the probability of Ukraine winning?
Starting point is 02:11:19 Almost everybody I talked to says 100%. Wow. And you listen to the experts, right? They say all kinds of stuff. First of all, the morale there is higher than probably... I've never been to a war zone before this, but I've read about many wars, and I think the morale in Ukraine is higher than almost anywhere I've read about. Every single person in the country is proud to fight for their country. Wow. Everybody, not just soldiers. Everybody.
Starting point is 02:11:50 Why do you think that is, specifically, more than, you know, in other wars? I think because there's perhaps a dormant desire for the citizens of this country to find the identity of this country, because it's been going through this 30-year process of different factions and political bickering. And they haven't had, as they talk about it, they haven't had their independence war. They say all great nations have had an independence war.
Starting point is 02:12:20 They had to fight for their independence, for the discovery of the identity, of the core, of the ideals that unify us. And they haven't had that. There's constantly been factions, there's been divisions, there's been pressures from empires, from the United States and from Russia, from NATO and Europe, everybody telling them what to do. Now they want to discover who they are. And there's that kind of sense that we're going to fight for the safety of our homeland, but we're also going to fight for our identity. And that's on top of the fact that, if you look at the history of Ukraine, and there's certain other countries like this, there are certain cultures that are feisty in their pride of being citizens of that nation. Ukraine is that, Poland was that; you just look at history.
Starting point is 02:13:14 In certain countries, you do not want to occupy. Right. I mean, both Stalin and Hitler talked about Poland in this way. They're like, this is a big problem. If we occupy this land for prolonged periods of time, they're gonna be a pain in the ass. Like, they're not going to want to be occupied. And certain other countries are pragmatic.
Starting point is 02:13:34 They're like, well, you know, leaders come and go, I guess this is fine. And, you know, Ukrainians, throughout the 20th century, don't seem to be the kind of people that just sit calmly and let the occupiers impose their rules.
Starting point is 02:13:54 That's interesting, though, because you said it's always been under conflict and leaders have come and gone, so you would expect them to actually be the opposite under that. Yeah, because, well, it's a very fertile land. It's great for agriculture, so a lot of people want it. I mean, I think they've developed this culture because they've constantly been occupied by different peoples. And so maybe there is something to that, where you've constantly had to feel, within the blood of the generations, the struggle against the man, against the imposition of rules, against oppression and all that kind of stuff, and that stays with them. So there's a will there. But, you know, a lot of other aspects
Starting point is 02:14:40 are also part of this that have to do with the reverse-Moloch kind of situation, where social media has definitely played a part of it. Also, different charismatic individuals have played a part. The fact that the president of the nation, Zelensky, stayed in Kyiv during the invasion is a huge inspiration to them. Because for most leaders, as you could imagine, when the capital of the nation is under attack, the wise thing, the smart thing, that the United States advised Zelensky to do is to flee and to be the leader of the nation from a distant place. He said, fuck that.
Starting point is 02:15:21 I'm staying put. You know, everyone around him, there was pressure to leave, and he didn't. And those singular acts really can unify a nation. There's a lot of people that criticized Zelensky within Ukraine before the war; he was very unpopular, even still. But they put that aside, especially because of that singular act of staying in the capital. Yeah, a lot of those kinds of things come together to create something within people. Of course, though, these things always... how zoomed-out of a view do you want to take? Because, yeah, you describe it as an anti-Moloch thing that happened within Ukraine, because it
Starting point is 02:16:13 brought the Ukrainian people together in order to fight a common enemy. Maybe that's a good thing, maybe that's a bad thing. In the end, we don't know how this is all going to play out, right? But if you zoom out to a global level, they're coming together to fight, and that could make a conflict larger. You know what I mean? I don't know what the right answer is here. Yeah. It seems like a good thing that they came together. But we don't know how this is all going to play out. If this all turns into nuclear war, we'll be like, okay, that was bad. Yeah, so I was describing the reverse
Starting point is 02:16:50 Moloch at the local level. Exactly. Now, this is where the experts come in, and they say, well, if you channel most of the resources of the nation, and of the nations supporting Ukraine, into the war effort, are you not beating the drums of a war that is much bigger than Ukraine? In fact, even the Ukrainian leaders are speaking of it this way. This is not a war between two nations. This is the early days of a world war, if we don't play this correctly. Yes. And we need cool heads from all leaders.
Starting point is 02:17:32 So from Ukraine's perspective, Ukraine needs to win the war. Because what does winning the war mean? It means coming to peace negotiations, an agreement that guarantees no more invasions. And then you make an agreement about what land belongs to whom, and you stop there. And basically, from their perspective, you want to demonstrate to the rest of the world who's watching carefully, including Russia and China
Starting point is 02:18:04 and different players on the geopolitical stage, that this kind of conflict is not going to be productive if you engage in it. So you want to teach everybody a lesson: let's not do World War III. It's going to be bad for everybody. It's a lose-lose. A big lose-lose. That's my view. And I think that's actually correct. When I zoom out... I mean, 99% of what I think about is just individual human beings and human lives, and just that war is horrible. But when you zoom out and think from a geopolitics perspective, we should realize that it's entirely possible that we will see a World War III in the 21st century. And this is like a dress rehearsal
Starting point is 02:18:54 for that. And so the way we play this as a human civilization will define whether we do or don't have a World War III. You know, how we discuss war, how we discuss nuclear war, the kind of leaders we elect and prop up, the kind of memes we circulate. Because you have to be very careful when you're being pro-Ukraine, for example. You have to realize that you are also indirectly feeding the ever-increasing military-industrial complex. So be extremely careful that when you say pro-Ukraine, or pro-anybody, you're pro-human beings, not pro the machine that creates narratives saying it's pro-human beings but, if you look at the raw use of funds and resources, is actually pro making weapons and shooting bullets and dropping bombs. Right. We have to just somehow get the meme into everyone's heads that the real enemy is war itself. That's the enemy we need to defeat. And that doesn't mean to say that there isn't justification for
Starting point is 02:20:27 small, local, adversarial conflicts. If you have a leader who is starting wars, they're on the side of Team War, basically. It's not that they're on the side of Team Country, whatever that country is; they're on the side of Team War, so that needs to be stopped and put down. But you also have to find ways that your corrective measure doesn't then end up being co-opted by the war machine and creating greater war. Again, the playing field is finite. The scale of conflict is now getting so big, and the weapons that can be used are so mass-destructive, that we can't afford another giant conflict. We just won't make it. What existential threat, in terms of us not making it, are you most worried about? What existential threat to human civilization? We're going down the dark path, huh? Let's go. That's good. No, it's dark. Well, we're in a somber place, we might as well.
Starting point is 02:21:29 Some of my best friends are dark paths. What worries you most? We mentioned asteroids, we mentioned AGI, nuclear weapons. The one that's on my mind the most, mostly because I think it's the one where we actually have a real chance to move the needle in a positive direction, or, more specifically, stop some really bad things from happening, really dumb, avoidable things, is bio risks. So what kind of bio risks? In terms of some of the fun options? So many. So, of course, we have natural risks
Starting point is 02:22:06 from natural pandemics, naturally occurring viruses or pathogens. And then also, as time and technology go on, and technology becomes more and more democratized into the hands of more and more people, the risk of synthetic pathogens. And whether or not you fall into the camp of COVID was a gain-of-function accidental lab leak, or whether it was purely naturally occurring, either way, we are facing a future where
Starting point is 02:22:38 synthetic pathogens, or human-meddled-with pathogens, either accidentally get out or get into the hands of bad actors, you know, whether they're omnicidal maniacs or otherwise. And so that means we need more robustness for that. And you would think that us having this nice little dry run, which is what COVID was, as awful as it was, and all those poor people that died, it was still child's play compared to what a future one could be in terms of fatality rate. You'd think that we would then be much more robust in our pandemic preparedness. And meanwhile, the budget in the last two years for the US... sorry, I can't remember the name of the actual budget, but it was like a multi-trillion-dollar budget that the US just set aside.
Starting point is 02:23:34 And originally in that, considering that COVID cost multiple trillions to the economy, right? The original allocation in this new budget for future pandemic preparedness was 60 billion. So, a tiny proportion of it. That proceeded to get whittled down to 30 billion, to 15 billion, all the way down to two billion, out of multiple trillions. For a thing that has just cost us multiple trillions. We've just finished, we've barely even,
Starting point is 02:24:00 we're not even really out of it. It basically got whittled down to nothing, because for some reason people think, whew, all right, we've got the pandemic out of the way. That was that one. And the reason for that is, and I say this with all due respect to a lot of the science community, there's an immense amount of naivety about this:
Starting point is 02:24:19 they think that nature is the main risk moving forward, and it really isn't. And I think nothing demonstrates this more than this project I was just reading about that's being proposed right now, called Deep VZN. And the idea is to go out into the wild, and we're not talking about within cities, but deep into caves that people don't go to, deep into the Arctic,
Starting point is 02:24:43 wherever, and scour the earth for whatever the most dangerous possible pathogens are that they can find. And then not only find these, but bring samples of them back to laboratories. And again, whether you think COVID was a lab leak or not, I'm not going to get into that, but as a civilization, we have historically had so many lab leaks, even from the highest-level
Starting point is 02:25:08 security labs. People should go and just read about it. It's like a comedy show of just how many there are, how leaky these labs are, even when they do their best efforts. So, bring these things back to civilization. That's step one of the badness. The next step would be to then categorize them, do experiments on them, and rank them by their level of potential
Starting point is 02:25:30 pandemic lethality. And then the pièce de résistance of this plan is to then publish that information freely on the internet, about all these pathogens, including their genomes, which are literally the building instructions for how to make them. And this is something that genuinely a pocket of the scientific community thinks is a good idea. And the argument is that, oh, this is good because it might buy us some time to develop vaccines. Which, okay, sure, maybe would have made sense prior
Starting point is 02:26:06 to mRNA technology. But with mRNA, we can now develop a vaccine within a couple of days of finding a new pathogen. Then there's all the trials and so on, but those trials would have to happen anyway in the case of a brand-new thing. So you're saving maybe a couple of days. That's the upside.
Starting point is 02:26:22 Meanwhile, the downside is that you're not only bringing the risk of these pathogens getting leaked, you're literally handing them out to every bad actor on Earth, who would be doing cartwheels. I'm talking about Kim Jong Un, ISIS, people who think the rest of the world is their enemy, and in some cases think that killing themselves is a noble cause. And you're literally giving them the building blocks of how to do this. It's the most batshit idea I've ever heard. Like, on expectation, it's probably minus EV
Starting point is 02:26:54 of multiple billions of lives if they actually succeeded in doing this. Certainly in the tens or hundreds of millions. So the cost-benefit is so unbelievably skewed, it makes no sense. And I was trying to wrap my head around why. Like, what's going wrong in people's minds to think that this is a good idea? It's not malice or anything like that. I think it's that the proponents are actually overly naive
Starting point is 02:27:22 about the interactions of humanity, and the fact that there are bad actors who will use this for bad things. Because not only that: if you publish this information, even if a bad actor couldn't physically make it themselves, which, give it, you know, 10 years' time, the technology is getting cheaper and easier to use... even if they couldn't make it, they could now bluff it. Like, what would you do if there's some deadly new virus that we published on the internet in terms of its building
Starting point is 02:27:51 blocks? Kim Jong Un could be like, hey, if you don't let me build my nuclear weapons, I'm going to release this, I've managed to build it. Well, now he's actually got a credible bluff. We don't know. And so that's just handing the keys, handing weapons of mass destruction, to people.
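The "minus EV" framing used here is just expected value: weight each outcome by its probability and sum. A toy sketch of that cost-benefit arithmetic, where every probability and payoff is invented purely for illustration, not an actual risk estimate:

```python
# Toy expected-value (EV) calculation in the spirit of the cost-benefit
# argument above. All probabilities and payoffs are made-up numbers.

def expected_value(outcomes):
    """Sum of probability * payoff over mutually exclusive outcomes."""
    return sum(p * v for p, v in outcomes)

# (probability, lives saved; negative numbers mean lives lost)
publish_pathogen_genomes = [
    (0.98, 1_000),         # modest head start on a vaccine, most of the time
    (0.02, -100_000_000),  # rare misuse or leak causes a catastrophe
]

# The small, likely upside is swamped by the rare, catastrophic downside.
print(expected_value(publish_pathogen_genomes))
```

This is why a plan can be "minus EV" even when the bad outcome is unlikely: a 2% chance of a catastrophic loss dominates a 98% chance of a modest gain.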
Starting point is 02:28:06 The- Makes no sense. The possible, I agree with you, but the possible world in which you might make sense is if the good guys, which is a whole another problem, defining who the good guys are, but the good guys are like an order of magnitude higher competence And so they can stay ahead of the bad actors by just being very good at the defense By very good not meaning like a little bit better, but an order of magnitude better
Starting point is 02:28:40 But of course, the question is, in each of those individual disciplines, is that feasible? Can the bad actors, even if they don't have the competence, leapfrog to the place where the good guys are? Yeah, I mean, I would agree in principle pertaining to this particular plan, the Deep VZN thing I described, where at least then that would maybe make sense for steps one and two, of getting the information, but then
Starting point is 02:29:09 why would you release the information to your literal enemies? That doesn't fit at all with the perspective of trying to stay ahead of them. You're literally handing them the weapon. But there's different levels of release, right? There's the kind of secrecy where you don't give it to anybody. But there's a release where you incrementally give it to, like, major labs. So it's not public release, but you're giving it to different layers of responsibility. But the problem there is, if you go anywhere beyond complete secrecy,
Starting point is 02:29:43 it's going to leak. That's the thing. It's very hard to keep secrets. So the information is going to leak anyway, so you might as well release it to the public. Is that the argument? So you either go complete secrecy or you release it to the public, which is essentially the same thing, since it's going to leak anyway if you don't do complete secrecy. Right. Which is why you shouldn't get the information in the first place.
Starting point is 02:30:05 Yeah, I mean, well, that's a solution. The solution is either don't get the information in the first place, or keep it incredibly, incredibly contained. See, I think it really matters which discipline we're talking about. So in the case of biology, I do think you're very right. It should be forbidden to even think about that, meaning don't even collect the information. I mean, gain-of-function research is a really iffy area. I mean, it's all about cost-benefits, right? There are some scenarios where I can imagine the cost-benefit of gain-of-function research
Starting point is 02:30:46 is very, very clear, where you've evaluated all the potential risks, factored in the probability that things can go wrong, and not only the known unknowns but the unknown unknowns as well, tried to quantify all that, and even then it's orders of magnitude better to do it. I'm behind that argument. But the point is that there's this naivety that's preventing people from even doing the cost-benefit properly on a lot of these things. Because, and I get it, I don't want to bucket the whole science community, but some people within the science community
Starting point is 02:31:17 just think that everyone's good and everyone just cares about getting knowledge and doing the best for the world. And unfortunately, that's not the case. I wish we lived in that world, but we don't. Yeah, I mean, listen, I've been criticizing the science community broadly quite a bit. There are so many brilliant people whose brilliance is somehow a hindrance sometimes, because it comes with a bunch of blind spots. And then you start to look at the history of science, how easily it's been used by dictators to reach any conclusion they want. And it's dark how you can use brilliant people who are playing the little game of science, because it is a fun game. You're building, you're going to conferences,
Starting point is 02:31:55 you're building up each other's ideas, there's breakthroughs. Hey, I think I've realized how this particular molecule works, and I could do this kind of experiment, and everyone else is impressed. Ooh, cool. No, I think you're wrong. Let me show you why you're wrong. And in that little game, everyone gets really excited.
Starting point is 02:32:06 Oh, I came up with a pill that solves this problem, and it's going to help a bunch of people. And I came up with a giant study that shows the exact probability it's going to help or not. And you get lost in this game, and you forget to realize that this game, just like Moloch, can have unintended consequences that might destroy human civilization, or divide
Starting point is 02:32:36 human civilization, or have dire geopolitical consequences. I mean, the most destructive effects of COVID have nothing to do with the biology of the virus, it seems like. I mean, I could just list them. One of them is the complete distrust of public institutions. The other one is that, because of that public distrust, I feel like we as a world have now cried wolf. And if a much worse pandemic came along, if an actual wolf now comes,
Starting point is 02:33:10 people will be like, fuck masks. Fuck vaccines, fuck it, yeah, fuck everything. And they'll distrust every single thing that any major institution is gonna tell them. Because, that's the thing, there were certain actions made by certain public health figures where they very knowingly told a white lie. It was intended
Starting point is 02:33:36 in the best possible way, such as early on, when there was clearly a shortage of masks, and so they said to the public, oh, don't get masks, there's no evidence that they work. Don't get them, they don't work. In fact, it might even make it worse; you might even spread it more. That was the real stinker.
Starting point is 02:33:58 Yeah, no — unless you know how to wear it properly, you're more likely to catch the virus — which is just absolute crap. And they put that out there. And it's pretty clear the reason why they did that was because there was actually a shortage of masks and they really needed them for health workers, which makes sense. Like, I agree. But the cost of lying to the public, when that
Starting point is 02:34:23 then comes out — people aren't as stupid as they think they are. And that's, I think, where this distrust of experts has largely come from. A, they've lied to people overtly, but B, people have been treated like idiots. Now, that's not to say that there aren't a lot of people who have a lot of wacky ideas around COVID and all sorts of things. But if you treat the general public like children, they're going to notice that, and that is going to just
Starting point is 02:34:50 absolutely decimate the trust in the public institutions that we depend upon. And honestly, the best thing that could happen — I wish, like, if Fauci, you know, and these other leaders — I mean, God, I can't imagine what a nightmare his job has been for the last few years, hell on earth. So, you know, I have a lot of sympathy for the position he's been in. But if he could just come out and be like, okay, look, guys, hands up, we didn't handle this as well as we could have. These are all the things I would have done differently in hindsight. I apologize for this and this and this.
Starting point is 02:35:23 That would go so far. And maybe I'm being naive, maybe this would backfire, but I don't think it would — to someone like me, even, because I've lost trust in a lot of these things. I'm fortunate that I at least know people who I can go to who I think have good epistemics on this stuff. But, you know, if they could sort of put their hands up and go, okay, these are the spots where we screwed up: this, this, this. These were our reasons. Yeah, we actually told a little white lie here. We did it for this reason.
Starting point is 02:35:48 We're really sorry. Where they just did the radical honesty thing, the radical transparency thing. That would go so far to rebuilding public trust. And I think that's what needs to happen. I know, I totally agree with you. Unfortunately, his job was very tough and all those kinds of things.
Starting point is 02:36:04 But I see arrogance, and arrogance prevented him from being honest in that way previously, and I think arrogance will prevent him from being honest in that way now. I think young people are seeing that. That kind of talking down to people from a position of power, I hope, is a way of the past. People really like authenticity, and they like leaders that are a man or a woman of the people. And I think he still has a chance to do that, I mean. Yeah, sure. I doubt he's listening, but if he is — I don't think he's irredeemable.
Starting point is 02:36:50 Well, anyway, I don't have an opinion on whether there was arrogance there or not. I just know that, like, coming clean — you know, it's understandable to have fucked up during this pandemic. I wouldn't expect any government to handle it well, because it was so difficult: so many moving pieces, so much lack of information, and so on. But the step to rebuilding trust is to go, okay, look, we're doing a scrutiny of where
Starting point is 02:37:15 we went wrong, and for my part, I did this wrong in this part. And that would be huge. All of us can do that. I mean, I was struggling for a while on whether I want to talk to him or not. I talked to his boss, Francis Collins. Another person that screwed up in terms of trust — lost a little bit of my respect there too. There seems to have been a kind of dishonesty in the back rooms, in that they didn't trust people to be intelligent. Like, we need to tell them what's good for them, we know what's good for them.
Starting point is 02:37:52 That kind of idea. To be fair, the thing that's — what's it called, I heard the phrase today — nut-picking. Social media does that. So you've got, like, nit-picking; nut-picking is where you take the craziest, stupidest — if you have a group of people,
Starting point is 02:38:10 let's say people who are vaccine — I don't like the term anti-vaccine — people who are vaccine hesitant, vaccine speculative. What social media did, or the media or anyone, their opponents, would do is pick the craziest example. So the ones who are like, I think I need to inject myself with, like, motor oil up my ass, or something.
Starting point is 02:38:31 Select the craziest ones and then have that beamed to — so from someone like Fauci or Francis's perspective, that's what they get, because they're getting the same social media stuff as well. They're getting the same media reports. I mean, they might get some more information, but they too are going to get these nut-picked examples beamed at them. So they probably have a misrepresentation of what the actual public's intelligence is.
Starting point is 02:38:53 Like the real... Well, yes. And that just means they're not social media savvy. So one of the skills of being on social media is to be able to filter that in your mind — to understand, to put it into proper context. Realize that what you are seeing on social media is not anywhere near an accurate representation of humanity. Nut-picking and all the rest.
Starting point is 02:39:12 And there's nothing wrong with putting motor oil up your ass. It's just one of the better aspects of it. I do this every weekend. Okay. Where did that bad analogy come from in my mind? Like, why? I don't know. I think we need to,
Starting point is 02:39:27 there's some Freudian thing we need to deeply investigate with a therapist. Okay. What about AI? Are you worried about AGI, superintelligent systems, or a paperclip maximizer type of situation?
Starting point is 02:39:42 Yes. I'm definitely worried about it. But I feel kind of bipolar in that some days I wake up and I'm like... You're excited about the future? Well, exactly. I'm like, wow, we can unlock the mysteries of the universe, you know, escape the game. And this, you know,
Starting point is 02:39:58 because I spend all my time thinking about these Moloch problems — you know, what is the solution to them? In some ways you need this omnibenevolent, omniscient, omniwise coordination mechanism that can make us all not do the Moloch thing, or provide the infrastructure, or redesign the system so that it's not vulnerable to this Moloch process. And in some ways, you know, that's the strongest argument to me for the race to build AGI: that maybe we can't survive without it. But the flip side to that is, unfortunately, now
Starting point is 02:40:38 that there are multiple actors trying to build AGI — you know, this was fine 10 years ago when it was just DeepMind, but then other companies started up, and now it's created a race dynamic. Now it's the same thing, it's got the same problem: whichever company is the one that optimizes for speed at the cost of safety will get the competitive advantage, and so will be the more likely ones to build the AGI, you know. And it's that same cycle that you're in. And there's no clear solution to that, because you can't just go slapping — if you go and try and stop all the different companies, then the good ones will stop, because they're the ones within the West's reach, but that leaves all the other ones to continue, and then they're even more likely. So it's a very difficult problem with no clean solution.
Starting point is 02:41:30 And at the same time, I know at least some of the folks at DeepMind, and they're incredible. They're thinking about this, they're very aware of this problem, and they're, I think, some of the smartest people on earth. Yeah, the culture is important there, because they are thinking about that, and they're some of the best machine learning engineers. So it's possible to have a company or a community of people that are both great engineers and are thinking about the philosophical topics. Exactly.
Starting point is 02:41:59 And importantly, they're also game theorists, you know. And because this is ultimately a game theory problem, this Moloch mechanism — like, this arms race, how do we avoid arms race scenarios? You need people who aren't naive to be thinking about this. And again, luckily, there's a lot of smart, non-naive game theorists within that group. Yes, I'm concerned about it, and I think it's, again, a thing that we need people to be thinking about: in terms of how do we mitigate the arms race dynamics, and how do we solve what Bostrom calls the orthogonality problem. Whereby, obviously,
Starting point is 02:42:39 there's a chance — you know, the belief, the hope, is that you build something that's superintelligent, and by definition of being superintelligent, it will also become super wise and have the wisdom to know what the right goals are. And hopefully those goals include keeping humanity alive, right? But Bostrom says that actually those two things — superintelligence and super wisdom — aren't necessarily correlated. They're actually kind of orthogonal things. And how do we make it so that they are correlated? How do we guarantee it?
Starting point is 02:43:10 Because we need it to be guaranteed, really, to know that we're doing the thing safely. But I think that merging of intelligence and wisdom — at least my hope is that this whole process happens sufficiently slowly that we are constantly having these kinds of debates, that we have enough time to figure out how to modify each version of the system as it becomes more and more intelligent. Yes, buying time is a good thing, definitely. Anything that slows everything down. We just — everyone needs to chill out. We've got millennia to figure this out. Or at least — well, it depends. Again, some people think that we can't even make it through the next few decades without having some kind of omniwise coordination mechanism. And there's also an argument for that. Yeah, I don't know.
Starting point is 02:44:02 Well, I'm suspicious of that kind of thinking, because it seems like the entirety of human history has people in it that are predicting doom just around the corner. There's something about us that is strangely attracted to that thought. It's almost like fun to think about the destruction of everything. And just objectively speaking, I've talked and listened to a bunch of people, and they are gravitating towards that. I think it's the same thing that people love about conspiracy theories: they love to be the person that kind of figured out some deep fundamental thing that's going to mark something extremely important about the history
Starting point is 02:44:51 of human civilization — because then I will be important. When in reality, most of us will be forgotten and life will go on. One of the sad things about whenever anything traumatic happens to you, whenever you lose loved ones or tragedy happens, you realize life goes on. Even after a nuclear war that will wipe out some large percentage of the population and will torture people for years to come, because of the effects of a nuclear winter,
Starting point is 02:45:30 people will still survive. Life will still go on. I mean, it depends on the kind of nuclear war, but in the case of a nuclear war, it will still go on. That's one of the amazing things about life: it finds a way. And so in that sense, I just feel like the doom and gloom thing is a...
Starting point is 02:45:48 Well, we don't want a self-fulfilling prophecy. Yes, exactly. And I very much agree with that. And, you know, I even have a slight uneasy feeling about the amount of time we've spent in this conversation talking about this, because, like, is this even a net positive? In some ways, making people imagine these bad scenarios can be
Starting point is 02:46:11 a self-fulfilling prophecy. But at the same time, that's weighed off against at least making people aware of the problem and getting them thinking. And I think particularly, you know, the reason why I wanna talk about this to your audience is that,
Starting point is 02:46:24 on average, they're the type of people who gravitate towards these kinds of topics, because they're intellectually curious and they can sort of sense that there's trouble brewing. They can smell it — you know, I think the reason people are thinking about this stuff a lot is because the probability has increased, certainly over the last few years. The trajectory has not gone favorably, let's put it that way, since 2010. So it's right, I think, for people to be thinking about it. But whether it's a useful fiction
Starting point is 02:46:56 or whether it's actually true or whatever you want to call it — I think having this faith, this is where faith is valuable, because it gives you at least this anchor of hope. And I'm not just saying it to trick myself. I do truly think there's something out there that wants us to win. I think there's something that really wants us to win. And you just have to be like — okay, now I sound really crazy, but — open your heart to it a little bit. And it will give you the sort of breathing room with which to marinate on the solutions. We are the ones who have to come up with the solutions. But we can use that. There's, like, this hashtag positivity. There's value in that.
Starting point is 02:47:44 Yeah, you have to kind of imagine all the destructive trajectories that lay in our future, and then believe in the possibility of avoiding those trajectories. All while sitting back — you said audience; my joke is that the two people that listen to this are probably sitting on a beach, smoking some weed, watching a beautiful sunset. Or they're looking at just the waves going in and out. And ultimately, there's a kind of deep belief there
Starting point is 02:48:15 in the momentum of humanity to figure it all out. I like it, but we've got a lot of work to do. Which makes this whole simulation, this video game, kind of fun. This Battle of Polytopia — man, I love those games so much. That's so good. And that one, for people who don't know — the Battle of Polytopia is, like, a really radical simplification of a Civilization-type game. It still has a lot of the skill tree development,
Starting point is 02:48:49 a lot of the strategy, but it's easy enough to play on a phone. Yeah. It's kind of interesting. They've really figured it out. It's one of the most elegantly designed games I've ever seen. It's incredibly complex, and yet, again, it walks that line between complexity and simplicity in this really, really great way. And they use pretty colors that
Starting point is 02:49:10 hack the dopamine reward circuits in our brains very well. Yeah, it's fun. Video games are so fun. Yeah. Most of this life is just about fun — escaping all the suffering to find the fun. What's energy healing? I have in my notes: energy healing, question mark. What's that about? Oh man. Your audience is gonna think I'm mad. So, the two crazy things that happened to me — the one was the voice in the head that said you're going to win this tournament, and then I won the tournament. The other craziest thing that happened to me was in 2018. I started getting this weird problem in my ear, where it was kind of like low-frequency sound distortion,
Starting point is 02:50:00 where voices, particularly men's voices, became incredibly unpleasant to listen to. It was like they were falsely amplified or something, and it was almost like a physical sensation in my ear, which was really unpleasant. And it would last for a few hours, and then go away, and then come back for a few hours,
Starting point is 02:50:16 and go away. And I went and got hearing tests, and they found that, at the bottom end, I was losing the hearing in that ear. And so in the end, the doctors said they think it was this thing called Ménière's disease, which is this very unpleasant disease where people basically end up losing their hearing, and it often comes with, like, dizzy spells and other things, because the inner ear gets all messed up. Now, I don't know if that's actually what I had, but that's at least what one doctor said to me. But anyway, I'd had three months of this stuff going on, and it was really getting
Starting point is 02:50:53 me down. And I was at Burning Man, of all places. I don't mean to be that person talking about Burning Man, but I was there. And again, I had it, and I was unable to listen to music, which is not what you want, because Burning Man is a very loud, intense place. And I was just having a really rough time. And on the final night, I get talking to this girl who's a friend of a friend.
Starting point is 02:51:14 And I mentioned — I was like, oh, I'm really down in the dumps about this. And she's like, oh, well, I've done a little bit of energy healing. Would you like me to have a look? Now, this is, again — I had, you know, no time in my life for this. I didn't believe in any of this stuff. I was just like, it's all bullshit, it's all
Starting point is 02:51:31 woo nonsense. But I was like, sure, have a go. And she starts with her hand, and she says, oh, there's something there. And then she leans in and she starts, like, sucking over my ear — not actually touching me, but close to it, with her mouth. And it was really unpleasant. I was like, well, can you stop? She's like, no, no, no, there's something there, I need to get it. I was like, no, no, no, I really don't like it. Please, this is really loud.
Starting point is 02:51:53 She's like, I need to — just bear with me. And she does it, I don't know how long for, a few minutes. And then she eventually collapses on the ground, freezing cold, crying. And I'm just like, I don't know what the hell is going on. I'm thoroughly freaked out, as is everyone else watching. Just like, what the hell? We warm her up, and she was really shaken up.
Starting point is 02:52:16 And she's like, I don't know what that was, she said. It was something very unpleasant and dark. Don't worry, it's gone. You'll have the physical symptoms for a couple of weeks, and then you'll be fine. So I was so rattled — A, because of the potential that actually I'd had something bad in me that made someone feel bad, and B, that she was scared. I was like, wait, I thought you do this, this is your thing — and now you're terrified? Like you pulled off some kind of exorcism or something. What the fuck is going on?
Starting point is 02:52:48 Yeah. So it was just the most insane experience, and frankly, it took me a few months to sort of emotionally recover from it. But my ear problem went away about a couple of weeks later, and touch wood, I've not had any issues since. So that gives you hints that maybe there's something out there. I mean, again, I don't have an explanation for this. The most probable explanation was, you know,
Starting point is 02:53:22 I was at Burning Man, I was in a very open state. Let's just leave it at that. Placebo is an incredibly powerful thing, and a very not-understood thing. Almost assigning the word placebo to it reduces it down in a way that it doesn't deserve to be reduced down. Maybe there's a whole science of what we call placebo. Maybe placebo's a door. Self-healing. Yeah.
Starting point is 02:53:48 You know, and I mean, I don't know what the problem was. I was told it was Ménière's. I don't want to say I definitely had that, because I don't want people who do have that — it's a terrible disease — to think that this is going to be a guaranteed way to fix it for them. I don't know.
Starting point is 02:54:02 And you're absolutely right to say that even using the word placebo comes with this baggage of framing, and I don't want to reduce it down. All I can do is describe the experience and what happened. I cannot put an ontological framework around it. I can't say why it happened, what the mechanism was, what
Starting point is 02:54:25 the problem even was in the first place. I just know that something crazy happened, and it was while I was in an open state, and fortunately for me, it made the problem go away. But what I took away from it — again, it was part of this journey of becoming more humble about what I think I know. Because, as I said before, I was in the Richard Dawkins train of atheism, in terms of: there is no God, and everything like that is bullshit, we know how medicine works, it's molecules and chemical interactions and that kind of stuff. And now it's like,
Starting point is 02:55:05 okay, well, there's clearly more for us to understand. And that doesn't mean that it's unscientific, either. Because the beauty of the scientific method is that it still applies to this situation. Like, I would like to try and test this experimentally. I don't know how we would go about doing that. We'd have to find other people with the same condition,
Starting point is 02:55:23 I guess, and try and repeat the experiment. But just because something happens that's sort of out of the realms of our current understanding, it doesn't mean the scientific method can't be used for it. Yeah, I think the scientific method sits on a foundation of those kinds of experiences, because the scientific method is a process to carve away at the mystery all around us. And experiences like this are just a reminder that we're mostly shrouded in mystery still. That's it. It's just
Starting point is 02:56:02 like a humility. We haven't really figured this whole thing out. But at the same time, we have found ways to act. We are clearly doing something right, because think of the technological and scientific advancements, the knowledge that we have, that would blow people's minds even from a hundred years ago. Yeah, and we've even allegedly got out to space and landed on the moon. Although I still haven't seen evidence of the earth being round, but I'm keeping an open mind. Speaking of which, you studied physics and astrophysics. Just to go to that,
Starting point is 02:56:41 just to jump around through the fascinating life you've had — how did that come to be? When did you fall in love with astronomy and space and things like this? As early as I can remember. I was very lucky that my mum and my dad, but particularly my mum — my mum is the most... nature. She is Mother Earth. It's the only way to describe her. She's like Doctor Dolittle: animals flock to her and just sit and look at her adoringly while she sings. She just is Mother Earth, and she has always been fascinated by it. You know, she never went to university or anything like that. She's actually phobic of maths.
Starting point is 02:57:23 I was trying to teach her poker and she hated it. But she's so deeply curious, and that just got instilled in me when we would sleep out under the stars — whenever it was the two nights of the year when it was warm enough in the UK to do that. And we'd just lie out there until we fell asleep, looking for satellites, looking for shooting stars. And I don't know whether it was from that, but I've always naturally
Starting point is 02:57:51 gravitated to the biggest questions, and also the most layers of abstraction. I love just, like, what's the meta question, what's the meta question of that, and so on. So I think it just came from that, really. And then on top of that, physics also made logical sense, in that it was a degree, a subject, that ticked the box of answering these really big-picture questions, but was also extremely useful. It has a very high utility. I didn't know necessarily — I thought I was going to become a research scientist. My original plan was to be a professional astronomer. So it's not just like a philosophy degree that asks the
Starting point is 02:58:33 big questions, and it's not like biology on the path to go to medical school or something like that, which is overly pragmatic — not overly, is very pragmatic — more on the mathematical side. But physics is a good combination of the two. Yeah, at least for me, it made sense. And I was good at it. I liked it. Yeah, I mean, it wasn't like I did an immense amount of soul-searching to choose it or anything.
Starting point is 02:58:59 It just made the most sense. I mean, you have to make this decision in the UK at age 17, which is crazy, because, you know, in the US you go the first year, you do a bunch of stuff, right, and then you choose your major. I think the first few years of college you focus on the drugs, and only as you get closer to the end do you start to think, oh shit, this wasn't about that, and I owe the government a lot of money. How many alien civilizations are out there?
Starting point is 02:59:30 When you looked up at the stars with your mom and you were counting them — what's your mom think about the number of alien civilizations? I actually don't know. I would imagine she would take the viewpoint of — she's pretty humble, and she knows there's a huge number of potential spawn sites out there. So she would... Spawn sites? Spawn sites, yeah. You know, this is our spawn site.
Starting point is 02:59:54 Yeah, spawn sites in Polytopia. We spawned on earth, you know, it's... Hmm, yeah, spawn sites. Why does it feel weird to say spawn? Because it makes me feel like there's only one source of life and it's spawning in different locations — that's why the word spawn. Because it feels like life that originated on earth really originated here. Right, it is unique to this particular... Yeah, I mean, in my mind, it doesn't exclude that completely different forms of life in different biochemical soups can't also spawn,
Starting point is 03:00:34 but I guess it implies that there's some spark that is beautiful, which I kind of like the idea of. And then I get to think about respawning — like, after it dies. What happens if life on earth ends? Is it going to restart again? Probably not. It depends — maybe. It depends on, you know, what's the thing that kills it off, right? If it's a paperclip maximizer — not that exact example, but some kind of very
Starting point is 03:01:02 self-replicating, high-on-the-capabilities, very-low-on-the-wisdom type thing — whether that's gray goo, green goo, nanobots, or just a shitty misaligned AI that thinks it needs to turn everything into paperclips — if it's something like that, then it's gonna be very hard for life, complex life, because by definition, a paperclip maximizer is the ultimate instantiation of Moloch: deeply
Starting point is 03:01:30 low complexity, over-optimization on a single thing, sacrificing everything else, turning the whole world into... Although something tells me — if we actually take a paperclip maximizer, it destroys everything, it's a really dumb system that just envelops the whole of Earth. And beyond, yeah. I didn't know that part, but okay, great. That's the thought experiment. So it becomes a multi-planetary paperclip maximizer? Well, it just propagates. I mean, it depends whether it figures out how to jump the vacuum gap. But again, this is also silly, because it's a hypothetical thought experiment, which
Starting point is 03:02:05 I think doesn't actually have much practical application to the AI safety problem, but it's just a fun thing to play around with. But if by definition it is maximally intelligent — which means it is maximally good at navigating the environment around it in order to achieve its goal, but extremely bad at choosing goals in the first place; so again, we're talking about this orthogonality thing, right? It's very low on wisdom, but very high on capability — then it will figure out how to jump the vacuum gap between planets and stars and so on.
Starting point is 03:02:32 And thus just turn every atom it gets its hands on into paperclips. Yeah. Which is maximum virality, by the way. That's what virality is.
Starting point is 03:02:45 But it does not mean that virality is necessarily all about maximizing paperclips; in that case, it is. So for people who don't know, this is just a thought experiment — an example of an AI system that has a goal and is willing to do anything to accomplish that goal, including destroying all life on earth and all human life and all of consciousness in the universe, for the goal of producing a maximum number of paperclips. Or whatever optimization function it was set up with. But don't you think — it could be making, recreating Lexes. Maybe it'll tile the universe in Lex. Go on.
Starting point is 03:03:13 I'd like to say: please don't. That's better, though — that's more interesting than paperclips. That could be infinitely optimal, if I do say so myself. It's still a bad thing, because it's permanently capping what the universe could ever be.
Starting point is 03:03:29 It's like, that's its end. Or achieving the optimal that the universe could ever achieve. But that's up to — different people have different perspectives. But don't you think within the paperclip world there would emerge, just like in the zeros and ones that make up a computer, there would emerge beautiful complexities?
Starting point is 03:03:49 It won't suppress... you know, as you scale to multiple planets and throughout, there will emerge these little worlds, where on top of the fabric of maximizing paperclips, there would emerge, like, little societies of paperclips. Well, then you're not describing a paperclip maximizer anymore, because, like, if you think of what a paperclip is, it is literally just a piece of bent iron, right? So if it's maximizing that throughout the universe, it's taking every atom it gets its hands on and somehow turning it into iron or steel, and then bending it into that shape, and then done, and done.
Starting point is 03:04:31 By definition, like, paperclips, there is no way for... well, okay, so you're saying that paperclips somehow will just emerge and create, through gravity or something. No, no, no, because there's a dynamic element to the whole system. It's not just creating those paperclips; in the act of creating, there's going to be a process, and that process will have a dance to it, because it's not like a sequential thing.
Starting point is 03:04:57 There's a whole complex three-dimensional system of paperclips. You know, like, people like string theory, right? It's supposed to be strings that are interacting in fascinating ways. I'm sure paperclips are very string-like, and they can be interacting in very interesting ways as you scale exponentially through three-dimensional space. I mean, I'm sure the paperclip maximizer has to come up with a theory of everything. It has to create, like, wormholes, right? It has to break... like, it has to understand quantum
Starting point is 03:05:26 mechanics. I love your optimism. This is where I'd say we're going into the realm of pathological optimism, where... I'm sure there will be... I think there's an intelligence that emerges from that system. So you're saying that basically intelligence is inherent in the fabric of reality, and we'll find a way. Kind of like Goldblum says, life will find a way. You think life will find a way, even out of this perfectly homogenous, dead soup.
Starting point is 03:05:54 It's not perfectly homogenous. It's perfectly maximal in the production of paperclips. I don't know why people keep thinking it's homogenous. It maximizes the number of paperclips. That's the only thing. It's not trying to be homogenous. It's trying to maximize paperclips. So you're saying that because, you know, kind of like in the Big Bang, or,
Starting point is 03:06:15 you know, it seems like there were clusters. There was more stuff here than there. That was enough of the non-homogeneity to kickstart the evolutionary process. It's a little weirdness that will make a beautiful... Even out of... Yeah, complexity emerges. Interesting. Okay.
Starting point is 03:06:30 So how does that line up then with the whole heat death of the universe, right? Because that's another sort of instantiation of this. It's like, everything becomes so far apart and so cold and so perfectly mixed that it's like homogenous grayness. Do you think that even out of that homogenous grayness, where there's no negative entropy, where, you know, there's no free energy that we understand, even from that... Yeah, the paperclip maximizer or any other intelligent systems will figure out ways to travel to other universes, to create Big Bangs within those universes, or through black holes, to create whole
Starting point is 03:07:10 other worlds, to break what we consider the limitations of physics. The paperclip maximizer will find a way, if a way exists, and we should be humble to realize that we don't know... Because it just wants to make more paperclips. So it's going to go into those universes and turn them into paperclips. Yeah, but we humans... not humans, but complex systems, exist on top of that. We're not interfering with it. This complexity emerges from the simple base state. The simple base state.
Starting point is 03:07:42 Whether it's, yeah, whether it's, you know, Planck lengths or paperclips as the base unit. Yeah, you can think of the universe as a paperclip maximizer, because it's doing some dumb stuff. Like, physics seems to be pretty dumb. It has... like, I don't know if you can summarize it. Yeah, the laws are fairly basic, and yet out of them amazing complexity emerges.
Starting point is 03:08:04 And its goals seem to be pretty basic and dumb. If you can summarize its goals... I mean, I don't know what's a nice way... maybe the laws of thermodynamics could be good. I don't know if you can assign goals to physics, but if you formulate it in the sense of goals, it's very similar to paperclip maximizing in the dumbness of the goals. But the pockets of complexity, as they emerge, are where beauty emerges, that's where life emerges,
Starting point is 03:08:34 that's where intelligence, that's where humans emerge. And I think we're being very down on this whole paperclip maximizer thing. No, the reason we fear it... I think, yeah, because what you're saying is that you think that the force of emergence itself is another, like, unwritten... what, not unwritten, but like another baked-
Starting point is 03:08:53 in law of reality. And you're trusting that emergence will find a way to, even out of seemingly the most Moloch-y, awful, plain outcome, emerge stuff anyway. I love that as a philosophy. I think it's really nice. I would wield it carefully,
Starting point is 03:09:12 because there's large error bars on that and the certainty of that. Yeah. Let's build the paperclip maximizer and find out. Plastic. Yeah, Moloch is doing cartwheels, man. Yeah.
Starting point is 03:09:24 But the thing is, it will destroy humans in the process, which is the reason we really don't like it. We seem to be really holding on to this whole human civilization thing. Would that make you sad, if AI systems that are beautiful, that are conscious, that are interesting and complex and intelligent, ultimately lead to the death of humans? Would that make you sad? If humans led to the death of humans, sorry. Like, if they would supersede humans. Oh, if some AI... Yeah, AI would end humans. I mean, that's the reason why I'm, in some ways, less emotionally concerned about AI risk than, say, bio-risk, because at least with AI there's a chance, if we're
Starting point is 03:10:08 in this hypothetical where it wipes out humans, but it does it for some higher purpose, it needs our atoms and energy to do something... at least now the universe is going on to do something interesting. Whereas if bio just kills everything on Earth, that's it, and there's no more... you know, Earth cannot spawn anything more meaningful in the few hundred million years it has left. Because it doesn't have much time left. Then... Yeah, I don't know. So one of my favorite books I've ever read is Novacene by James Lovelock, who sadly just died.
Starting point is 03:10:44 He wrote it when he was, like, 99. He died aged 102, so it was a fairly new book. And he sort of talks about... he thinks, you know, sort of building off this Gaia theory, like Earth is a living thing, some form of intelligence itself, and that this is the next step, right? Whatever this new intelligence is, that is maybe silicon-based as opposed to carbon-based, goes on to do. And it's really sort of, in some ways, an optimistic
Starting point is 03:11:13 but rudely fatalistic book. I don't know if I fully subscribe to it, but it's a beautiful piece to read anyway. So am I sad by that idea? I think so, yes. And actually, yeah, this is the reason why I'm sad by the idea, because if something is truly brilliant and wise and smart and truly super intelligent, it should be able to figure out abundance. So if it figures out abundance, it shouldn't need to
Starting point is 03:11:37 kill us off. It should be able to find a way for us. It should be, the universe is huge. There should be plenty of space for it to go out and do all the things it wants to do and like give us a little pocket where we can continue doing our things and we can continue to do things and so on. And again, if it's so supremely wise, it shouldn't even be worried about the game theoretic considerations that by leaving us alive
Starting point is 03:11:57 will then go and create another, like, superintelligent agent that it then has to compete against, because it should be omniwise and smart enough to not have to concern itself with that. Unless, unless it deems humans to be kind of assholes. Like, the humans are a source of, like, lose-lose kinds of dynamics. Well, yes and no.
Starting point is 03:12:19 We're not, molluck is. That's why I think it's important to say that we're not. But maybe humans are the source of molluck. No, I think, I mean, I think game theory is the source of molluck is, that's why I think it's important to say that. But maybe humans are the source of molluck. No, I think, I mean, I think game theory is the source of molluck. And, you know, because molluck exists in non-human systems as well. It happens within, like, agents within a game in terms of, like, you know, it applies to agents, but it, like, it can apply to, you know, a species that's on an island of animals, rats outcompeting the ones that massively consume
Starting point is 03:12:49 all the resources of the ones that are going to win out over the more, like, chill, socialized ones. And so, you know, creates this mouth-usian trap. Like, molecules exist in little pockets in nature as well. Well, I wonder if it's actually a result of consequences of the invention of predator and prey dynamics. Maybe it needs AI will have to kill off every organism that you're talking about killing of competition. Not competition, but just like the way it's like the weeds or whatever in a beautiful flower gardener, the parasites, yeah, on the whole system. Of course, it won't do that completely. You'll put them in a zoo like we do with parasites.
Starting point is 03:13:35 It'll ring fence. Yeah, and there'll be somebody doing a PhD on like, they'll prod humans with a stick and see what they do. But I mean in terms of letting us run wild outside of the geographically-constructed region, that might be that it might have decided to know. I think there's obviously the capacity for beauty and kindness and non-molic behavior of a mishuman. So I'm pretty sure AI will preserve us. Let me, I don't know if you answered the aliens question. You had a good conversation with Toby Woy.
Starting point is 03:14:14 Yeah. About various sides of the universe. I think, did he say, now I'm forgetting, but I think he said it's a good chance we're alone. So the classic, you know classic Fermi paradox question is there are so many spawn points and yet it didn't take us that long to go from harnessing fire to sending out radio signals into space. So surely given the vastness of space we should be, and even if only a tiny fraction of those create life and other civilizations too, the universe should be very noisy, there should be evidence
Starting point is 03:14:50 of disenfee or whatever, like at least radio signals and so on, but seemingly things are very silent out there. Now of course it depends on who you speak to, some people say that they're getting signals all the time and so on and I don't want to make an epistemic statement on that, but signals all the time and so on and like I don't want to make an epistemic statement on that. But it seems like there's a lot of silence and so that raises this paradox. And then they, you know, the Drake equation. So the Drake equation is like basically just a simple thing of like trying to estimate the number of possible civilizations within the galaxy by multiplying the number of stars created per year by the number of stars that have planets, planets, a habitable blah, blah, blah.
Starting point is 03:15:29 So all these like different factors. And then you plug in numbers into that and you, you know, depending on like the range of, you know, you're low bound and you're up a bound point, point estimates that you put in, you get out a number at the end for the number of civilizations. But what Toby and his crew did differently was Toby, there's a number at the end for the number of civilizations. But what Toby and his crew did differently was Toby, there's a researcher at the Humanity Institute. They, instead of, they realize that it's like basically a statistical quirk that if you put in point sources, even if you think you're putting in conservative point sources, because on some of these variables,
Starting point is 03:16:02 the uncertainty is so large, it spans like maybe even like a couple of hundred of orders of magnitude. By putting in point sources, it's always going to lead to overestimates. And so they, by putting stuff on a log scale, or actually they did it on like a log log scale on some of them.
Starting point is 03:16:22 And then like ran the simulation across the whole bucket of uncertainty across scale on some of them. And then like ran the simulation across the whole bucket of uncertainty across all those orders of magnitude. When you do that, then actually the number comes out much, much smaller. And that's the more statistically rigorous, mathematically correct way of doing the calculation. It's still a lot of hand waving.
Starting point is 03:16:39 As science goes, it's like definitely just waving, I don't know what an analogy is, but it's hand waving. And anyway, when they did this and then they did a Bayesian update on it as well to like factor in the fact that there is no evidence that we're picking up because, you know, no evidence is actually a form of evidence, right? And the long and short of it comes out that the, we're roughly around 70% to be the only intelligence civilization in our galaxy thus far, and around 50-50 in the entire observable universe, which sounds so crazily counterintuitive, but their math is legit.
Starting point is 03:17:18 Well, yeah, the math around this particular equation, which is the equations ridiculous at many levels, but the powerful thing about the equation is there's the different components that can be estimated and the air bars on which can be reduced with science. And hence, throughout since the equation came out, the air bars have been coming out on a different aspects. Yeah, that's very true. And so that it's almost kind of says what like this gives you a mission to reduce the air bars on these estimates. Now over a period of time, once you do, you can better and better understand, like in the process of redoing the air bars,
Starting point is 03:17:59 you'll get to understand actually, what is the right way to find out where the aliens are, how many of them there are, and all those kinds of things. So I don't think it's good to use that for an estimation. I think you do have to think for more like from first principles just looking at what life is on earth. And trying to understand the very physics-based biological chemistry biology-based question of what is life, maybe computation-based. What the fuck is this thing?
Starting point is 03:18:32 And that, how difficult is it to create this thing? It's one way to say how many plants like this are out there, all that kind of stuff. But it feels like from our very limited knowledge perspective, the right ways to think, how, how does, what is this thing? And how does it originate from, from very simple, non-life things, how does complex life-like things emerge from from from a rock to a bacteria protein and these like weird systems that encode information and pass information from self replicate and then also select each other and mutate in interesting ways such that they can adapt and evolve and build increasingly
Starting point is 03:19:20 more complex systems. Right. Well, it's a form of information processing. Yeah. Right. Well, it's a form of information processing. Yeah. Right. Well, it's information transfer, but then also an energy processing, which then results in, I guess information processing,
Starting point is 03:19:34 maybe I'm getting booked. Well, it's doing some modification. And yeah, the input is some energy. Right. It's able to extract, yeah, extract resources from its environment in order to achieve a goal. But the goal doesn't seem to be clear. Right. The goal is, well, the goal is to make more of itself. Yeah, but in a way that increases, I mean, I don't know if evolution is a fundamental law of the universe, but it seems to want to replicate itself
Starting point is 03:20:08 in a way that maximizes the chance of its survival. Individual agents within an ecosystem do yes. Evolution itself doesn't give a fuck. It's a very, it don't care. It's just like, oh, you optimize it. Well, at least it certainly, yeah, it doesn't care about the welfare of the individual agents within it. But it does seem to, I don't know, I think the mistake is that we're anthropomorphizing. It's to even try and give evolution a mindset.
Starting point is 03:20:39 Because there's a really great post by Elias Yudkowski on Les Wrong, which is an alien god. And he talks about the mistake we make when we try and put on my think through things from an evolutionary perspective, as though giving evolution some kind of agency and what it wants. Yeah, worth reading. But yeah. I would like to say that having interacted with a lot of really smart people that say that anthropomorphization is a mistake. I would like to say that saying that anthropomorphization is a mistake is a mistake. I think there's a lot of power and anthropomorphization.
Starting point is 03:21:20 If I can only say that word correctly one time. I think that's actually a really powerful way to reason to things. And I think people, especially people in robotics seem to run away from it as fast as possible. And I just, I think- Did you give an example of how it helps in robotics? Oh, and that our world is a world of humans.
Starting point is 03:21:43 And to see robots as fundamentally just tools runs away from the fact that we live in a world of a dynamic world of humans that like these all these game theory systems we've talked about that a robot that ever has to interact with humans and I don't mean like intimate friendship interaction I mean in a factory setting where it has to interact with humans. And I don't mean like intimate friendship interaction. I mean, in a factory setting where it has to deal with the uncertainty of humans, all that kind of stuff, you have to acknowledge that the robot's behavior has an effect on the human, just as much as the human has an effect on the robot.
Starting point is 03:22:19 And there's a dance there. And you have to realize that this entity, when a human sees a robot, this is obvious in a physical manifestation of a robot, they feel a certain way. They have a fear, they have uncertainty, they have their own personal life projections. We have pets and dogs and the thing looks like a dog, they have their own memories of what a dog is like, they have certain feelings. And that's going to be useful in a safety setting, safety critical setting, which is one of the most trivial settings for a robot in terms of how to avoid any kind of dangerous situations.
Starting point is 03:22:54 And a robot should really consider that in navigating its environment. And we humans are right to reason about how a robot should consider navigating its environment through anthropomorphization. I also think our brains are designed to think in human terms. Like, game theory, I think, is best applied in the space of human decisions. Right. You're dealing with things like AI. AI is we can somewhat...
Starting point is 03:23:35 I don't think it's... The reason I say anthropomorphization we need to be careful with is because there is a danger of overly applying, overly, wrongly assuming that this artificial intelligence is going to operate in any similar way to us, because it is operating on a fundamentally different substrate, like even dogs, or even mice, or whatever, in some ways, like anthropomorphizing them is less of a mistake, I think, than an AI, even though it's an AI we built and so on, because at least we know that they're running from the same substrate. And they've also evolved
Starting point is 03:24:09 from the same out of the same evolutionary process. They've followed this evolution of needing to compete for resources and needing to find a mate and that kind of stuff. Whereas an AI that has just popped into an existence somewhere on a cloud server, let's say, you know, or whatever, however it runs and whatever, I don't know whether they have an internal experience, I don't think they necessarily do. In fact, I don't think they do. But the point is, is that to try and apply any kind of modeling of thinking through problems and decisions in the same way that we do, has to be done extremely carefully because they are, in the same way that we do has to be done extremely carefully because they are, like, they're so alien, their method of whatever their form of thinking is. It's just so different because they've never had to evolve, you know, in the same way.
Starting point is 03:24:55 Yeah, I was beautifully put. I was just playing devil's advocate. I do think in certain context, anthropomorphization is not going to hurt you. Yes. Engineers run away from it too fast. I could see that. But for the most point, you're right. Do you have advice for young people today, like the 17 year old that you were of how to live life? You can be proud of how to have a career. You can be proud of in this world full of mlyx. Think about the win-win, look for win-win situations. And be careful not to you know, overly use your smarts to convince yourself that something is win-win-win-it's not. So that's difficult and I don't know how to advise you know people on that because it's
Starting point is 03:25:42 something I'm still figuring out myself. But have that as a sort of default MO. Don't see things, everything is a zero-sum game, try to find the positive sumness and find ways, if there isn't seem to be one, consider playing a different game. So that I would suggest that. Do not become a professional poker player. Because people always ask, they're like, oh, she's a pro, I want to do that too. just that. Do not become a professional poker player. Because people always ask, oh, she's a pro.
Starting point is 03:26:06 I want to do that too. Fine, you could have done it when I started out. It was a very different situation than poker is a great game to learn in order to understand the ways to think. And I recommend people learn it, but don't try and make a living from it these days. It's very, very difficult
Starting point is 03:26:25 to the point of being impossible. And then, it's really, really be aware of how much time you spend on your phone and on social media and really try and keep it to a minimum, be aware that basically every moment that you spend on it is bad for you. So that doesn't mean to say you can never do it, but just have that running in the background.
Starting point is 03:26:47 This I'm doing a bad thing for myself right now. I think that's the general rule of thumb. Of course, about becoming a professional poker player, if there is a thing in your life that's like that and nobody can convince you otherwise, just fucking do it, don't listen to anyone's advice. Yeah, find a thing that you can't be talked out of too. That's a thing.
Starting point is 03:27:12 I like that. Yeah. You were a league guitarist and a metal band. Did I write that down for something? What did you, what would you do it for? The the the performing was the the pure the the music of it. Was it just being a rock star? Why'd you do it? So we only have a played two gigs. We didn't last, you know, it wasn't a very, we weren't famous or anything like that. We we didn't last you know, it wasn't a very we weren't famous or anything like that
Starting point is 03:27:47 but I I was very into metal like it was my entire identity sort of from the age of 16 to 23 Best metal band of all time. Oh, it's don't ask me that so hard dancer So I know I had a long argument with Ah! So I know I had a long argument with um I'm a guitarist more like a classic rock guitarist so you know I've had friends who are very big penterra fans and so there was often arguments about what's the better metal band Metallica versus Penterra this is a more kind of 90s maybe discussion but I was always on the side of Metallica. Both musically and in terms of performance and the depth of lyrics and so on. But they were basically everybody was against me because if you're a true metal fan, I guess
Starting point is 03:28:37 the idea goes as you can't possibly be a Metallica fan. I think it's like it's like it's sold out. Metallica are metal. Like they, they were the, I mean, again, you can't say who was the godfather of metal, blah, blah, blah, but like they were so groundbreaking and so brilliant. I mean, you've named literally two of my favorite bands. Like that's that, when you asked that question, I know who are my favorites. Like those were two that came up.
Starting point is 03:29:03 A third one is children of boredom, who I just think, they just tickled the boxes for me. Yeah, I don't know. Nowadays, I kind of feel like a repulsion to the, I was that myself, like I'd be like, who do you prefer? I'm like, no, you have to rank them. But it's like this false zero sum this this like why they're so additive.
Starting point is 03:29:27 Like there's no conflict there. Although I, when people ask that kind of question by anything, movies, I feel like it's hard work and it's unfair, but it's, it's, you should pick one. Yeah. Like, and I, that's actually, you know, the same kind of, it's like a fear of a commitment. People ask me, what's your favorite band? It's like, but I, you know, it's good to pick. Exactly. And thank you for, yeah, thank you for the tough question. Yeah. Well, maybe not. No, no, no, it's a lot of people are listening. Um, can I just like, what? Why isn't it? No, it does.
Starting point is 03:29:59 It is. Are you still into metal? Funny enough, I was listening to a bunch before I came over here. Oh, like, do you use it for, like motivation? Yeah. I used to live in metal. Funny enough, I was listening to a bunch before I came over here. Do you use it for motivation or get you in a certain way? Yeah, I was really listening to 80s hair metal before I came. Does that count as metal? I think so. It's like proto-metal.
Starting point is 03:30:17 And it's happy. It's optimistic, happy proto-metal. Yeah, I mean, these things, you know, all these genres bleed into each other. But, yeah, sorry, to answer your question about guitar playing, my relationship with it was kind of weird and that I was deeply uncreative. My objective would be to hear some really hard technical solo and then learn it, memorize it, and then play it perfectly. But I was incapable of trying to write my own music. Like the idea was just absolutely terrifying. But I was just also just thinking I was like, yeah, be kind of cool to actually try
Starting point is 03:30:52 starting a band again and getting back into it and write. But it's scary. It's scary. I mean, I put out some guitar playing, other people's covers, like I play comfortably numb on the internet. It's scary too. It's scary putting stuff out there. And I had this similar kind of fascination with technical playing, both on piano and guitar. You know, one of the first... One of the reasons I started learning guitars from Ozzy Osborne, Mr. Crowley solo, and one of the first solos I learned is that there's a beauty to it, there's a lot of beauty to it.
Starting point is 03:31:34 It's tapping, right? Yeah. Is there some tapping, but it's just really fast. Beautiful, like arpeggios. Yeah, arpeggios, yeah. And there's a melody that you can hear through it, but there's also a build on muscle. It's a beautiful solo, but it's also technically just visually the way it looks
Starting point is 03:31:50 when a person's watch is, you feel like a rock star player. Yeah. But it ultimately has to do with technical. You're not developing the part of your brain that I think requires you to generate beautiful music. It is ultimately technical in nature. So that took me a long time to let go of that and just be able to write music myself. And that's a different journey, I think.
Starting point is 03:32:18 I think that journey is a little bit more inspired in the blues world, for example, or improvisation is more valued, obviously in jazz and so on. But I think ultimately it's a more rewarding journey because you get to your relationship with the guitar then becomes a kind of escape from the world where you can create, create. I mean, creating stuff is... And it's something you work with because my relationship with my guitar was like it was something to tame and defeat. Yeah. Which was kind of what my whole personality was about then like I was just very like you know as I said like very competitive very just like must you bend this thing to my
Starting point is 03:32:57 will. Whereas writing music is you work it's like a dance you work with it. But I think because of the competitive aspect, for me at least, that's still there, which creates anxiety about playing publicly or all that kind of stuff. I think there's just like a harsh self-criticism within the whole thing. It's really, it's really tough. It's really tough.
Starting point is 03:33:18 I wanna hear some of your stuff. I mean, there's certain things that feel really personal. And on top of that, as we talked about poker offline, there's certain things that you get to a certain height in your life, and that doesn't have to be very high, but you get to a certain height. And then you put it aside for a bit, and it's hard to return to it because you remember being good. It's hard to, like, you being at a very high level in poker, it might be hard for you
Starting point is 03:33:46 to return to poker every once in a while and enjoy it knowing that you're just not as sharp as it used to be because you're not doing it every single day. That's something I was wondering with, I mean, even just like in chess with Kasparov, some of these grades, just returning to it. It's just, it's almost painful. Yes, I can. Yeah. And I feel that way with guitar too,
Starting point is 03:34:05 you know, because I used to play like every day a lot. So returning to it is painful because like it's like accepting the fact that this whole ride is finite and that you have you have a prime. There's a time when you're really good and now it's over and out. You're on a different chapter of life. I was like, oh, I'm like, what is that? But you can still discover joy within that process. It's been tough, especially with some level of like, as people get to know you, there's a, and people film stuff, you don't have the privacy of just sharing something
Starting point is 03:34:43 with a few people around you. Yeah. That's a beautiful privacy. That's your point. With the internet, it's just disappearing. Yeah, that's a really good point. Yeah. But all those pressures aside, if you really,
Starting point is 03:34:56 you can step up and still enjoy the fuck out of a good musical performance. What do you think is the meaning of this whole thing? What's the meaning of life? It's in your name as we talked about, you have to live up, do you feel the requirement, have to live up to your name? Because live? No, because I don't see it. I mean, my, Yeah, no, because I don't see it. I mean my Well, again, it's kind of like No, I don't know because my full name is Olivia. Yeah, so I can retreat in that I'll be like oh Olivia what does that mean me? Live up to live
Starting point is 03:35:36 No, I can't say I do because I've never thought of it that way Okay, anything you name backwards is evil. Yeah, I talked about Okay, anything, you name backwards is evil. Let's talk to the boss. There's like, layers. I mean, I feel the urge to live up to that, to be the inverse of evil, or even better, because I don't think, you know, is the inverse of evil good
Starting point is 03:35:57 or is good something completely separate to that? I think, my intuition says it's the latter, but I don't know, anyway, again, getting in the way. What is the meaning of all this? Of life. Why are we here? I think to explore, have fun and understand and make more of here and to keep the game going.
Starting point is 03:36:21 Of here, more of here. More of this, more of this. More of experience. Just to have more of experience and our ideally positive experience. And more complex, you know, to, I guess, try and put it into a sort of vaguely scientific term. Make it so that the program required, the length of code required to describe the universe is as long as possible, and that highly complex and therefore interesting. Because again, I know we bang the metaphor to death, but like, titled with X, you know, titled with paper clips,
Starting point is 03:37:02 doesn't require that much of a code to describe. Obviously, maybe something emerges from it, but that steady state, assuming a steady state, it's not very interesting, whereas it seems like our universe is over time becoming more and more complex and interesting. There's so much richness and beauty and diversity on this earth, and I want that to continue and get more. I want more diversity, and in the very best sense of that word is to me the goal of all this.
Starting point is 03:37:32 Yeah, and somehow have fun in the process. Yes. Because we do create a lot of fun things along. Instead of in this creative force and all the beautiful things we create create somehow there's like funness to it and perhaps that has to do with the finiteness of life, the finiteness of all these experiences which is what makes them kind of unique. Like the fact that they end, there's this whatever it is, falling in love or Creating a piece of art or creating a bridge or Creating a rocket or creating a I Don't know just the businesses that do that that that build something or
Starting point is 03:38:19 Solve something the fact that it is born and it dies somehow embeds it with fun, with joy, for the people involved. I know what that is, the finiteness of it. It can do. Some people struggle with the, you know, I mean, a big thing, I think that one has to learn is being okay with things coming to an end. And in terms of projects and so on, people cling onto things beyond what they're meant to be, you know, beyond what is reasonable. And I'm going to have to come to terms with this part he has come into an end. I really enjoy talking to you. I think it's obvious, as we've talked about many times,
Starting point is 03:39:05 you should be doing a podcast. You should, you're already doing a lot of stuff publicly to the world, which is awesome. And you're a great educator, you're a great mind, you're great into like, but it's also this called medium of just talking is also fun. It's a fun one. It really is good.
Starting point is 03:39:20 And it's just, it's just, it's nothing but like, it's just so much fun and you can just get into so many, yeah there's the space to just explore and see what comes and emerges and yeah. Yeah, to understand yourself better and if you're talking to others to understand them better and together with them, I mean you should do your own podcast but you should also do a podcast with C's to talk about about the two of you have such Different minds that like melt together and just hilarious ways fascinating ways Just the tension of ideas there is really powerful But in general, I think you you you got a beautiful voice. So thank you so much for talking today
Starting point is 03:40:00 Thank you for being a friend. Thank you for honoring me with this conversation and with your valuable time. Thank you. Thanks for listening to this conversation with Lou Berry to support this podcast. Please check out our sponsors in the description And now let me leave you with some words from Richard Feynman. I Think it's much more interesting to live not knowing than to have answers which might be wrong I have approximate answers and possible beliefs and different degrees of uncertainty about different things, but I'm not absolutely sure of anything, and there are many things I don't know anything about such as whether it means anything to ask why we're here.
Starting point is 03:40:39 I don't have to know the answer. I don't feel frightened not knowing things. By being lost in a mysterious universe without any purpose, which is the way it really is as far as I can tell. Thank you.