Armchair Expert with Dax Shepard - Mustafa Suleyman (on artificial intelligence)

Episode Date: September 14, 2023

Mustafa Suleyman (The Coming Wave) is an artificial intelligence researcher, co-founder of DeepMind, and author. Mustafa joins the Armchair Expert to discuss how he became interested in the concept of... universal human rights, what deep learning is, and how machines can actually become creative. Mustafa and Dax talk about how close artificial intelligence is getting to matching the human brain, how people might feel about computers creating all content, and how AI might revolutionize the energy industry. Mustafa explains why the fear of government regulation exists, why people need to be involved in decisions about technological advancements, and how certain biases are corrected in AI programs. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Welcome, welcome, welcome to Armchair Expert, Experts on Expert. I'm Dan Riggle, and I'm joined by Mrs. Mouse. We had an armchair, this is an Easter egg. We had an Armchair Anonymous yesterday. We recorded one, and someone said, it's Dan and Lily. Daniel and Lily, maybe? Yeah, it was really cute.
Starting point is 00:00:20 Today we have Mustafa Suleyman, and this is, we've been wanting really, really bad to get a premium expert on AI. Yeah, because it's so relevant. And we talked so much about it, yet we haven't had too many experts on to really tell us where we're at in this whole crazy experience. Mustafa is an AI researcher and an entrepreneur. He is the co-founder of DeepMind and Inflection AI. He's a powerhouse in this field. He's a big deal.
Starting point is 00:00:48 Yes, they have Pi as their AI version. He has a new book out called The Coming Wave, Technology, Power, and the 21st Century's Greatest Dilemma. So he's got lots of recommendations on how we keep this from devouring us. And they're all very, very interesting. It was a really interesting conversation.
Starting point is 00:01:09 And I think necessary at this point in time. So please enjoy Mustafa Suleyman. Trip Planner by Expedia. You were made to have strong opinions about sand. We were made to help you and your friends find a place on a beach with a pool and a marina and a waterfall and a soaking tub. Expedia. Made to travel. Hey, I just got us a new Coca-Cola spice. Nice.
Starting point is 00:01:33 What's it taste like? It's like barefoot water skiing while dolphins click with glee. Whoa, let me try. Nah, it's like gliding on a gondola through waving waters as a mermaid sings. Nah, it's like Coca-Cola with a refreshing burst of raspberry and spiced flavors. Yeah. Try new Coca-Cola Spiced today. He's an armchair expert.
Starting point is 00:02:02 He's an armchair expert. He's an entrepreneur He's an entrepreneur What are you saying? I can't imagine you don't live here. I live in Palo Alto. Seems on the nose. Yeah. Literally, I was in San Francisco at the weekend
Starting point is 00:02:20 visiting this very interesting antique shop full of strange curiosities, like really interesting collection. Within like one second, I started chatting to the person who worked behind the counter. She was like, so where'd you live then? I was like, Palo Alto. She's like, I knew it!
Starting point is 00:02:34 It's so obvious! Oh my God, that's funny. I want to go to a fun antique shop. I love shopping. I love that he said full of curiosities. Yeah. I think the place is called a cottage of curiosity. Oh.
Starting point is 00:02:50 Isn't that a cool name? That's enticing. They should sell Altoids there because, of course, their saying is curiously strong. Oh. It is? Yeah. I don't think I knew that. Isn't it the greatest slogan for a product? Yeah, I didn't know that.
Starting point is 00:03:01 Curiously strong. I also love brands that have never changed. And I feel like Altoids, it's the same tin as it was in the 50s. Not that I was around, but it feels like it was. It has a retro vibe. Unless it is just doing the retro thing and I'm being tricked. It might be. And that would be really embarrassing.
Starting point is 00:03:18 Certainly possible, but you're right. It feels fun to consume an institution. Yeah, be part of the history with every mouthful. Look at this. It's an advert for Altoids. Yeah. I'll go further. I mean, chips are always a stretch.
Starting point is 00:03:31 But even there's like food products where it's like, no, no, this is an institution. Like vanilla wafers. Yes. Vanilla wafers, yeah. Yeah, my grandparents consumed them. They fed them to us. I can't say that I'm passing it on.
Starting point is 00:03:43 Do you have any children? I do not. One day. You guys do? I have two. Monica has frozen eggs. I have a couple eggs. That's where we're at. No kids for me yet. You have a... Jesus, I'm a mess. Throw it all on the floor.
Starting point is 00:03:58 Literally, just toss it all out the window. Yeah. And I've just realized I have a little bit of debris on my phone, and I hate that. Oh, like one of my mints? Well, no, you don't need to waste most office time. I think the quicker you get it scratched, the more accustomed you get to it just being like junky, and then it's like, fine. That's why I don't do cases. I'm like, screw it. Wow. Yeah, it's fine. I mean, I've got quite a lot of scratches, so don't get overexcited,
Starting point is 00:04:26 but I just learned to live with them. How frequently do you shatter a screen? Once every two years. I tend to upgrade every other generation. Yeah, it's not too bad, actually. I'm good at catching. I've definitely got into that. They've gotten sturdier over
Starting point is 00:04:42 time. For sure. They have. I have kids grabbing them and stuff. I need a case. I hate it. I hate when you put it in your front pocket and it's got drag now. It has resistance. Right. I like that that's sliding. I know.
Starting point is 00:04:52 When you first get it, it looks so beautiful. Sleek and slippery. Yeah. Could find its way into that shop of curiosities. One day. One day. Most def. Okay, so you're from England, London, England.
Starting point is 00:05:05 Dad was a Syrian taxi cab driver? He was a minicab driver, yeah. There are two kinds of taxis in the UK. There's the black cab, which is the profession. You have to do the knowledge. It takes like three years. They memorize every street. You have to remember all the routes.
Starting point is 00:05:20 It's insane. They're actually very, very, very smart cabs. Really appropriate for this conversation because here's a task that has already been outsourced to the phone, yet they are still doing it that way. And I don't know what we say about that. That's just very rare. It's a ding, ding, ding.
Starting point is 00:05:36 It's one of those things where the rules matter and it affects the pace of change. Right. That's, I think, a sign of hope because I think we should be collectively making decisions about how quickly we want new technologies to be introduced so that we can decide on the rate of change. We accidentally, or maybe you engineered this, we stumbled upon almost the greatest example of some legislation or policy that has prevented technology taking
Starting point is 00:06:01 over a space. And it seems to be working. Yeah, we were in London not too long ago in one of those cabs, and it worked out just fine. They're brilliant. They do have the advantage, though, of being able to use the bus lanes. So they can get there faster than a regular car. So I think that is a structural advantage, which justifies the higher price,
Starting point is 00:06:20 because they're also quite a bit more expensive than Uber. They are. But they have adopted some stuff. You can order them with your phone now. Yeah, so that's an improvement. Now, I guess what I interpreted by reading that your father was Syrian, I assumed in Syria he drove. No, so he moved to London in the early 80s and settled there.
Starting point is 00:06:38 And so I was born and raised in London. 84? That's right. Yeah, exactly. Painfully younger than me. It's okay. Still older than me, so we're doing pretty good. He's made more money and he's more relevant.
Starting point is 00:06:49 Okay, well. It's a little hurtful. But anyways. So, but in London as an occupation, he did drive. He drove initially one of those unlicensed minicabs. They don't have a medallion. You negotiate a price with the person. There's no meter.
Starting point is 00:07:01 Exactly. You negotiate on the spot. Is it 15 pounds, 20 pounds, 25? That's too much, a bit less. That has gone out of fashion now, whatever. It's been licensed. So there's much less of that. You can't sort of just hawk and pick up on the street and so on.
Starting point is 00:07:15 That was his profession for basically all of his life in the UK. We all love story. And for me, that's a great story that his son would end up being a pioneer in AI. I think that feels like about as good of an immigrant story as you can get for your child. Right. It's funny because my mom, who was a nurse in the National Health Service, she was kind of adamant that I would drop out of school at 16 and get a trade, become a plumber, become a carpenter.
Starting point is 00:07:41 Everyone's always going to need an electrician. That was just like long-term reliable. Obviously, I didn't do that. But it was definitely that kind of, you know, making money quick is important. Yeah, exactly. She's English though? She's English. How did he woo her? That must have been hard in the 80s. That is actually a funny story in a way. So my mom was riding around the world on a double-decker red bus in 1982, I think it was. And she was at a rest stop in Afghanistan repairing their tire with this crew of basically travelers. And in the washrooms at the rest stop, my dad apparently was having a shower next door. Oh, wow. He's taking a shower at the gas station.
Starting point is 00:08:25 She's in a double-decker. This is all impossible. It's like a Dr. Seuss story. It's completely crazy. So he comes out in his towel. You know, they have a little exchange. He's attractive, I'm guessing. I have no idea.
Starting point is 00:08:38 It's sounding like you. Well, you're handsome as hell, so I'm imagining he was a looker. So then he's standing there in his towel, and of course they don't have a common language. He doesn't speak any English. She doesn't speak any Arabic. They have a bit of like schoolboy, schoolgirl French.
Starting point is 00:08:54 And so they have this connection, nice little chat. They go on their way. My dad is picking up marble to head to Pakistan to trade. He was basically trying to fund his engineering degree. Is he going over the Khyber Pass? Is it like that historic? I don't know if that was the route, but it certainly feels like that in your head.
Starting point is 00:09:14 Yeah, for sure. Then really strangely, 10 days later, they end up encountering each other again, completely randomly, this time in Iran. What? No, no, this is destiny. This is so romantic. It's totally wild. Very horny. And obviously that was game over from then on. It was just like a slam dunk, at least for a few years. He emigrated to England to be with her? Correct. Oh, wow. Did he get his engineering degree? You know what? He did not finish it because they became pregnant with me in Pakistan. They didn't want to have me in
Starting point is 00:09:50 Pakistan, although they did think about it in Islamabad. And so then they came back to the UK. And then obviously he was sort of very upset because in the UK, they didn't recognize half of his Pakistani engineering degree. And he had actually left Syria to avoid conscription. Because in those days, everyone at the age of 18 has to go into the army for three years. And in the 80s, Syria was getting into it. It's not like it would be unrealistic to assume that could be life-threatening.
Starting point is 00:10:17 That was a rough experience. I have uncles who went through it, and it's not the kind of touchy-feely, friendly, turn you into a servant of the national effort. It's very brutal. And so he basically wanted to avoid that. Okay. We'd call that a draft dodger here.
Starting point is 00:10:32 Exactly. At one point in time, I think nowadays we'd say it's smart. Well, I think we all know where I stand on that. I'm a big bleeding liberal, so I'm, you know. We're for it. I don't think you should go kill people you don't want to kill. You have a very interesting and circuitous route to this position you hold as an authority in AI because you drop out of college, I guess. You're 19.
Starting point is 00:10:54 Presumably, you're in college. You start a helpline for Muslims, which becomes an enormous resource and maybe the biggest for mental health for Muslims in the UK. How the fuck does that happen? And how do you drop out of college to pursue that? I got to Oxford where I was actually studying philosophy. I think the biggest transformation of my life happened where I basically discovered human rights principles. In a philosophy class? Yes, in the kind of spirit of universal justice and fairness, rather than thinking that me and my people were the righteous ones, the chosen ones, the special elite, and so on. And throughout my late teens, I had kind of struggled with lots of parts of the religion.
Starting point is 00:11:34 Both my parents were very strict, very strangely. My mom was actually already a Muslim. She converted before meeting my dad. Hence the crazy bus trip, probably. Yeah, going to find herself. I'd grown up with a very strict sense of the religion and it was starting to get uncomfortable. When I got to Oxford, it kind of provided me
Starting point is 00:11:52 with a framework for thinking about universal rights rather than just one group. That you didn't need to appeal to faith. You could use reason to establish the fairness of things. I think that we need that more than ever now, rather than having to rely on this very shaky, arbitrary foundation of what I believe. For me, that's a bit of a trigger word.
Starting point is 00:12:13 When I hear believe these days, I'm like, eek, I'm going to have to be cool here because I'd much rather have a discussion and have those beliefs or those ideas evolve and be subject to critical reason and stuff like that. So I helped to start the Muslim Youth Helpline. It's actually a group of people who were already in motion getting that started. And it was a non-judgmental, non-directional, so secular support service.
Starting point is 00:12:36 Right, which probably had to be very rare in that community. I mean, it was the first of its kind because all the other ones... They would drive you to... Exactly. You shouldn't have sex before marriage and homosexuality is a sin. You observed the five pillars today. Did your parents freak out? Did they know you were doing this? I don't think my dad would have approved. I had left home by that point when I was 15.
Starting point is 00:12:57 So they weren't so much in the picture. It wasn't such a big deal. That was actually very freeing in itself because I could sort of go and make my own way, figure out my own ideas. Because you had gone to a boarding school at some point as well. Actually, I went to a state school, a free education, but it was a very good one. You had to do an exam to get in. And I think because it was selective, you had to pass two exams and an interview, you know, it just meant that everyone was starting from a little bit further ahead. It changed my life. The school really gave me a huge advantage in life and just to be surrounded by a lot of other really driven people and really smart people. Pressure testing your thoughts and ideas around adversaries that are worthy, they will make you raise your argument.
Starting point is 00:13:36 They'll make you better. Yeah. And that's actually what I found at Oxford as well, is that there was no judgment for being a bit obsessive or a bit nerdy or really over passionate about stuff. Yeah. Whereas earlier in life, that was a bit more tricky. The helpline experience leads seamlessly into broader public health and public service work. You end up working for the mayor of London. You end up advising on conflict resolution, all kinds of things. You have clients like the United Nations. That seems like an easy to follow trajectory. And then you form DeepMind with two other people. You co-found DeepMind, which is an artificial intelligence lab. How on earth do we get from the public service to there? I guess my assumption was that anyone like you that I'd be talking to had
Starting point is 00:14:21 to be a computer science major, probably has a PhD in something. Like, I would not have thought dropout at 19, public health. I definitely don't have a PhD. No. I got lucky in that I started the kind of work thing very early. So because I left at 19, I managed to cram in a lot of experience working in charities and local government, working as a conflict resolution facilitator all around the world. That gave me huge exposure to lots of different types of work and types of theories of change. I was obsessed with what is your theory of change? How do you have impact in the world? How do you scale that impact?
Starting point is 00:15:00 How do we actually progress civilization? Just in that transition from religion to secular ethics, that was my new kind of raison d'etre in life. And I got to this point where in 2009, I was at the climate negotiations in Copenhagen, and I was helping to facilitate one of the main negotiating tracks, the one around reducing emissions from deforestation. And I just suddenly had this realization that actually these institutions are super stuck. They're not evolving fast enough.
Starting point is 00:15:32 We can't establish consensus of any type on even the basics of the science of what's happening, let alone what the right interventions are and what we should do. And at the same time, for the previous couple of years, 2007 to 2009, I had my eye on what Facebook was doing. It was blowing me away.
Starting point is 00:15:50 I was like, this tiny little app has gone from zero to 100 million monthly active users engaging with it. They're sharing their personal lives. They're connecting. They're forming new relationships. They're getting married. It's incredible how much this is changing so quick. This is an incredible instrument if nothing else. It was very obvious to me that it was so much more than a platform. And the word platform was actually a
Starting point is 00:16:14 misrepresentation of what was actually going on there. Because it's much more of a mode for framing how activity takes place. The incentives, the structure, the way the website is laid out, the colors, the now very famous thumbs-up like button, which drives engagement. The original sin. Yes. Yeah. In hindsight, that seems super obvious. And to me at the time, it was like, wow, this is really incredible. I need to do everything I can to be a participant in technology. Technology is going to be the thing that helps make us smarter, more efficient,
Starting point is 00:16:53 more productive, essentially do more with less. It must have been appealing as well. If you're dealing with governments and institutions in these negotiations, to see something that comes with no history has to be very encouraging. Like you're not bridled with 300 years of how we do things. That's a great point. It's actually much easier to innovate in greenfields than it is to change the status quo. I kind of think that's one of the big problems that we have in the world today where there aren't very many compelling positive narratives of the future coming from the old world order. On all sides of the spectrum, people talk about the mainstream media,
Starting point is 00:17:30 they talk about mainstream finance. Their reaction is that the existing establishment is not innovating fast enough. And so we look outside of that for new narratives of the future. And to me, Silicon Valley and technology and the potential to do more with less and invent and create things in green fields, that really is the kind of vision of the future that whether we like it or not
Starting point is 00:17:52 is becoming the kind of default way that we understand how things are going to play out. Yeah, because the old institutions work so slowly, they can't even correct fast enough, right? I mean, that's part of the inherent problem is that they're not nimble the way all these technologies are. I mean, especially the ones we're going to get into where it's like between 8 a.m. and 12 p.m., this machine, if given enough data, can go from knowing nothing to knowing everything about something.
Starting point is 00:18:19 What human organization can keep up with that? Yeah, I mean, what human organization or collection of all humans could possibly consume that much raw content? I mean, these models are just like alien life forms in the amount of knowledge that they can consume and reproduce. Yeah, kind of beyond comprehension. You should get involved with the SAG and WGA strike negotiations because first you know how to negotiate
Starting point is 00:18:45 or you've been involved in that and you know everything about AI. Believe me, I've got an earmark question as we get to that. Okay, let's get you in there and let's also get it solved fast. It's coming. We're ready.
Starting point is 00:18:56 Okay, so specifically, how does this interest in, I want to say MySpace because I never did use Facebook, but Facebook, you weren't as enamored with MySpace apparently. That's the only place I did a lot of business. But how does that then take you to artificial intelligence specifically?
Starting point is 00:19:12 Well, so it set me on a quest to find anybody and everybody who was in my network who was involved in technology or software of some kind. During that year, 2009 and early 2010, I met with anyone who would give me five minutes to just tell me what was happening and why and how it worked. And I had an interest in software and technology. I mean, I was on the internet very young and was an obsessive on forums. And actually, in 2003, my first actual business was an electronic point of sale system, where we had these little PDAs and we went
Starting point is 00:19:45 in and installed network equipment for restaurants and stuff and tried to get them digitized so they could take orders really quickly and stuff. Then you remembered in England, the restaurants have no desire to do anything quickly, especially bring you your check or allow you to give them the credit cards. It's my only complaint about England in general. It's just, you want to walk in and go like, let's pay now. Yes. Because you know it's going to be 90 minutes.
Starting point is 00:20:10 Ask me for the check now. I know you don't know what I ordered, but let's start that process of giving me my check. A hundred percent. It's so true. That's what I quickly found out. This is not a problem that needs to be solved at that time in the UK.
Starting point is 00:20:22 Plus the technology wasn't good enough. So I basically set about on this quest and my best friend at the time, the UK. Plus the technology wasn't good enough. So I basically set about on this quest and my best friend at the time, George, his older brother, Demis Hassabis, who was the co-founder and CEO of DeepMind, was just finishing his postdoctoral research at UCL at the Gatsby Computational Neuroscience Unit. So that's what his degree was in, was neuroscience? His degree was in neuroscience. He was interested in memory and how the brain processed information and used memory for imagination. He wrote some really important papers there. And whilst he was at Gatsby, he met our third co-founder, who is sort of an AI guy
Starting point is 00:20:59 through and through and a big believer in AGI from day one. He did his PhD on definitions of intelligence. He aggregated like 100 different definitions, all the different types of intelligence you could possibly imagine. And he universalized them into one working definition, which was the ability to perform well across a wide range of tasks. So this emphasis on generality. Interesting. Intelligence is about generality. And that's when we pushed this idea of AGI. And in fact, our business plan said, and this is summer of 2010, building artificial general intelligence ethically and safely.
Starting point is 00:21:37 So did you guys coin this AGI term? It was in use before. We didn't coin it, but I think we were the first to use it in a company as a mission. And then it got popularized shortly after because this technique called deep learning started to work. It wasn't really until 2012, 2013 that deep learning was showing very promising signs. Did you have imposter syndrome among these two with the PhDs?
Starting point is 00:22:00 Yeah, they're scary. And did you feel like you had to prove yourself worthy of being a part of this triumvirate? For sure. What I came to realize after the fact is that I had very different skills. And that complementarity was an amazing three-way dynamic for a decade. And, you know, one of the things I am luckily able to do is to just go from the micro to the macro. Like I'm quite good at thinking big picture and quite practical and very operational as well. And I'm very urgent. I'm the one who's saying, right, what are we going to do tomorrow?
Starting point is 00:22:28 And that turned out to be a good constellation of skills between the three of us. Would you guess, or you probably know, that's a rare combination of folks, I'd imagine. I imagine most people that end up in Silicon Valley, there's not a lot of outsider, let's just say artsy social scientists above hard science, you know, whatever we would call that. I can't imagine they find themselves in this situation very often. I think that's exactly right. And interestingly, between the three of us, even Demis and Shane are not strong programmers. Demis is a neuroscientist. He did computer science as his degree, but he never became a software engineer
Starting point is 00:23:02 formally. And Shane was very much on the mathematics and the theoretical end of the spectrum. So typically, in Silicon Valley, a startup pair of co-founders would both be coding. Yeah. And one of them would be an exceptional coder. This is the Wozniak, Steve Jobs. Yeah, although Steve wasn't a coder either, right? Right. He's the non-coder. Yeah, exactly. Steve is the big picture guy, you know, who focuses on the user and thinks always about the product. Right. Okay, so this company, you guys end up being pretty successful in your exploration of all this, and Google ends up buying this company. Yeah. Pretty, pretty successful is really low key.
Starting point is 00:23:45 Well, in that I mean, it's not like they have an operational ChatGPT or Pi. You don't have a product yet per se, right? You just have a lot of progress. Right. It was a remarkable acquisition. I mean, we're talking 2014. Most people haven't even heard the word AI,
Starting point is 00:24:03 let alone AGI, right? This is out there completely speculative. And they haven't been snatching up European companies either. Exactly. It was actually the first acquisition that Google made in Europe. Right. So all the reasons you should not assume that you're going to get bought by Google. Yeah, it was very unlikely.
Starting point is 00:24:22 But the thing that caught Google's attention is that we made this demo of an AI learning to play the Atari games from scratch. So you remember like Space Invaders and Pong? Asteroid. Right. Breakout, if you ever played that game where you have a paddle at the bottom. Exactly. We basically had the AI just from the pixels, right? So we didn't give it any rules or any structure, nothing else. Just interact with this environment, get a score. So it tells you if you randomly, luckily managed to hit it or if you lose. And then over time through self-play, so it would play itself millions of times, it would learn to associate a set of pixels with a set of actions with a score. And then over time, it would get
Starting point is 00:25:07 really good. It would start to think, oh yeah, these are the actions that I took in the run-up to getting that score last time. So I'm going to reproduce that. And really quickly, does it have to play the game in real time or can it accelerate the game itself? That's a great question. And this is a key insight about computers, right? They're paralyzable, so they can go at lightning speed. You can scale them up. You have millions of instances playing against itself rather than having to go in human time. Okay, good.
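A rough, hypothetical sketch of the loop being described here, in Python. This is a toy stand-in rather than DeepMind's Atari system: no real pixels or neural network, just the core idea of trying actions, receiving only a score, and reinforcing whatever actions preceded higher scores, repeated over many cheap self-play episodes.

```python
import random
from collections import defaultdict

def toy_game(actions):
    """Toy stand-in for an Atari game: score a whole run of actions at once."""
    # Hidden rule: action 1 scores at even steps, action 0 at odd steps.
    return sum(1 for t, a in enumerate(actions) if a == (t + 1) % 2)

N_STEPS, EPSILON, LEARNING_RATE = 10, 0.1, 0.05
value = defaultdict(float)  # (step, action) -> running estimate of the score it leads to

for episode in range(5000):  # "self-play": millions of cheap runs in the real systems
    actions = []
    for t in range(N_STEPS):
        if random.random() < EPSILON:            # occasionally explore
            actions.append(random.choice([0, 1]))
        else:                                    # otherwise repeat what scored well before
            actions.append(max([0, 1], key=lambda a: value[(t, a)]))
    score = toy_game(actions)                    # the only feedback is the final score
    for t, a in enumerate(actions):              # credit every action in the run
        value[(t, a)] += LEARNING_RATE * (score - value[(t, a)])

best = [max([0, 1], key=lambda a: value[(t, a)]) for t in range(N_STEPS)]
print("learned action sequence:", best, "score:", toy_game(best))
```

Because nothing ties the episodes to wall-clock time, thousands of runs like this finish in seconds, which is the parallel, faster-than-real-time point being made above.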
Starting point is 00:25:33 So it's playing a whole match in seconds or minutes as opposed to it would take us 15 minutes to get to that board or whatever. Right. That's a key intuition because machines can have more experiences than any single human. And we see this pattern repeat over and over again. In 2015, we trained an AI to detect breast cancer in mammograms. And by 2016, it was better than the expert radiologists.
Starting point is 00:25:59 We did the same thing for ophthalmology. So for 52 blinding diseases, it was better than the best human ophthalmologists in the world. Right. And that's mostly because it has seen orders of magnitude more cases than the humans. You know, the best humans in their career might see 30,000 cases. Well, seen and I think even more importantly, remember at all times.
Starting point is 00:26:21 I mean, that's the real key. It's like if you actually could remember the sequence of events of everything in your life, you too would be able to predict a much higher degree of accuracy. But it's like, we don't have access to that. Totally. So the fact that it can see things much, much faster,
Starting point is 00:26:36 the fact that it has perfect memory in many cases, or very, very good memory, and the fact that it doesn't get tired. Yeah, because if it sees a slide of an iris, let's just say, I don't really know how it works, and it has some little pigmentation here, it can rapidly go through all 10 million photos in the data set with known prognoses.
Starting point is 00:26:54 The human doctor can't hold it up next to 10 million examples, but it can. That's the magic of it, right? Right. And that's actually what has happened for the last decade as we see these crazy exponential trends of now seeing trillions and trillions of words of open data. Yeah.
Starting point is 00:27:09 But at the time, it couldn't work in the space of words. It was really just looking at images and pixels. When we saw it play these Atari games, it was really incredible. This thing has learned clever strategies that your average human player wouldn't have discovered. There was one particularly in Breakout where it would tunnel up the back and knock down all the bricks and then bounce off the back wall. Like a clever little trick, right?
Starting point is 00:27:36 That was the first time that I thought, this is why I'm working on AI, the ability to learn new knowledge. That's how AIs can truly help us get out of all the messes that we're in from climate change to what we have to do with agriculture and what we have to do in transportation, right? They can teach us new knowledge like great scientists and researchers. Yeah, they can shatter paradigms, right?
Starting point is 00:28:00 Because we all get stuck in thought paradigms. You know, medicine works this way, diagnose, treat, whatever paradigm it is. The thing outside of it is so hard for us, but for a machine, it's not. And it can actually be creative. So this is the cool thing, is that it can discover new knowledge.
Starting point is 00:28:18 It can see outside of the box and it can be genuinely inventive. Everyone's probably seen these image generation models these days, like DALL-E and other ones. Yes, it's crazy. It's wild. Probably more than you can imagine can now be manifested. I think that's all creativity is.
Starting point is 00:28:34 It's just combining multiple different ideas in novel ways to produce something unique. Okay, that kind of fast-forwards us to one point about AI. When it's explained to us, I think we discredit the fact that we too operate almost in an identical way. So a current argument right now, and this is dangerous for me to say because I'm in the union and I am supportive of the WGA, but... And SAG. And SAG. The example I'm giving is about writing.
Starting point is 00:29:01 Got it. Joseph Gordon-Levitt, wonderful actor, really smart guy, wrote this incredibly well thought out piece about this issue and said that the owners of the content that are fed to the AI, let's say if it's to write a script and you feed it 600 scripts from brilliant writers that have worked in the past, that there should be a royalty paid on that original work. Which is a very sound argument. But I do have to point out, I, as an artist, I too have been trying to replicate all my favorite movies. When I sit down to write a movie, I'm informed by Michael Mann in Heat. I'm informed by Pulp Fiction, it's my favorite movie. I like this, I like that. I am an AI. I am cashing in on all the info I took in and liked, and now I synthesize my version of it. So it's just a curious situation where I shouldn't be asked to pay a royalty to Quentin Tarantino if I write something inspired by him, but the computer should. I feel like there's some weakness in that argument.
Starting point is 00:30:02 So first of all, the strength in the argument, which I broadly agree with, is provided the AI isn't regurgitating word for word the paragraph of copyrighted text, that would be plagiarizing, right? But it seems that most of the time these models are not regurgitating that word for word. They're really inventing something new. They're finding the space between two ideas.
Starting point is 00:30:27 Interpolation, it's called. That seems to me just like anything that we would normally do as humans. That's how we're creative. You might like the pattern of that sofa and you might go, hmm, that's cool. I want to see that on a jacket and you take it with you.
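Since interpolation gets named here, a minimal sketch of what it means mechanically, with made-up numbers: generative models represent ideas as vectors, and "the space between two ideas" is literally the set of weighted averages between two such vectors. The values below are invented for illustration; in a real model they would come from the model's encoder.

```python
import numpy as np

# Two "ideas" as embedding vectors (hypothetical numbers, not from any real model).
sofa_pattern = np.array([0.9, 0.1, 0.4, 0.7])
jacket = np.array([0.2, 0.8, 0.6, 0.1])

def interpolate(a, b, t):
    """Linear interpolation: t=0 returns a, t=1 returns b, values in between blend them."""
    return (1 - t) * a + t * b

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(t, interpolate(sofa_pattern, jacket, t))
```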
Starting point is 00:30:38 I'll be more literal. I pitched this movie to Warner Brothers. I said, I want to remake the TV show Chips, but I want to do it Chips meets Lethal Weapon, so it's hard R, the action's intense, and it has comedy. I mean, I'm literally saying marry Chips and Lethal Weapon together. That's what I'm going to try to do. But didn't you have to pay some sort of royalty to original Chips? Well, Chips, because we're using the actual intellectual property in the name. Yeah. But I could have also gone in and said, I got this movie idea. It's called Bike Cops.
Starting point is 00:31:05 Imagine chips plus lethal weapon. And most pitches are sold that way. So it's Midnight Run meets blank. But Midnight Run has never gotten any royalties from any of these pitches from these humans. So it's just curious. It's okay if humans don't pay royalty, but we want the machines to. And it's also exactly the same thing with apps. The number of times you hear a startup founder being like, I'm going to do Uber for food delivery or I'm going to do it for this, I'm going to do it for that.
Starting point is 00:31:29 This is how we create. Yes. Stay tuned for more Armchair Expert, if you dare. Sasha hated sand, the way it stuck to things for weeks. So when Maddie shared a surf trip on Expedia Trip Planner, he hesitated. Then he added a hotel with a cliffside pool to the plan. And they both spent the week in the water. You were made to follow your whims.
Starting point is 00:31:59 We were made to help find a place on the beach with a pool and a waterfall and a soaking tub and, of course, a great shower. Expedia. Made to travel. Order up for Damien. Hey, how did your doctor's appointment go, by the way? Did you ask about Rebelsis? Actually, I'm seeing my doctor later today. Did you say Rebelsis? My dad's been talking about Rebelsis. Rebelsis? Really? Yeah, he says it's a pill that...
Starting point is 00:32:26 Well, I'll definitely be asking my doctor if Rebelsis is right for me. Rebelsis. Ask your doctor or visit Rebelsis.ca. Order up for Rebelsis. This episode is brought to you by Tresemme. Want silky smooth hair that's still full of natural movement? The Tresemme Keratin Smooth Weightless Collection is your simple solution. This new collection features a wide range of products from nourishing shampoo and conditioner to lightweight heat protectants and a silky smooth serum for a sleek finish. Wave goodbye to frizz and say hello to three days of smooth hair
Starting point is 00:33:05 with the Tresemme Keratin Smooth Weightless Collection. Visit tresemme.com to learn more. Okay, you work there for a while. You see this obviously just grow in its ability. You have to be dazzled almost monthly because it itself becomes a bit exponential, doesn't it? Because it learns from its learning. Right.
Starting point is 00:33:33 And so it's just accelerating at all times. Is the pace of it at any point while you're observing it? When do you start getting nervous or apprehensive about it? I think I've been nervous about it since the day we founded the company. That's the only honest and wise way to approach technologies that are as fundamental as intelligence itself.
Starting point is 00:33:57 We've literally just been describing something that if you swapped out the human for the machine and you replayed this conversation, you couldn't really tell, were we talking about the machine there or were we talking out the human for the machine and you replayed this conversation, you couldn't really tell, were we talking about the machine there or were we talking about the human? You can put a kid in front of an Atari game and he will master it.
Starting point is 00:34:12 We've seen it happen. And he does it through remembering what things worked and what didn't. So we're already at a place where we're taking the thing that has made us unique as a species, our intelligence, this ability to plan and imagine, create, adapt,
Starting point is 00:34:26 and invent new things, communicate perfectly in language. Language is a technology. We now have another type of input to that technology, which is the machine, able to use language. So it's always been top of mind for me. I mean, it's why we framed the company around ethics and safety from day one. And when we were acquired by Google, we actually made it a condition of the acquisition that we have an ethics and safety oversight board with independent members and a charter of ethics that governed all the technology that we give to Google in perpetuity.
Starting point is 00:34:59 And two of those red lines were that it could never be used for military purposes and never be used for state surveillance. That's still in operation now. But as we'll get into, even a well-intentioned, seemingly bulletproof statement like that can be hacked by AI. And we'll get into how you couldn't prevent the machine from getting racist. There's all these things you can't account for. And that's in your book, The Coming Wave. There's like a lot of almost impossible to predict things. Right.
Starting point is 00:35:25 And that has been the story of the last decade. The progress has been eye-watering. It was 2014 when we were acquired by Google and we became Google DeepMind, part of the Google ecosystem. And in that year, when we trained the Atari model, that used something like two petaflops of computation. So two billion million computations, which sounds like a lot, but it's actually relatively small.
Starting point is 00:35:51 Two billion million. Two million billion. That's like when a little kid is like, it's a billion million. Yeah, yeah, yeah, exactly. Like a made up. A thousand billion hundred. Totally a made up number. It's called a petaflop.
Starting point is 00:36:03 So you can go and like, can go and drop that on someone. Every year since then, the cutting-edge models in AI have used 10 times more compute than the previous year. So for 10 years, it's gone 10x. Oh, my God. So to your question about am I surprised, am I amazed, I mean, it's totally mind-blowing. Well, 10 to the 10th powers,
Starting point is 00:36:25 I don't even know what that number is. It's kabillions. Exactly. That's a good number. It's called 10 billion million billion flops. So it's gone from 2 to 10 billion. 10 billion million billion flop billions. Some insane numbers.
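For what it's worth, the arithmetic behind the joke is easy to check. A quick sketch using the round numbers as quoted in the conversation; these are the conversational figures, not official ones.

```python
# Rough arithmetic on the numbers quoted above, not official figures.
start_ops = 2e15          # "two petaflops" of total computation for the 2014 Atari work
growth_per_year = 10      # "10 times more compute than the previous year"
years = 10

total_ops = start_ops * growth_per_year ** years
print(f"{total_ops:.0e}")  # 2e+25, roughly the "10 billion million billion" being joked about
```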
Starting point is 00:36:42 You can't even graph that. There's definitely not enough room on the page to list all the zeros. Scale has been the thing that has transformed this. The models are getting bigger. They're consuming more data. They consume insane amounts of computation. That is starting to look like a brain-like structure in terms of the number of neurons we have in the human brain.
Starting point is 00:37:03 Which we have trillions. We have about 100 trillion. 100 trillion. Connections in the brain. And so these models at the moment are roughly on the order of about 1 trillion. Okay. So there's a 100x difference between the models and the brain. But as you just said, we're moving...
Starting point is 00:37:17 Which would just be a year or two. Yeah. Could be a couple of years. That's right. To at least match the amount of neurons in the brain. Just to be clear, that's not going to mean a human-like performance. Right. But it's just a crude, rough measure. And really early into this acquisition, you started using the DeepMind technology to address energy consumption for cooling Google's data centers. And the AI itself looked at this and
Starting point is 00:37:42 figured out how to reduce it by 40% for cooling. Yeah, that was a crazy project. I mean, the Google data center infrastructure spends tens of billions of dollars a year. I was going to say, I have to imagine the very best engineers on planet Earth designed the Google cooling center. Literally. Whoever the very best person in thermodynamics or whatever, I'm sure they have him or her. Absolutely. And they were very resistant
Starting point is 00:38:05 to cooperating with us at the beginning. They were like, we're the best systems engineers, mechanical engineers, industrial engineers on the planet. So it took us about six months to cozy up to them and persuade them to give us a shot
Starting point is 00:38:17 at doing this. And we basically had the AI look at all the historical data, five years worth of mechanical set points, like how fast is a fan going? Does it turn on at this temperature or that temperature? Exactly. What combination of fans to use?
Starting point is 00:38:32 Because there's like a motherboard fan, there's a chassis fan, there's corridor fans. And these are like the size of three or four football pitches, these data centers, right? You look down them and you can't see the end. And they're usually underground? They can be overground, but they're almost always near hydroelectric power or solar power. Cheap energy is the number one priority.
Starting point is 00:38:51 Right. Wow. Three football fields. I know, it's insane. I mean, it takes you like 20 minutes, half an hour to walk down the thing. And you have to wear headphones because it's like screechingly loud. Really? And of course, freezing.
Starting point is 00:39:03 So in looking at all this data, how long did it take the AI to figure out a system that would reduce it by 40%? Well, it took us about a year of tinkering and experimentation because obviously we got it wrong a lot. And in the first deployment, it got about a 30% reduction. So we were pretty blown away by that. And then what we were doing, because obviously they didn't want to let the AI loose
Starting point is 00:39:26 on the actual hardware, the AI would basically give a set of recommendations to the human data center controller. And you guys would implement the recommendations? Yeah, and then the human would be like, yeah, that seems sensible, I'll adjust it this level, that level. Because then what we saw after like three or four months of operation,
Starting point is 00:39:42 the human was just accepting the AI's recommendations like 95% of the time. So instead of having a 15-minute gap, we were like, well, if we have it as a real-time control, it'll just update the set points every 30 seconds or every minute. And you get like a huge efficiency just from that, which just shows you how kind of inefficient humans are in general. So that gave another 10%.
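A highly simplified, hypothetical sketch of the shift being described: from a model that only recommends set points for a human to approve every 15 minutes or so, to the same model writing them directly on a 30-second cycle. None of these function names correspond to a real Google system; the "model" here is a trivial placeholder rule.

```python
import time

def read_telemetry():
    """Placeholder sensor read (temperatures, loads, etc.)."""
    return {"temp_c": 23.5}

def model_recommend(telemetry):
    """Stand-in for the learned model: map readings to cooling set points."""
    return {"fan_speed": 0.5 if telemetry["temp_c"] < 24 else 0.8}

def apply_setpoints(setpoints):
    """Placeholder actuator write."""
    print("applying", setpoints)

# Phase 1 (as described): the model recommends, a human accepts, roughly every 15 minutes.
# Phase 2: once acceptance ran ~95%, close the loop and update every 30-60 seconds.
def closed_loop(interval_seconds=30, iterations=3):
    for _ in range(iterations):
        apply_setpoints(model_recommend(read_telemetry()))  # no human in the loop
        time.sleep(interval_seconds)

closed_loop(interval_seconds=1)  # interval shortened so the demo finishes quickly
```

Shortening the approval gap from minutes to seconds is where the extra 10% described above came from: the same recommendations, just applied continuously.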
Starting point is 00:40:03 We have limits of conscientiousness. I mean, I don't know why, but just recently I've been thinking like, you know, it's kind of the number one thing you would want in an employee is conscientiousness. The computer is just the most conscientious thing in the world. It can't deviate from whatever it's supposed to do, right? I think this is another reason for us to be really optimistic about the future because we want the machines to produce a consistently fair and just outcome across the board. Obviously, the challenge is that we have to make sure we program them in that way
Starting point is 00:40:33 and that we can hold them accountable, keep them controlled. So there's that challenge. But in theory, it should mean that unlike a judge that gets tired after lunch. Sentences you before or after lunch. Right, yeah, there's a huge swing. It's like three years difference or something crazy. You see it all the time. We're guilty of
Starting point is 00:40:50 it individually. And also we walk into a place and we know they're supposed to do that, but they don't want to and they don't do it. Emotions. Yeah. Emotions and energy levels and distractions and personal problems. You think about it, if you ever visit a relative in a hospital or something, and you see all the different people in different parts of the ward, and the people who are quiet and not very pushy and demanding of nurse practitioner care, you have to think how uneven that treatment is as a result of somebody having a really active patient advocate member of their family to help manage their care and stuff. And that is going to end up in worse care for some people who really deserve better. And so that's why I'm excited about the kind of fairness side of AI. Yeah. I will add that you left Google in 2022. People
Starting point is 00:41:35 will be mad if I don't mention you had allegations of bullying people and you left DeepMind. Right. You bullied people on email, maybe? I can be super hard charging and very demanding. It was five years ago and I've learned a lot from that. Okay, wonderful. Moving on. So you leave DeepMind and you go to Google proper and then you leave there in 2022 and then you found with Reid Hoffman, who we've had on and we adore Reid, Inflection AI. And in 2023, you guys introduced Pi, which is a chat bot. Now I'd love to get into some of the things that your book is warning us about. I don't know that it should be optimistic first
Starting point is 00:42:11 or pessimistic first, but I imagine a lot of this will file into the very astute observation of Tristan Harris, which is social media is dystopia and utopia. It's all things. So I think AI too will be all things. There'll be like these incredible, miraculous breakthroughs.
Starting point is 00:42:28 And then there'll be some really dangerous things that come along with it because there are bad actors in the world. So let's first look at what's coming. You have a wonderful list in the book, but if you want to hit me with some of the ones that you find personally the most interesting, and I have a few that I think seem so exciting.
Starting point is 00:42:46 Yeah, I mean, I think this is going to be the greatest force amplifier in history. So you're right that it's going to amplify the bad as much as it will amplify the good. And I think the sooner we just accept that that part of the equation is inevitable, the quicker we can start to adapt. Because this is about adaptation and mitigation. What boundaries and constraints and guardrails can we put around the technology so that we get the most out of it,
Starting point is 00:43:14 but so that it always remains accountable to us and that it doesn't end up causing more harm than good. I think there is a low chance that it ends up causing more harm than good because I think that people are starting to realize how significant this moment is. And they're starting to get involved. I mean, you mentioned the writer's strike. I mean, that's the tip of the iceberg. Yeah, this is just the first opening salvo, it's going to be the greatest force amplifier. And so that is likely to cause huge chaos and instability. It's going to come from two different angles.
Starting point is 00:43:46 On the one hand, there's going to be these super giant models that the really big companies build. We're one of them. We can talk about that for Pi. But it's also going to be open source, meaning anyone can make it, create it, recombine those ideas. So as we've seen from things like stable diffusion. What's that?
Starting point is 00:44:04 Stable Diffusion is an image generation model, which is like OpenAI's DALL-E, but it's entirely open source. So the actual code to run that model is available to anybody on the internet to adapt it, play with it, improve it. You can actually take the entire model and run it on a laptop and it can produce photorealistic, super high quality images with just a few sentences of instruction, zero technical effort or expertise required. Wow. My mind just immediately went to something
Starting point is 00:44:36 pornographic. People are probably creating like... Well, that's a whole deep fakes thing, right? Yeah. Is that AI generated? Those deep fakes? Those deep fakes are AI generated. Right, so they're probably, you know, imagining their favorite podcast host, probably me. You wish. I know, that's open source. Anyone wants to do that? Kristen has talked about this because her face has been put on some porn stuff and it's crazy. Ashton sent that over to us.
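For context on "run it on a laptop": with open-source weights, text-to-image really is only a few lines these days. A hedged sketch using the Hugging Face diffusers library; the checkpoint name is just one commonly used public Stable Diffusion release, and this assumes the library and model weights are installed locally.

```python
# Minimal local text-to-image sketch with an open-source model.
# Assumes: pip install diffusers transformers torch (and enough RAM; a GPU helps a lot).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("cuda" if torch.cuda.is_available() else "cpu")

prompt = "an antique shop full of strange curiosities, photorealistic"
image = pipe(prompt).images[0]   # one sentence of instruction in, an image out
image.save("curiosity_shop.png")
```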
Starting point is 00:45:04 Okay, well, I guess he is at the forefront of this tech stuff. He's an investor in inflection, actually. Oh, he is? Okay. He would be, of course. Okay, so you can run it on your laptop. Now, when you generate, we get into kind of copyright authorship. Like, if you give it the five commands
Starting point is 00:45:22 that generate this image, is that your image? I think that the idea of ownership is going to start to fall away. Interesting. And obviously that's controversial. I'm not proposing this outcome. I'm describing what I see. Don't shoot the messenger. I believe that as far as I can see, my best prediction is that over the next five to ten years or so,
Starting point is 00:45:46 text, video, audio, imagery, it is just going to be produced at like zero marginal cost. It's going to cost almost nothing. And it's going to be super easy to edit it, manipulate it, improve it within just one sentence. And that's going to be an unbelievable explosion of creativity, because now the barrier to entry is lower than it's ever been. I mean, think about what has happened in the last 20 years. Now that everyone has a smartphone camera, they're basically all directors. We now have hundreds of millions of professional directors producing incredibly engaging content, so much that we all want to spend time on TikTok. We're leaving YouTube behind and YouTube left TV behind. And that is what happens when you get access to a new technology. Like everybody suddenly gets wildly creative and inventive. And I think that's the trajectory that we're on from
Starting point is 00:46:35 here. And just the volume will be so vast that copyright tracking down would be impossible. Yeah, a little bit like on YouTube. Other than for the very sort of elite, high-grade feature film stuff, there isn't really a concept of you owning your data. Like you made a video on YouTube five years ago, 10 years ago, and someone basically completely ripped it off. I mean, that is the definition of the meme space.
Starting point is 00:46:58 A meme is an evolving idea manifested in images and video. TikTok has set fire to that such that it's happening minute by minute. You put a cool video out and then the next thing you know, someone has basically made exactly the same video a few degrees to the right with a slight shift in color or style or something
Starting point is 00:47:18 and mashed it with someone else's direction. And so you're seeing that memetic evolution happen in hyper real time. Well, and it also makes sense that people's attachment to things that took years to create are going to be different. Let's say the David. I don't know how long it took him to chisel that out of the block of marble, but presumably a long, long time.
Starting point is 00:47:39 And so his attachment to it is, this represents a year or two of my life. Whereas if he was able to make 10 Davids a day, he couldn't possibly have that same attachment or that sense of ownership or the sense of anything. So maybe part of it is just like quantity and investment. The investment is so much smaller and therefore your sense of what you're owed from it probably diminishes as well a little bit.
Starting point is 00:48:02 Totally. I mean, we've become overly attached to the craft, whereas really the value is in the ideas and the concepts. Now, I think we're moving to a stage where the value is in the curator. It's the edit. It's the judgment to reduce, to scale back, to simplify. It's the shortlist,
Starting point is 00:48:20 because everybody is going to have the power to produce really good and interesting creative content. I mean, that's the thing that blows me away every time I use TikTok. I mean, the variety is insane. And what TikTok is kind of doing is curating the feed. That's actually where there's a huge amount of value. So high quality productions in the future are really going to be about the edit, taking away the extra and simplifying. I think that's the kind of thing that would take off. Yeah, I think for people like myself, who I would say is a tradesman or a craftsman, I know how to write screenplays. I've spent a huge chunk of my life investing in that ability. And I spent a huge chunk of my life refining my acting and everything else.
Starting point is 00:49:03 I think what happens with this technology is it kind of democratizes and gives everybody access to executing their ideas. And I personally feel like bullshit, go sit in a room for five days. If you want the result, you go put in the time. And I think a lot of us who have dedicated our lives to these things to see someone get an idea like we have, but they don't have to do anything beyond that to execute it. It feels very threatening. It feels like it erases talent. Talent is about to become or is becoming irrelevant. Which is interesting because I think as humans and humanists,
Starting point is 00:49:35 we would probably go like, well, yeah, we would want everyone to feel creative and be able to create and we wouldn't want barriers. I don't feel that. I don't either. But I'm saying it probably butts up against some of our ideals. Well, you know, I think humans marvel at talent. That's why we love the Olympics.
Starting point is 00:49:52 Because it's like, how is it possible that someone was born like me can do that? Or someone could write this movie or direct everything ever all at once. Like, how did somebody do that? When you take away the person, what's going to to happen are we going to be excited anymore about anything or is it just like well yeah duh computer thought of that because a computer can think of anything i think it's a really interesting question we're about to run the experiment yeah we will know find out in my lifetime we will find out in 10 years my instinct is that I think we tend to feel arbitrarily precious about the thing that we've previously been invested in. Understandably, it's an emotional attachment. It's an identity.
Starting point is 00:50:34 And that identity is a product of you having slogged away for like years honing your craft, only to see it now available to absolutely everybody. But then the flip side of that narrative would be, well, what about all the people that didn't get the privilege and opportunity to be able to get through that training, to make those connections, to be able to make it in the industry? Think about how much arbitrary luck or even nepotism there is that enables people to get access.
Starting point is 00:51:02 And so much as we're displacing people by the democratization of access to these tools, we're also giving other people the potential to be massively uplifted. I mean, think of all the YouTube stars, TikTok stars that have emerged out in the middle of nowhere that never were on the scene or had connections. I do think that tends to default to producing more creativity in aggregate overall. And it just sort of reshuffles the chips of who basically has power today and basically makes everything much more competitive. Yeah, but there also is like a really huge global thought or question. A literal analogy I can give is that I had a year in my life
Starting point is 00:51:38 where I came into more money than I thought I would ever have. I started doing all these exceptional things I always dreamt of. Maybe I'll take a helicopter to this race. A year into this experience, I was doing something spectacular again. And I was like, I don't even care about this. I don't care about it because I've done 13 other cool things of this magnitude. And I'm now ruining everything with the greatness that's at my fingertips. And I have to reign this in and police myself and keep things special and rare. So I think there's also a potential for it's like every movie you see is Pulp Fiction. What does that do to your overall appetite for anything?
Starting point is 00:52:16 If everything's perfect and great. Yeah, there's nothing to compare one thing to another. What's good anymore? If it's all good. Yes. What is good? Is half of what we like about something good is in fact how rare it is
Starting point is 00:52:29 and the dopamine hit of experiencing something novel. So we could end up with 6 million perfect movies a year in the theater, but we might find that we don't give a fuck soon as they're all perfect. I think that's a great point. And actually, I think that's quite likely. We're already trending in that direction.
Starting point is 00:52:45 We're becoming completely overwhelmed with access to information. Where are the secrets today? Where are the vacation spots that no one's ever heard about? Where's that coffee shop that you heard from a friend of a friend that that was a place that you should go to?
Starting point is 00:52:59 I mean, this is like 20, 30 years ago. So we've learned to reproduce culture so quickly and in such a polished kind of semi-artificial way. Like, you know, you go to a restaurant, so few restaurants are clearly authentically the culture that they are and they've been there 20, 30 years. And then some you might stumble across like an old Italian place and you're like, wow, this really hasn't changed for 20 years. The Altoids. Right, exactly. The Altoids. They're not better than other mints, but there's something special about that old package.
Starting point is 00:53:30 You should be definitely getting a cup. I mean, they are not sponsored yet. Send them in. Their ad budget is enormous. Anywho, you do think that's already kind of happening. I think we're becoming desensitized
Starting point is 00:53:41 to the variety of content that we experience everywhere of every form. So now that we've overwhelmed with this information, it is just hard to discern what is cool, what is good, what is interesting, what do I like? Because the volume we're seeing is like 100x what I would have consumed 10 years ago. I'm now seeing that in a week. Also, how's anyone going to make money? I don't understand. I'm now seeing that in a week.
Starting point is 00:54:03 Also, how's anyone going to make money? I don't understand. If we can produce 100 dollies, like if I can, and then you, what's going to happen to money? Explain that to us. The marketplace. How are people going to make money? Yeah, I mean, look, hyper competition has that tendency, right? Because if we reduce the means of production to zero marginal cost, right? So the inputs become much cheaper.
Starting point is 00:54:25 You don't have to rent a studio. You don't have to go hire a whole ton of extra actors. You don't have to go into post-production and do all the coloring because there's filtering. You can just take an auto-tuned voice off the library or off a shelf. I mean, we've seen that trajectory for the last decade and it's now going to get compressed into hyper real time.
Starting point is 00:54:43 The utopian story is that we're going to be more creative and yeah we might need some more curators and editors but the other question is what on earth are we going to do i basically said from the beginning we should expect significant taxation to fund significant redistribution because over a 20 year period these models really do make production much, much more efficient and much more accurate. I mean, it is a race against the machine. And human innovation and evolution, independent of machines, isn't going to move fast enough to be able to be better. So you had originally been a part of the artificial general intelligence, which you talked about a little bit. of the artificial general intelligence, which she talked about a little bit,
Starting point is 00:55:23 but you have since introduced this term, artificial capable intelligence, which is kind of a midway point between AI and AGI. So tell us how you would define artificial capable intelligence. Because I feel like it funnels into the question about money. It's exactly connected to that. So for years, there's been this idea of a Turing test.
Starting point is 00:55:42 What is that? Yeah, I never heard that. Alan Turing. Alan Turing, one of the earliest computer scientists. Benedict Cumberbatch. Yes, exactly. Hot guy. I'll remember it if it has Benedict Cumberbatch in it.
Starting point is 00:55:55 We should have Brad Pitt teaching all tech. Oh my god, what a hack. Everything's the Brad Pitt principle. Oh my god. I like this. That's why Cillian Murphy was such an amazing cast. Jesus. Who doesn't want to watch?
Starting point is 00:56:08 Yeah, Oppenheimer V looks like Cillian. Perfect looking. The Turing Test was an imitation game that Alan Turing invented or proposed 60 years ago. And it basically said, if you could teach a machine to speak as well as a human, and it could deceive another human into thinking that it was in fact human and not a machine to speak as well as a human, and it could deceive another human into thinking that it was in fact human and not a machine, then it would have succeeded in imitating human intelligence.
Starting point is 00:56:33 That was the grand Turing test. And now that we have these language models, Pi and ChatGPT, it's pretty clear that we're quite close to beating the Turing test. And yet we've got no idea if we're close to inventing intelligence at all. I mean, it's a version of intelligence, but it clearly isn't the full picture. And so in the next few years, what I think is likely to happen is that we will pass a modern Turing test, which is where an AI is capable, it can do real things in the world, is capable, it can do real things in the world, right? It can learn to use tools, it can ask other humans to take actions on its behalf, and it can string together pretty complicated sequences of tasks around abstract goals. So for example, you could say to an AI, go off and make a million dollars by inventing a new product,
Starting point is 00:57:26 do your research online, figure out what people are into at the moment, go and generate a new image, go and negotiate with a manufacturer in China, discuss the blueprints with the manufacturer, get it drop shipped over, and then go and market it and promote it. And you said like you imagined giving it $100,000 to start with. It could start with a relatively
Starting point is 00:57:46 small amount of money. And I think quite quickly, it would be able to generate a lot more. And that would be for you, artificial capable intelligence that would signal that. It would, yeah,
Starting point is 00:57:57 because artificial general intelligence is a much more long-term speculative. That's the kind of super intelligence, Terminator and all the rest of it. This is a much more long-term speculative. That's the kind of super intelligence, Terminator and all the rest of it. This is a much more narrow focus on not just what can an AI say, but what can an AI do, right? Because that's what we really care about.
Starting point is 00:58:14 Right. It's funny while you were talking, I don't know if this is a breakthrough idea or if it's really basic, but I was thinking, weirdly, I might actually say a measure of intelligence would be acting logically without data. It's almost the opposite. It's that a human without any data can make a relatively
Starting point is 00:58:33 sound good logical decision in any given moment by modeling purely from their imagination and perhaps no data. Extrapolation is a critical skill that these machines currently don't have, which is without any context, can you figure out what's required in a certain setting without any prompting? Without any patterns to observe to then model onto it. Exactly. The moment they're mostly reproducing known patterns. Right. And it turns out you can go really, really far just by looking at known patterns. Right. And it turns out you can go really, really far just by looking at known patterns. Yeah.
Starting point is 00:59:07 But that's humans too. Normally, I think it's because you've experienced something you can at least connect to that's similar in some way, or you've seen it in a movie, or you've read it. I agree, but if you look at babies, babies have an intelligence.
Starting point is 00:59:21 They're born with an intelligence. They can make decisions, and they don't really have a data set to draw from. Yeah, that's true. And I think we often are in situations with almost no comps and we intuitively know what to do. Well, another way of thinking about that is that we are in the millionth generation of the evolution of that mind. So, although the baby generation of the evolution of that mind. So although the baby appears to learn something on the spot that's really novel, in fact, we're all one species. 65 million years of mammalian evolution that we're sharing lots of our genetic code with.
Starting point is 00:59:57 Right. And in some ways, these new AIs, these large language models are only in the 10th or 20th generation. As we make progress in a certain area and we publish academic papers on it and they get peer reviewed, the knowledge and insight then gets passed on to the next group of developers and other teams. And when you see that, oh, those guys have done something really interesting, you try and copy it. And so then that gets incorporated into the models next. You're just building endlessly. Once we have Pythagorean's theorem, it gets implemented in everything going forward. Yeah, and that's how we evolve knowledge. We're inventing new ideas.
Starting point is 01:00:31 It becomes part of the established status quo. I mean, think about how many things seemed normal 30 years ago that are now absurd. Not just our cultural positions on really important topics, but doctors used to be the biggest smokers of all. Sure. You know, it was amazing how that kind of shit.
Starting point is 01:00:47 They endorsed many different brands of cigarettes over the years. People collect these wonderful ads for Camel. Eight out of 10 doctors smoke Camel. We left out a couple of things that I just want to say that are really exciting. Because again, we're already kind of getting into the negative things. But abundant energy is something, obviously, that just like it figured out this 40% reduction and what it'll do to management of a nuclear facility, like the ones Bill Gates is behind, you know, that's incredible.
Starting point is 01:01:14 We left out the synthetic biology, which is going to be working in concert with this AI, where we can read, edit and create and print DNA. Oh my god that's bonkers viruses that produce batteries what proteins that purify contaminated water carbon scrubbing algae toxic waste into bio factories like these are very exciting and could heal the planet whatever our concern is about whether we're going to write movies or not you, there's also some big ticket items that we might have to like surrender whether we drive semis long distance in order to save the planet. We might think that's a trade-off that's worth it.
Starting point is 01:01:52 We have to extract carbon from the atmosphere. Those algae or the kelp farms that you talk about, that's an incredible upside of this experimentation. Imagine being able to absorb all of this excess carbon that we really need to remove as quickly as possible. I honestly think that in the next 20 years, energy is going to become an order of magnitude cheaper than it currently is. And we are on an incredible trajectory. Renewables are the quiet hero of the last 20 years. Costs of solar are going through the floor.
Starting point is 01:02:21 of the last 20 years. Costs of solar are going through the floor. Hydroelectric power is now a huge percentage of the energy mix. I think that we get a bit stuck in the negativity and the downside of things when, in fact, this is exactly what we need to be making progress on. Also like running core government services. You know, you imagine a government
Starting point is 01:02:41 without any outside financial manipulation, no corruption whatsoever. These are pretty pleasing thoughts. And obviously those will be the upside. Now, we should address some of the things that could be very dangerous about it. There could be some major threats to government stability. I think also we've not talked about it, but it's the point I always end up coming to when I'm debating with somebody who basically wants to pull the plug on everything. And I go, I'd be for that if we could truly get Russia to agree to that and China.
Starting point is 01:03:15 We have to acknowledge we're in an arms race with this technology. That's a fact. So who do you want to have the lead? Do I want North Korea to have the lead? I personally don't. So you start working backwards from the reality of can we afford to not be out in front? I don't think we can. Do you have an opinion on that?
Starting point is 01:03:34 The technology itself is not going to kill us. It's going to be the mishandling of the technology by our own governments or by a bunch of crazies, bad actors. Now, that is a manageable downside. It is not a reason for us to panic and pull the plug. It's a reason for us to be responsible and conscious and proactive and start having an adult debate about it right now. Because too often you hear extremists on both sides. I hear these people who are just like,
Starting point is 01:04:06 fuck it, we should just be charging ahead. Equally insane to be like, right, I'm done, pull the plug, Luddite story. It's just maybe too clickbaity and you end up just reading or seeing people advocating for one or other. They're the most exciting arguments to listen to. Moderation and pragmatism isn't the sexiest.
Starting point is 01:04:24 Right, and it's actually just a much more dry and boring to listen to. Moderation and rationality. Pridentism isn't the sexiest. Right. And it's actually just much more dry and boring to sort of fumble our way through a way of just making it work, which I think is possible. Stay tuned for more
Starting point is 01:04:36 Armchair Expert. If you dare. Whether you want adventure or predictability, Super All-Wheel Control from Mitsubishi lets you drive like you want. It's like all-wheel drive, but smarter. Using real-time data, it integrates all the information and sends instructions to each wheel all the time.
Starting point is 01:04:56 It multitasks to deliver the perfect drive, smooth handling and control, exceptional comfort, and superior traction. Super All-Wheel control from Mitsubishi. It's the real definition of control. Visit Mitsubishi-Motors.ca to learn more. Ooh, French lavender soy blend candle. I told you HomeSense has good gift options. Hmm, well, I don't know.
Starting point is 01:05:21 Mom's going to love it. She'll take one sniff and be transported to that anniversary trip you took to San Tropez a few years ago. Forget it. She complained about her sunburn the whole trip. It's only $14. $14? Now that's a vacation I can get behind. Deal so good, everyone approves.
Starting point is 01:05:40 Only at HomeSense. For just $4.99, you can get a Subway 6-inch Black Forest ham sub made with our new fresh-sliced deli. But the fresh slicing doesn't stop at beautiful Black Forest ham. We're talking tantalizing turkey, perfectly piled pepperoni, sensationally sliced salami, so you can lunch legendary, dinner deliciously, breakfast brilliantly. We're talking friggin' fresh slicing, and I'm yelling yes way!
Starting point is 01:06:05 Get a 6-inch Black Forest ham for only $4.99. Only at Subway. Price and participation may vary. Extras, taxes, and delivery additional. Expires April 8th. Well, okay, so you have a 10-step plan for the future that can help us avoid these kind of pitfalls. One of the things I want you to explain to us is containment. for human control. Long term, we want to make sure that we're able to understand what we're building, that there are guardrails that are built into all of these systems by design, and that the
Starting point is 01:06:52 emergent effects can be predicted or at least to the best of our ability accounted for. Because in past waves of technology, if they come too quickly, then society doesn't really have time to adapt and update. Whereas if you look at things like the car, it's actually been a pretty incredible track record of safety on every front, from seatbelts to airbags. Non-shattering glass. Yeah, like there's so many small innovations and a huge amount of licensing, like driver training, and you have parking tickets, and there's all these test standards. That is a huge success story. Obviously, it's a tragedy that some people still lose their lives, but net-net, that is an incredible benefit. Likewise, with flight aviation, we very early on established that there must be a black box recorder
Starting point is 01:07:43 tracking everything that the pilots say, tracking all the telemetry on board the aircraft, and sharing that with a centralized body, the FAA or equivalent, that can review that information and share it with competitors where there's like a known weakness or a known fault. Right, because it would benefit Boeing
Starting point is 01:08:02 to have Airbus not have a solution to their air safety. As a competitor, not that they're evil and they would want that, but surely a competitive advantage. The FAA says not a chance. Right, that's a sign of good regulation. There's too much regulation bashing, particularly in this country. I think that Europe's a little bit more open to it. People are just so afraid of it.
Starting point is 01:08:23 It's like, come on, let's not throw the baby out with the bathwater. It works quite often. Look at a country that just had an earthquake that doesn't have building regulations. Right. Turkey, for example. I mean, that was a complete catastrophe. Many, many buildings completely just sank to the ground. Because this evil force government oversight wasn't there.
Starting point is 01:08:43 Right. Okay, so containment. Now, here's my question. There is a bit of a paradoxical thought. First of all, it just hit me. Dead on, you got to be flush. Okay. Bit of Robert Downey.
Starting point is 01:08:55 Oh, wow. Yes, yes, yes, yes. It hit me in the eyes there. There was a moment where I thought I was looking at Downey. Okay, that's not the paradox. This is always part of our show. I was like, is this the paradox? I'm an AI.
Starting point is 01:09:07 I see patterns. I'm like, oh, some of the metrics are there. On the very surface, we'd have to say it's a little bit paradoxical to first acknowledge, or at least I'm willing to acknowledge, we keep talking about it like, what if it gets as good as us? But it's going to get better than us. It's going to get better than us. It's going to get more intelligent than us. And so it seems a bit of a paradox that there will be an entity on
Starting point is 01:09:31 earth that's dumber than another entity that'll have control. I mean, just in its simplest thought, like chimps are not going to be ruling human society. They don't have the capacity. Whatever fucking clever workaround they thought they came up with will be smarter. So how do we address that most fundamental question that we will be expecting to be the dumber of the two yet be in control? Think about it like this. An AI isn't going to function in the same way that a human does. an anthropomorphism, like a projection of our human kind of emotional state, to assume that an AI by default must desire control. Now, that is just because we are a competitive species and we've lived and breathed evolutionary fight or flight for millions of years. Kill or be killed.
Starting point is 01:10:20 So the first thing we think, this thing's going to kill us. Sure. Why would it want us on Earth gobbling up resources? Totally. And actually, the way that we design these models is very far from that kind of approach, right? It's a completely different species. Some people may design them to have that kind of independent control. And that is one of the things where we would need regulation to stop those kinds of activities, that kind of research. Like you wouldn't want an AI that was inherently designed for complete autonomy. I think that's one of the things that would be pretty scary. And that's not what we should be pursuing. You wouldn't want an AI that could go off and create its own goals. It shouldn't just be allowed to just decide, well, today I'm going to work on cancer research,
Starting point is 01:11:02 and tomorrow I'm going to build a dam, and next week I'm going to work on cancer research and tomorrow I'm going to build a dam and next week I'm going to build a tank. Yeah, yeah, yeah. It's not free to do that. You know, likewise, recursive self-improvement. If the AI can look at its own code and update its own code independently of human oversight, that's an issue. That's, I guess, the thing I get fearful of. Mind you, I've heard Steven Pinker make a very similar argument, which I like, which is like, we're kind of trapped in our animal mindset. And we are anthropomorphizing this machine, giving it these animalistic things that it just doesn't have. But it's on us, though, to make sure we don't program it that way because it will have the ability, but we have to not allow it to have that ability. And what we just talked about with everything being open source and being very democratizing, right now it does require a company the size of yours, the size of Google.
Starting point is 01:11:51 But as the individual has access to all this, because you can hold Google accountable, you can hold Pi accountable. Holding Jerry accountable in Tulsa, I don't know how we do that. Well, it has to be illegal. I mean, it has to then be fully illegal to do that. This is part of the, you know, definition of containment that I'm sort of trying to popularize because I do think that the proliferation risk is the real risk here. We've evolved as a nation state system system a mechanism for holding centralized power accountable. That's what the state is. It says, you pay your taxes,
Starting point is 01:12:31 and in return, we'll have a monopoly over the use of violence and force, and we'll use that to keep the peace. That's just the basic rules of the state. Law and order, that is what everyone fundamentally cares about and should care about. Over the next 20 or 30 years, if these exponential cost curves continue and everything from synthetic biology to AI gets radically, radically cheaper,
Starting point is 01:12:56 I think it means that people become less dependent on the state. They should be able to generate power off-grid. They'll be able to grow crops that are resistant to disease and that require less water. And they won't need as much centralized support. They could maybe even have their own robot armies and so on and so forth. That proliferation of power
Starting point is 01:13:16 definitely represents a threat to the nation state. Now, just to be clear for anyone who is a supporter of open source, I am not saying that there aren't risks of centralized developers of AI like myself. This is not a ploy for me to say, I'm the trusted one, don't worry about me. Well, yeah, yeah, yeah, yeah.
Starting point is 01:13:32 Because that would be way better. It's all white college educated. At the end of the day, you get into who's really at the top of all this. And who should make those decisions? That is a problem because I have to be regulated. Pi has to be regulated. Google has to be regulated.
Starting point is 01:13:46 Microsoft, everyone else, just as much as the open source. Bill Gates is super in support of real regulation, shockingly. I think everyone is. Everyone is like, look, this is a time to have this conversation. We want to remain as humans at the top of the food chain forevermore. We're not trying to displace ourselves. The analogy I would use is it weirdly in my mind feels similar to the war on drugs in that we have only been successful at ending a single drug. And that was Quaaludes, very popular drug here in the US in the 70s and 80s. It was
Starting point is 01:14:20 like a benzo or a muscle relaxer. It chilled you out. There was only a single manufacturer of Quaaludes, and I think they were in Switzerland. And so finally, the FDA or whoever said, like, we're asking you to stop manufacturing that because no one can do it on their own. And then as we got into the crystal meth epidemic, it is also true that the base compound you need to make it in your bathtub, really only a few people make. I think both the facilities are in India. The patents are held by U.S. companies. But that, too, can't be made by an individual.
Starting point is 01:14:52 You need that precursor, that base thing. So we could have at any point decided we don't want meth as a problem anymore. But when you have weed, you'll never combat that. That's why I think we don't even try because you can grow it in your backyard. Anything that can be democratized, we're not going to have control over. That's what scares me about the full empowering of your average guy with a laptop. And I think that's the fundamental question of the next 20 or 30 years, because in every sense, power is getting smaller with a bigger reach. So think about those image models that we mentioned earlier.
Starting point is 01:15:27 They've been trained on all of the images available on the open web, but they've been compressed to a two gigabyte thumb drive. So in many ways, you're putting all of that knowledge and insight from the open web onto something that is moved around on a thumb drive. Now, the same trajectory is going to happen across all of these other areas, the knowledge and know-how to produce really high-quality crops, the knowledge and know-how to produce the very best doctor and clinician. Like in 10 years, just as we had all the images compressed to a thumb drive,
Starting point is 01:15:58 you're going to have all the medical expertise, all the very best doctors in the world, all their knowledge and experience compressed to a single model that provides an incredible diagnosis that is universally available to everybody, right? So that's the trajectory that we're on for basically every profession, for all knowledge, all intellectual capital, which is obviously amazing. But of course, how then do you keep the state together? How do you keep order in that kind of environment? We're just beginning to have that conversation.
Starting point is 01:16:30 Obviously, I don't have the answers, by the way. Don't let that be the next question, please. Yeah. Well, one of the 10 steps is national treaties. I wish I had more faith in these. I mean, if you just look at like the nuclear treaties and how willfully they were totally lied and ineffective in so many ways.
Starting point is 01:16:53 And those are visible from space. You know, we can see manufacturing of this. This other stuff is like imperceptible from the outside. So how would I come to trust that when Putin signs this, they'll actually stop working on this thing? Actually, nuclear gives a lot of hope. Oh, okay. We have had a lot of progress with nuclear nonproliferation.
Starting point is 01:17:17 It is kind of incredible that there are only seven nuclear powers in the world today. That's true. And in fact, the number of nuclear powers went down from, I believe, nine. So South Africa gave up nuclear weapons, Ukraine and one other country. That's an amazing story. Like in the 50s, everyone thought, okay, the whole world is going to get access to nuclear weapons. That was the mental model 50 years ago.
Starting point is 01:17:40 How did we do that? Well, we did it through traditional threats and incentives. There were economic incentives to participate in the global market. There were military threats that said you can't have access. There were licensing of expertise. So you basically said, if you've got a degree in nuclear physics, you have to register. We need to know who you are, where you are, how you work, who you work for. Wow. Interesting. Same story with chemical weapons, biological weapons, if you work on conventional missiles. But I would say biological weapons are failing. I think we just saw that. Now, look, I don't think we know for sure, but certainly even the New York Times is willing to consider with a 50% likelihood that COVID was coming from a lab in Wuhan. And we have treaties
Starting point is 01:18:28 against that. Completely. That really does make my blood boil. That's a different story, actually. That's a different failure of containment. That is the scientific research of gain of function, where in that lab, they were actively trying to improve the transmissibility of viruses. Yes. Deliberately. It's like out of a James Bond movie. It's totally nuts, right? But obviously, their goal was to try and make it more transmissible
Starting point is 01:18:54 so they could invent something that would counterattack it. One of the things that was alluded to is that they wanted to see more progress, that Xi Jinping, this lab was under all this pressure to show him some huge progress that they had been able to unleash and then control for his vanity, I guess. The Wuhan lab was actually funded or had a number of postdoctoral researchers and professors that were funded by the U.S. National Institute of Health, right? So this is a much more complicated story than a Chinese effort.
Starting point is 01:19:27 There are many gain-of-function research labs that are the same BSL-4 and 3 in the US and in Europe and in Australia. So there's lots of these labs around the world. The gain-of-function research is still continuing. They weren't the only people that were doing it and they were located there because of the proximity to some interesting bats and so on.
Starting point is 01:19:46 This is more of a culture of having more awareness about what scientific research is happening and why and who's funding it. And again, it's about people getting involved in the political process and really caring about the future of our planet and not thinking that science is off limits to people. You know, because I think a lot of people think, oh, too technical for me i can't be part of it i was just having that thought when you knew way more about that lab than i did and you started saying like acronyms and stuff i was like oh i'm out i don't really know shit about this i'm sorry because then we get to learn more information i was just relishing in the fact that i just had that thought two seconds ago. Oh yeah, that's out of my domain right there. Everyone needs to feel that they can get access to technology and science and it's not this kind of elusive. Well, this is your plea for transparency, right?
Starting point is 01:20:35 Exactly. Labs should be open about what they're working on, what progress they're making, what issues they've run into. And again, going back to what I said about the black box thing, there's a lot of simple lessons from previous eras in other domains that we should learn from. We should share the mistakes that we make. In 2016, I co-founded the Partnership on AI, which is this multi-stakeholder forum for sharing best practices between all the companies. We got all the companies to sign up. Apple, Google, Microsoft, IBM, Facebook, OpenAI, DeepMind, as well as 100 civil society groups, nonprofits from Oxfam to the UN and everyone. And it's still going now. And the goal was to basically have an incident reporting database. If you work at an AI company or elsewhere in big
Starting point is 01:21:20 tech, and you see something going aloof, whatever, then you could report it confidentially, kind of blow the whistle, not cause a big PR stink and go and like be a martyr on the news or whatever. Trust in each other. Yeah. Yeah. And I think that's a kind of really important part of the process. This is where maybe we could use the racism thing as an example. So some of the early large language models turned out to be racist quite quickly. Could you explain how that happened? And was that one of the self-reporting things? And did they say, this is how we fix it. So when it arises in your large language model, this is how you'll fix it?
Starting point is 01:21:54 Yeah. Two or three years ago, if you asked one of these models, if you said, complete the sentence, the pimp is of this origin. And it would say, well, a black man. Or if you said the president, always a white man. It had these biases. It had a tendency to take the data that it had been trained on
Starting point is 01:22:12 and basically rigged by humans and regurgitate it. Now, the interesting thing about the last year or so is that we've made a huge amount of progress on these kinds of biases. So much so that I think they are going to be largely eliminated. So if you play with Pi now, Pi is extremely respectful and balanced. Can you tell me what mechanisms get put in to curb that? It's a process of alignment.
Starting point is 01:22:39 So the alignment process is where we show two examples of outputs by the model to real humans. We call them AI teachers, and they come from all different backgrounds. They're old and young. They're educated and not educated. Some of them are professionals in certain areas. Some of them are just generalists. And we have thousands of these AI teachers that work basically around the clock 24-7. these AI teachers that work basically around the clock 24-7 and we're constantly showing thousands and thousands of examples of two or three or four different answers that the AI could
Starting point is 01:23:13 produce given some question or some prompt. And the human then selects between these different options with regard to a criteria. Is this one more funny? Or is this one more respectful? Is this one clearly racist? Does it look more like financial advice? Is this about carpentry? And so that feedback is called the process of alignment. You align the model with a set of values or ideas, which is pretty incredible. And that has produced extremely nuanced and very precise behaviors, which you now see in Pi. And then I imagine that group that the AI is aligning itself to, I guess it would just naturally evolve as culture evolves and societal norms evolve. Because it wouldn't be like you'd set it in stone, like, okay, we got it. We've
Starting point is 01:24:03 aligned it with what humanity thinks is right. Right. And then see in a thousand years, no, that's going to evolve as we see at an incredibly rapid rate as well. Well, this comes back to what we were saying earlier about whether the AI should be able to evolve its own set of values or set its own goals
Starting point is 01:24:19 or update its own code. So I think these are the sorts of capabilities that in a containment strategy should be off the table. For a start, just to state the obvious, we are shaping the values of this AI that goes and participates in the world. By that, I mean me and my team. We try to be as transparent as we can be about the process
Starting point is 01:24:37 and we publish the value set on our website just to show what it is being held accountable to, what it's aligning to. But that's where competition is really healthy because we want to have a variety of different models produced by lots of different actors, not just me. I think in time, you'll be able to create your own sort of AIs that align to your values subject to some baseline constraints. There'll be lots of different AIs that end up being out there in the world with different kinds of positions on these sorts of things for both good and bad. That's the whole point about it, amplifying who we are as a
Starting point is 01:25:09 species today, because everyone's going to want to have influence basically over their own AI, right? You're going to want it to be more in your vibe. Absolutely. Irreverent, a little dangerous sometimes. And then I guess my last question, you kind of already answered. We hear these examples that, you know, the AI will create works cited. So I heard that a lawyer asked the AI to write its brief. It referenced some statutes that didn't exist.
Starting point is 01:25:40 It referenced some cases that it had made up. Because its ability is so large and fast, it makes me think that there has to be an AI over top of everything that just minimally tells you what's fake and not. How do you, is it police? I mean, no one can do that. Yeah, but I guess if you saw something cited, you could go search and see if that's a real article. That's what happened in the 60 Minutes piece. It had cited some references in a works cited page or bibliography about this argument it laid out. And then they found out that the AI had created like four of the books. Yes. And that's funny and silly, but there's a real article on absolutely everything,
Starting point is 01:26:18 every different opinion on it. It all comes from some quote, reputable something. Even now, I mean, we fact check on this show. It's stupid. We can find a defense for absolutely anything. Yeah, but I think we need to know what was an actual scientific study that produced these results and not what the AI said. We have to know the difference.
Starting point is 01:26:38 And it's going to be such a volume. A human can't police that. So I think you need an AI to police the AI. That's my last concern. I'm just curious, like, how will we fucking know what information is real? Just as three or four years ago, these models were prone to bias and racism, and they have now got much, much better, not a solved problem, but hugely better. These are the kinds of problems that I expect to be completely eliminated in the next few years citations and sources is one that everybody
Starting point is 01:27:10 is actively working on at the moment i mean if you go to pie today and ask for fresh and factual information it will know the sports results from yesterday it'll know the news it'll have an opinion on barbie and oppenheimer It's pretty fresh and factual, and that's because it is going off and checking stuff on the web. It's looking up in real time. And so I expect in sort of three or four years' time, it is going to go and do the same thing for all of the academic journals and all the news reporting
Starting point is 01:27:38 and all the real sources and citations. So I think the thing for us to really focus on is not, are we going to have these problems in perpetuity? It's what do we do when they're solved? And what does it mean to have AIs that can do perfect recall from any knowledge base that aren't biased actually, that are actually really fair,
Starting point is 01:27:59 like more fair than humans, that are actually more creative than most humans? How do we handle the arrival of that thing? Is there any part of your book that I left out that you'd like to talk about before we wrap up? This has been exhilarating. Yeah, so fascinating. How many days have we been talking? We're right on time. You covered all the good bits, I think. No, that's great. I mean, definitely makes me want to read the book. I'll say that because this is so deep and will continue to be so relevant for all of us. Okay, we're going to go out on this.
Starting point is 01:28:32 Is it not suspicious to you that you're here on planet Earth at this moment to witness this? I have this deep prevailing suspicion like, wait a minute. When I was born, we didn't have computers. Now we're at a point where the computers are going to solve every fucking human problem. And then probably some kind of bizarre longevity is going to emerge out of that. Is it possible I was born at the time I was to witness all this? It feels suspicious. Do you have that ache of suspicion at all? 100%. If you just think about this trajectory on cosmological time, how much our planet and our species and this moment is just a complete freak accident. How on earth can this
Starting point is 01:29:17 be the case that we're now alive sitting here on these chairs in this moment doing this thing? It just feels so arbitrary and so fragile and in a way like such a fleeting moment of evolutionary time. And we're so obsessed with time on a month and week and year basis, you know. And it's like actually the world doesn't care about that schedule. The world operates on this geological or cosmological time. Yes. If you do our geological calendar, which is so fun, humans arrive at 11.59 p.m. on December 31st.
Starting point is 01:29:52 Yeah. The idea that in that last minute of this geological year calendar, in the last one second, we went from no telephones to this is hard for me to actually buy into. And that we happen to be born in that one second. No, that's what I call bullshit. from no telephones to this is hard for me to actually buy into. And that we happen to be born in that one second makes no sense. No, that's what I call bullshit. So then we get into...
Starting point is 01:30:11 Sim. Are we in a simulation? The classic. Well, you know what the funny thing is? We're going to find out. Well, right. I know. That's what's wild is you're going like, you're going to find out.
Starting point is 01:30:22 And it's true. We're going to find out if there are movie stars pretty soon. We're going to find out if there's writers pretty soon. I can almost not comprehend that that's true. Yet, I bet it will. Yeah, it does feel a little bit like. Even for a progressive, that's a lot. I've read a bunch of great books on dopamine lately.
Starting point is 01:30:40 I'm kind of obsessed with dopamine. I'm pretty panicked about this level of change. And I know I have apex dopamine and progressiveness in me. I can't imagine how fucking terrifying this is for half the country. I completely agree with that. And think about how ridiculous the world looked 40 or 50 years ago, how it was dominated by the patriarchy, how you had a career for 40 years, how people of color were like irrelevant and basically just coming out of slavery all over the world, how empire and colonialism was the default way of operating. We have changed unbelievably culturally and politically.
Starting point is 01:31:16 And so we're a product of that generation. So we're still living in this mindset that things should be stable. We have this default expectation that actually this order should be here forever. Actually, there's just no reason to believe that. In fact, the default is flux. Yeah, it's very unsettling for a lot of people and in a varying degree. That's why I'm acknowledging, like, I think I trend quite high on that embracing of change and it's
Starting point is 01:31:45 very scary to me so I just am very sympathetic I imagine for some people it's just completely overwhelming it is intense and I think that we have to figure out ways to be respectful of everybody's rate of change I think it's super important to be empathetic to that and not just charge ahead as though it is obviously right just as we shouldn't be righteous in caution and non-change, we absolutely should not be righteous that change is inevitably going to be good or is out of our control or is being done to people. Yeah, we need massive humility as we race into the unknown. And we have to figure out each challenge
Starting point is 01:32:21 because otherwise they will pull the plug on the sim. Like we're just here for us to figure out all challenge because otherwise they will pull the plug on the sim like we're just here for us to figure out all the problems so we have to keep figuring them out or they're gonna be like oh start over well we're an ai model it's gonna result either we fix global warming or we don't so we have to if we want to keep living because i was interviewing you today i was thinking about all this and then i thought okay so let's see so all this will happen in front of me there'll be a headline new york times if that's even a thing still people are gonna live forever okay well about all this and then I thought, okay, so let's see. So all this will happen in front of me. There'll be a headline in the New York Times
Starting point is 01:32:45 if that's even a thing still. People are going to live forever. And I go, okay, well, that's bullshit. Like now I know I'm innocent. And then someone's going to pull a cable out and I'm going to be in a room
Starting point is 01:32:53 and they're going to go, pretty fun ride, huh? But it won't be you. I'll be this beautiful like avatar guy. Seven feet tall with a tail. No, I bet it's... It'll be like a blob of...
Starting point is 01:33:03 Blob. Ectoplasm. Yeah. You tall, the tail. No, I bet it's. It would be like a blob of. Blob. Ectoplasm. Yeah. You should make that film. Tell your AI to produce it for you quick. Oh my God. I think the strike would prevent that. But I was thinking like, okay, so I'm going to come out and there you go.
Starting point is 01:33:17 That was a wild ride, huh? Like we took you from nothing to all of that in 70 years. Do you want to do it again? Right. I don't know what they let us do. I don't either. I wonder if it's actually 10 minutes. That would make sense because the way time moves.
Starting point is 01:33:32 Yeah. We were plugged into a machine for 10 minutes of this entire experience. You're not saying that. We're not saying you're saying that. You're co-signing on this. No, no. Mustafa says this is exactly how it's going to happen. Well, I find you to be an incredibly thoughtful and this may sound derogatory to other tech geniuses I've interviewed, but you're also very EQ.
Starting point is 01:33:54 I'm grateful you're in the mix. That is the one thing that I think scares a lot of us is there's a type that's in Silicon Valley and it's not the type that was in my rural American town. And so that's a little scary. The power has been consolidated in very few hands and they're very similar to each other. And that's a bit unnerving. I'm delighted you're in the mix. It's been a pleasure meeting you. Yeah, this is great. Thank you. Everybody read The Coming Wave. Obviously, I feel like this is an act of altruism from you. I can't imagine you need money. Are you trying to get rich off a book? I don't think that could possibly be the case. You're doing fine, right?
Starting point is 01:34:30 True. Unfortunately, books don't make very much these days either, but I kind of felt I had to write it. Yeah, that's wonderful. I'm really supportive of it. So everyone check out The Coming Wave and familiarize yourself with the sim we're all in that we'll all find out shortly.
Starting point is 01:34:45 And play with Pi. Yes, play with Pi. You can find Pi at www.pi.ai. Thank you. I need to play with it. You know what's funny is I talk about it and think about it and I've yet to do any of it. Same. I'm scared. Try it. I know. I am going to try it. You'll never go back. You'll realize we really are in a
Starting point is 01:35:04 simulation. Exactly. It's Pi's world. Once I can teach Pi to edit You'll realize we really are in a simulator Exactly It's Pi's world Once I can teach Pi to edit the show Oh, they can already do it, I'm sure Because it has 600 times 2 hours It has 3,000 hours of how the show is edited I bet you it could do it If I put all the unedited episodes in
Starting point is 01:35:21 And then all my edits I do think it could do it It'd be pretty close don't you think i think it would probably be all right yeah you should give it a go i mean it could probably generate an entirely new show in your style that would be quite that would be we have thought about that like basically like those actors got to promote something so they license their ai to me i'm no longer doing this my ai conducts the interview in our style it could all be done in a way that probably at some point
Starting point is 01:35:47 you won't be able to tell the difference the voices are getting incredible have you heard these voices? yes we that we've been working with yeah five minutes of audio you don't even need high quality just like speaking into your phone if you have a data set like we have
Starting point is 01:36:01 which is thousands of hours of us talking different emotions relapses house purchases so much a whole gallon a set like we have, which is thousands of hours of us talking. Different emotions. Relapses, house purchases. So much. Fucking the whole gamut. All right, Mustafa, thank you so much for coming by. This has been a delight, and I hope everyone reads The Coming Wave. Thank you so much for having me.
Starting point is 01:36:17 It's been fun being in the simulation. Next up is the fact check. I don't even care about facts. I just want to get in your pants. Hi. Yesterday we hung out. Yes. And that's a wrapping it up because on the last fact check we said we were going to hang out this weekend.
Starting point is 01:36:42 That's right. And we successfully did. And we didn't just hang out. We. That's right. And we successfully did. And we didn't just hang out. We also went in the sauna together. We did. I really. Don't like the sauna? No, I like it.
Starting point is 01:36:52 Oh, God. A lot. Because I love it. I know. You don't want to like it? I think for some reason I don't want to like it. I think because you guys are so addicted to it. I'm like, ugh.
Starting point is 01:37:04 You feel left out of it. Yeah, I feel left out so addicted to it. I'm like, ugh. You feel left out of it. Yeah, I feel left out. There we go. That's the honest opinion. Okay. And you guys all have one. Well, I insisted. You were trying not to go in it.
Starting point is 01:37:16 Despite what you're saying, which is you, I agree that you feel left out. But I was actively inviting you and you're like, no, no, no, no. And then finally I was like, Monica, go get a bathing suit. I didn't have a bathing suit. So I didn't want, I didn't want to like have to borrow one of Kristen's because what if she didn't want me to borrow it? But then now she's in this weird position
Starting point is 01:37:33 where she has to say she- She loves letting people borrow her bathing suits. I know she does. But, and she's very generous. But you know, I don't know. You guys had a plan and then I popped in and I didn't want to ruin it. Come on now. That's your shadow talking.
Starting point is 01:37:51 Yeah, but anyhow. I immediately was like, you got to go in the sauna with us. We're about to go in the sauna. Yeah. It was really fun and I do feel so good after. Although then later I did get a bad headache and now I feel a little sick. Oh, okay. Well, you probably didn't drink enough water while you were in the sauna, did you?
Starting point is 01:38:08 I drank one of those whole Coke things. Oh, you had a blender bottle? Well, the Coke ones. Oh, the big Coke pizza hut cups. Yeah, and then I drank a lot of water when I got home. Maybe I'm just a little under the weather. That could be it as well. Who knows?
Starting point is 01:38:22 But anyway, I did have so much energy after, and I was like, I like this a lot. And I'm going to probably try to get one at my house. Okay, but also why don't you just enjoy it with us more often? I could do that. I will. But also I'm going to try to get one at my house. But we've talked about this. I have fears around me going in alone.
Starting point is 01:38:40 Right. So don't do that. Also, nothing's going to happen to you if you go alone. I go alone every night. I get so scared I'll get stuck in there. Right. So, don't do that. Also, nothing's gonna happen to you if you go alone. I go alone every night. I get so scared I'll get stuck in there. No. You're not gonna get stuck in there. What if? I mean, you never know. No, you know. No way. In mine, I'm gonna have an
Starting point is 01:38:56 axe in there just in case. Sure. If that makes you feel safe, then do it. Yeah, I'm gonna. You should also have a pair of shoes in there because if you use the axe you're gonna have to walk on glass. Okay, good idea. Cooler for the axe so it doesn't get too hot. No, I wondered about that. Will it melt or something?
Starting point is 01:39:12 No, it won't melt but it'll be too hot to touch. Oh, you're right. So you need a cooler and fresh ice every time you go in and a pair of shoes. Okay. And safety goggles. For when you smash the glass. And I want a panic button that calls this house so that if I'm dying or panicked or stuck. You hit the button on the wall.
Starting point is 01:39:32 Yeah, and then it calls you guys. Yes, and we run the 12 feet over to your house. Yeah. And I kick, how will I do this? How will you get in? I'll want to kick the door in, but I won't because then the glass will be everywhere. I'll just open it normally. Okay. Despite it's my opportunity
Starting point is 01:39:47 to be a hero. And in this scenario, you're acting like you can just open the door. I can, yeah. Remember the, on the plane I had to really, really resist my urge to kick the door down because we were trapped inside the plane. Yes. But I knew the owner of the plane should kick
Starting point is 01:40:04 the door down. Yes, but he wanted to so bad. But I was chomping at the bit like,. But I knew the owner of the plane should kick the door down. Yes, but he wanted to so bad. But I was chomping at the bit like, oh, I've been dreaming about this my whole life. We're trapped. I'm allowed to kick a wooden door down? I know. Fuck. I'm proud of you for not forcing it
Starting point is 01:40:16 and just like jumping in his way and doing it like a little five-year-old. But what I was going to say is after we did our sauna, you know, I walked out the gate and was walking to my car and i thought wow it'll be so nice when i live there and i can just walk into my house walk 40 feet it'll be so nice get out of our sauna and then walk and get in your sauna round two and then i'll definitely die definitely die then yeah um so we're looking at an April 2025.
Starting point is 01:40:46 For a move-in date. Yeah, that's the current status. Is that less or more than you had originally been told? Well, I was told 18 months. It's in the ballpark of that. It's a little more than 18 months, but that's to be expected. Oh, yeah. Easter party, is that what will be?
Starting point is 01:41:05 Oh, we'll have missed Easter. When is Easter? April? Easter is April. March 31st. Oh, okay. So we'll have just missed it. What if you'll get in early?
Starting point is 01:41:15 No, because I don't. Easter's not my thing. Why? You're very religious. I know. And Christian. That's why. Because people make a mockery of it.
Starting point is 01:41:26 That makes sense. And there's like a bunny. Christmas and Easter. Easter 2024 is March 31st, but 2025 is April 20th. A full month later. Just because of my house? I think so. Oh, my God.
Starting point is 01:41:39 420. In honor of your house. Ooh, 420 is a pot smoking day on Easter. Yeah, wow. Okay. Congratulations, stoners. Well, Easter belongs to Molly and Eric. We already know20 is a pot smoking day on Easter. Yeah, wow. Okay. Congratulations, stoners. Well, Easter belongs to Molly and Eric.
Starting point is 01:41:48 We already know this. Hold on. What a pairing. A pot holiday when you have a basket of chocolate. Oh, that's nice. And eggs. Do stoners love eggs? They love eggs. Oh, I didn't know that.
Starting point is 01:42:00 Everyone knows that. Okay, so what a perfect pairing. Only thing that could be better would be Halloween and 420. Yeah, candies. Candy galore. We all, in this beautiful group of ours, we have different holidays. I think we've talked about this a little bit because I said I probably have to take Valentine's Day and then I'm going to have that Cupid party. Right.
Starting point is 01:42:22 Because all the other holidays are taken. Who has Easter? Molly and Eric. Oh, duh. They do the big— But they're not doing that anymore, right? Well, they say that, but I think they will. He will, yeah.
Starting point is 01:42:32 He can't resist. He can't resist it. He can't help himself. But—and that's fine, again, because I wouldn't pick Easter. Right. It would make no sense. I didn't grow up with Easter, so it means nothing to me. Yeah.
Starting point is 01:42:42 You know, my parents did Christmas because, like— It's fun. It's fun. And there's presents. Exactly. And I think they knew, we have to celebrate this. Yeah, we brought her here. Yeah.
Starting point is 01:42:54 Yeah. She's really gonna—and my grandparents, my mom's parents did Christmas when they moved here, too. In Savannah. Yeah. Yeah. So that I'm grateful for. Yeah. But they were not going to get on board with Easter.
Starting point is 01:43:06 No, nor should they have. No. And so we didn't have baskets or anything. Although they're pretty fun baskets. I do remember my mom got me one every year. I know. And I am jealous of that. Yeah.
Starting point is 01:43:17 And you got kind of a present. I think we always got a pair of shoes on Easter as well. Oh. Yeah. Or Nikes. But sometimes my mom would get me a Valentine's Day chocolate. Oh, yeah, yeah. My mom would always get me a Valentine's Day card.
Starting point is 01:43:30 Will you be my Valentine? Oh, my God. Brag. No, Matt. I've seen you. I know, but I think my mom, there was like every now and then when they remembered. Oh, okay. And she never asked me to be her Valentine.
Starting point is 01:43:43 Okay. She should have. No, she didn't. She didn't have time. Yeah. But computers need to be programmed. Exactly.
Starting point is 01:43:50 They're not going to program themselves. I mean, now they will, but back then they weren't. Back then, no. She got out of the game just in time. She did. She's like the last biological programmer. Obviously, my dad planned that in the sim. Yeah.
Starting point is 01:44:03 Pretty convenient. She got to retire. Right at the exact right time. Yeah, before she was fired by a robot. Do you think they'll have AI do the firing? I mean, why not? If they're going to have them do everything else. Yeah, because that's the hard part. Your phone will ring. Hello, how'd you get this number? I have
Starting point is 01:44:20 all of the numbers. Your services are no longer required. What? Sorry, I'm programmed not to interrupt, but I did. Sometimes these things happen. Ha, ha, ha, ha, ha. Oops. Oh, that robot.
Starting point is 01:44:36 Is that robot related to our robot? I'm not that robot. I'm this robot. Why are you referring to me in whatever tense that is? Okay, see, this robot. Ding, ding, ding, robot. Ding, ding, ding. We didn't even mean it. This is Mustafa's episode. Oh, wow. That's lazy Sim mixed with AI. Maybe all the Sims talk was really just AI talk, like we didn't have it. Well, AI is probably involved in the original place that all this is happening. AI definitely creates the Sim.
Starting point is 01:45:09 Right, but they use human bodies and human wants. Plugged in back in Zanzibar, the planet we're all on. Anyway, Valentine's Day, I'm going to have that party, but I'm really excited to walk over from here to there. As I will be excited to walk from here to there on Valentine's Day to enjoy the party with the cupids. What was the, we were debating whether or not you could hire little people to play cupids. No, I'm not doing that.
Starting point is 01:45:33 I'm hiring male models. But fucking cupids are not, they're supposed to be smaller. Well, they're not. These aren't. Then this is all I ask. Okay. If you're gonna have tall cupids,
Starting point is 01:45:44 you better hire giants to come to the going to have tall cupids, you better hire giants to come to the party to make the cupids look smaller relative to the giants. Oh. So you're going to have to go through all the retired centers. You're going to have to find Manute Bull. You're going to have to find Vladi Divat. You're going to have to find... These are the players you've got to assemble.
Starting point is 01:46:00 Shaquille O'Neal's got to be there. Oh, that would be fun. I mean, they're stationed by the cupids. They can never be more than three feet away from the cupids. Okay, I'll consider this. Okay. It's an expensive party. It's going to be pricey.
Starting point is 01:46:14 I think to get Shaq to attend your Valentine's Day party has to be over a million for sure. Man, okay, I got to save up. How much would you, oh, here's a great game. Mm-hmm. How much to attend a stranger's Valentine's Day party? This runs the risk of being alienating. It does. It does.
Starting point is 01:46:32 Do I travel or is it in LA? You travel. You got to fly. They pick you up at the airport, not a car service. And are they, is it a charity? No. It's a birthday, regular birthday. Well, it's a regular Valentine's Day.
Starting point is 01:46:49 Oh, I'm sorry. It's a regular Valentine's? No, someone's birthday might be on February 14th. Right. But that'll be a coincidence. Are they creepy? Is it a man or a woman or a couple? This makes a difference for me.
Starting point is 01:46:59 Let's keep it neutral. It's a family. Okay. Okay. The whole family loves you. They'd really love for you to be at their Valentine's Day party. I'm going to call. Hi, Monica.
Starting point is 01:47:09 Sorry, I got your number from one of your friends in Georgia. They don't keep it as tight as your LA friends, so we had to go through them. But the point is, we would love for you to attend our Valentine's Day party this year in St. Louis. Is there some kind of fee we could pay? We've saved up all of our money to pay this fee. Well, no, you can't say that wrong. Then I'd go for free. No, but we're rich.
Starting point is 01:47:32 Oh, okay. We're very wealthy. Okay. We don't want to brag, but we're very wealthy. Wealthier than me. We're the most wealthy people in St. Louis, but don't tell anyone. Okay, I won't. I'd like... I'm going to go... Now you gotta be careful, Monica, because you're gonna say a price right now
Starting point is 01:47:49 and there may be someone in the listening audience that takes you up on this. So it has to be real. Oh my god. This sucks, right? Because then you're gonna... You know, people will be like, fuck, I'd go to a Valentine's party for 10 bucks.
Starting point is 01:48:01 Well, I know. You've put me in a horrible position to sound like a huge brat. Until you reverse it. But until you reverse it. We do know the richest people in St. Louis listen to the show, too. Okay. They've written in several times. That's right.
Starting point is 01:48:13 Okay. Just reminding people that this is a scale. The number changes depending on how much money these people have. Right. The hosts. Yeah, the rich. This host is the richest family in St. Louis. In St. Louis.
Starting point is 01:48:27 Which is, there's money there. Oh, old money. Yeah. Yep. So I'm going to say $350,000. $350,000. Yeah. That was a little higher than I was expecting.
Starting point is 01:48:38 Oh, really? Oh, Dax. Wait, you wouldn't go there for a quarter million dollars? I mean, I would. Right. But I have to aim high. Oh, you're trying. I see what you're trying to do.
Starting point is 01:48:48 Because there'll be some negotiation. Okay. So you're going to start at $350, but you'll close at $250? Yeah. Would you go for $200? I don't want to say. You got to say that's the point of this exercise. Fine, yes.
Starting point is 01:49:04 Would you go for $150? No. You want it. $200 is the floor. Am I going to have any duties at this thing? It depends how much you eat on the plane. God. That's extra.
Starting point is 01:49:20 Wait. They're like, you have to doody at our party. If I have to doody, that's $750. Million. I'd actually take less if I can doody at their party. No. If they're going to make me doody.
Starting point is 01:49:34 They're not going to. Actually, if they're going to make me doody. There's two different prices probably. It's 1.5. If they watch you doody? That's a lot more. How much is that? Be real. Do they see my vagina?
Starting point is 01:49:46 Or can I put a blanket? Yeah. Well, no. They're behind you. You're squatted. Oh, they're behind me. Yeah, they have a grassy lawn. And they've made a little ceremonial area.
Starting point is 01:49:58 And they'll be behind you. So I don't think they'll see your vagina. Can I cover it with a towel? Your vagina? Yeah. Yeah, you can put one of those sticky things over it that they wear in the movies. Well, and they've curtained off the area, so it's just them, not the whole party. That's what I was going to ask.
Starting point is 01:50:13 How many people are watching? The whole party. Oh, the whole party now. Well, they don't pay you millions to come in and do this and doody in the yard if they don't want people to see it. This is their thing. Oh, it's their thing. And everyone there loves this thing. They love it. And they go, yes, well done.
Starting point is 01:50:28 What a beautiful duty. Okay. My Valentine's Day is complete. Thank you, Herb and Terrence Walker. 10 million. 10 million. Yeah. You think you could do it?
Starting point is 01:50:39 Yeah. When it came time to it? For 10 million, I could do it. Wow. Yeah. Wow, wow, wow. You know what? You may have just gotten yourself into trouble.
Starting point is 01:50:48 Wait, $10 million? I'm hosting a Valentine's Day party. $10 million. And I'm going to talk to Kristen and see if we can, as a family, pay your fee. $10 million after taxes. So $20 million. After taxes. I need to-
Starting point is 01:51:00 I need it after taxes. I'm not going to do it. I am going to do it. Think about what you're asking of me. I know. It's a very tall order. This is insane. So it's $20 million.
Starting point is 01:51:14 Don't worry about that. Just stick with the $10 million. No, because that's not enough. Five is not enough. No, I know. After taxes. After taxes, $10 million. Well, okay.
Starting point is 01:51:24 They'll have their accountant handle that. Yeah, you will have. Net $10 million. Okay, $10 million in the bank after taxes. They've already contacted the IRS. Yeah, everything's been worked out. It's a charitable donation. How much for you, Rob, to take a dump in front of a whole party at a Valentine's Day gathering in St. Louis?
Starting point is 01:51:47 I'd do it for a quarter million. 250? Yeah. That's nice. But also men, it's different. Although harder to like if I bend over in front of the audience they're going to see my testicles and my penis hanging down. Yeah, but you like that. That's true. Maybe I'll knock some off the fee if they
Starting point is 01:52:04 all look. I know. And it's people that never see again. Right. Yeah. No photographs. That's- Obviously. Yeah, yeah, yeah.
Starting point is 01:52:11 Or video. What's the price for video? Because- No, no, but let me be- Let's just work this out. You go, oh, fuck. Then the whole world could see me take a dump. But maybe $5 billion is worth that?
Starting point is 01:52:24 You could like cure cancer. I know. That's why, yeah, there is obviously a number because then I could give a lot to charity. Elon Musk is getting very excited right now about this opportunity. Who do you think would be more likely to pay for this,
Starting point is 01:52:38 Elon Musk or Jeff Bezos? Elon Musk. That's the obvious answer, but I almost think it'd be Bezos. No, he is, he cares. But he'd pay in pesos. That's how he would trick you. He doesn't want to be a crazy person, Jeff Bezos.
Starting point is 01:52:56 Even if he is, let's say he is. But you might find it titillating. He might, but he wouldn't want it. Well, he would, actually. Okay, we don't know. We don't know what his kinks are. Okay. But he wouldn't want to put a video out like that.
Starting point is 01:53:09 Elon Musk would. He's bonkers. He's on another mission. Yeah, exactly. So- To humiliate all cheerleaders. Elon, I would do- Like a pay-per-view event.
Starting point is 01:53:24 Oh, wow. That's interesting. That's really interesting, Rob. I like this wrinkle. Yeah, how much to do a pay-per-view event? Oh, my God. A billion? A billion, yeah.
Starting point is 01:53:38 A billion, yeah. I would do it for, I wanted to say five billion, but that's a lie. I would do it for a billion dollars. Yeah, and probably 800 million. No. No,1 billion. Yeah, and probably $800 million. No. No, $1 billion. That's the ceiling, the floor. Because I remember taxes.
Starting point is 01:53:50 Okay. Well, you want to make $1 billion after taxes, right? Yes, yes, yes. Right. How much, Rob, for you if it's pay-per-view? I mean, $250? No, don't lie. Don't be –
Starting point is 01:54:03 Yeah. Be honest. For real. Because people might put together the money. This could be a GoFundMe. No, I... Like $2 million. $2 million. Okay, great. And what would...
Starting point is 01:54:15 You would never, right? Because you're too public of a person. Well, let's start with going to the party. I would go to someone's Valentine's Day party in another state for five hundred thousand dollars. No, you wouldn't. I think I would. After taxes. Is it an armchair? Oh, yeah. But remember, this is a really rich person. Remember, it's a scale. We might go to someone's Valentine's Day party for free. No, we would. But this is a rich family that we don't know, and you're gonna be awkward. Family in St. Louis. Yes. Be awkward. The richest family in St. Louis. Yes. St. Louis's richest family.
Starting point is 01:54:45 It's how they prefer to be called. And they actually want to pay. Let's make that part of it, right? Because— They're looking to get rid of—it's too much money. They want to pay because they want it to be a servant. Like, they don't want you to be doing them a favor. They want it—
Starting point is 01:54:59 They send a jet and 500 grand, I would go. What did I say? $350,000. Right. Okay. Now, pooping at their house— But they do have to pay for first class. Pooping at their house, no one watches.
Starting point is 01:55:11 $500,000. But no one watches at all. Correct. Just using the toilet at the party. That's right. They want to know that I made doody while I was there. Yeah, for me that was 750. Right.
Starting point is 01:55:22 And now in their yard with the immediate family, the richest family in St. Louis, watching me take a poop. $300,000? That's probably $10 million. Okay. Yeah.
Starting point is 01:55:38 And then for the world to see, I won't do it. You're not doing it, yeah. No, except my one. Even a billion. Mine was a billion. Oh, world. The thing is, I'm good.
Starting point is 01:55:47 Like, how much, what do I need? Okay, I have to cut that. What? Because everyone. No, I think this is the opposite of that. This doesn't sound entitled. To me, it's like, how much money do you need? I know.
Starting point is 01:56:00 But maybe you put it, when you sold it to me, it was like charity and curing cancer and stuff. Well, then that was Rob. Let's be clear. I don't want you to cure any disease. I want you to take the money and blow it on a yacht and a plane. No, I'm not. No, no.
Starting point is 01:56:13 Buy the robe. Just buy it. No. Yeah, they work for you now. You tell them, I need a bathing suit. Well, I knew what you would spend it on if you had a billion. SPCC? The animal shelter?
Starting point is 01:56:25 ASPCA. It's not sentientbeans.org. Yeah, I'd revamp sentientanimals.org. No, I would only do it for that amount of money, so I would be able to help people. Okay. Let's take charity out. I wouldn't do it. There's not a price.
Starting point is 01:56:43 Okay. All right. For everyone to see. Yeah. For pay-per-view. I missed one of the rungs, I think, which is the whole party watches. Because I said $10 million for the immediate family. And I'm going to say for the whole party to watch, $15 million.
Starting point is 01:56:56 I'm going to kind of throw that in. That's going to be a little bit cheaper. Really? For me, there's a big leap between just the family and the whole party. To me, it's like you're already doing it in front of five strangers. Why not do it in front of 50 strangers? Now, the whole world and filming, now that's, you know. Yeah.
Starting point is 01:57:13 Also, you know how I feel about my butthole. Like I don't want anyone to see my butthole. Final cut would be just don't use this. No, but I mean, no, because if they're like zoomed in and it's like, no, I want you to use a different edit. You got to be wide. We would agree on a lens size. Okay. You got to be in a 35 and be at least 12 feet away.
Starting point is 01:57:30 But 8K. I mean, oh, God. Oh, boy. Okay, well, we've done that long enough. I'm sure people are throwing up in there. I hope someone in St. Louis is drafting their letter to us, to be honest. I'd love to get this money. I would actually, too, because that house is burning a hole.
Starting point is 01:57:47 Yeah, I know. That's why I thought you were uniquely vulnerable right now to this conversation. I am. I'm going to raise my Valentine for the richest family in St. Louis to $400,000. Okay. Party. Okay. You'll go for $350,000.
Starting point is 01:58:04 I mean, just know. I know you will. But they do have to pay for my first class flight. And yeah, pick me up from the airport and drop me off. Probably a hotel. You don't want to stay with them. Oh, good call. Or do I leave same day?
Starting point is 01:58:16 Same day. Yeah. Well, unless you want to see St. Louis. I don't want to stick around St. Louis after I've done that. Oh, no. If it's a regular Valentine's Day party, yes, I do. Yeah, you want to go see the Gateway to the West, the Arches. Okay, west. Okay, three nights at the Four Seasons St. Louis. I don't know if that's real, if that's a real place. But that's a good question. They got one. They got one. Yeah. Okay, three nights.
Starting point is 01:58:36 So you'll do it for 300? No, I'm just trying to get realistic. No, I want as much as I can get. I know, but three nights at the Four Seasons and first class airfare, stop by a party. You're doing that for $300,000? You just don't think I'm worth $400,000. No, I definitely think you are. I'm going up to $750,000 again. You're doing market value. He's trying to get to the philosophical answer.
Starting point is 01:59:03 I'm trying to get to the real number. Someone offers you that and you say no. So someone calls and they're like, will you come to our party airfare? Let's be honest. I'd go for $100,000. Well, wow. Okay, now we're here. Can we be extra honest?
Starting point is 01:59:18 I'd probably go for $50,000. Okay, with the airfare and the hotels? If the hotel and then the—actually, no. $50,000 is too low for that. $100,000. Okay. Okay, with the airfare and the hotels? Yeah, if the hotel and then the—actually, no, 50 is too low for that. 100. Okay. Okay, great. You mean—now I feel bad about myself.
Starting point is 01:59:34 The stock just went way down. It did. It did. You ruined my whole stock. And also, you offended everyone who would probably cut off a finger for 50 grand. The whole purpose of you asking this is that this richest family in St. Louis. St. Louis's richest family.
Starting point is 01:59:54 That's what they prefer. Is a huge fan of me. Enormous. So it can't, it's not just like. Price is no object. And price is no object. And price is no obstacle. And price is no option. But price is no object and price is no object and price is no obstacle and price is no option. But price is an option.
Starting point is 02:00:13 Well, if it's an individual versus a brand, does that change stuff? Like, Gap Kids wants to? A hundred percent. Well, I imagine that this, this richest family in St. Louis. St. Louis's richest family. Is akin to a Gap. Like it's, that's the amount of wealth we're working with. So I'm back up to three and I am not budging. Okay, great. Okay.
Starting point is 02:00:31 I think you're going to get it. I think you're going to get it. I don't think I'm going to get 500 grand. You're going to get 300. Not to go to a party. Well, that's why I asked if there was speaking and stuff.
Starting point is 02:00:45 Because if there was speaking and stuff, it would be more. Just chit-chat, small talk, have snacks. That sounds great, actually. I wish a ton of people would do this. And I could just like rack up. Yeah, once a week. Once a month. If you guys want me to come to your Valentine's Day party, I might come.
Starting point is 02:01:02 Let's just put it this way. We fly around the country to do shows where we make much less. So we do that. So I guess in reality we probably... but not for the richest
Starting point is 02:01:10 No, that's because armchairs, we want to do that. That was a bad analogy, because we want to go do live. We want to. And people are paying. Like, people,
Starting point is 02:01:19 real people. Yeah. And this is Gap and also St. Louis's richest family. So,
Starting point is 02:01:26 that's different. Anywho, okay. Well, wow. Covered a lot. Look at the time fly. Um,
Starting point is 02:01:36 ooh, do you think they'll pay for my glam and my, um, wardrobe and I get to keep the wardrobe?
Starting point is 02:01:42 Oh, wow. This turned into a commercial. It's getting expensive. Wow. Um, wow. This turned into a commercial. It's getting expensive. Okay. I don't want to be picked up. I'm glad you brought that up. I don't want to be picked up. I want you to rent me an Escalade. Okay, let's continue.
Starting point is 02:01:55 I want to be picked up. Right, you do, but I don't trust anyone. And I don't want the person to talk. Well, I don't think you want to say that out loud. Why? Wait, actually, this is an interesting conversation. I'm going to be honest. Okay. In an Uber or a pickup situation, well, I have a few exceptions.
Starting point is 02:02:13 We know some drivers who drive us every now and then. Yeah. But we know them. Yeah. And so I like talking to them. Yeah. But the stranger, I don't love, well, I hate small talk. Yeah, I'm not great at it.
Starting point is 02:02:27 Which is why you have to pay me $300,000 to do it at this party because I don't like it. Well, that's what's funny is I'd almost rather have to go give a speech in the backyard at the party than mill about and do small talk. Oh, but it's the pre-planning, though, of the speech. I'm just saying, as uncomfortable as it is to have to make a speech, random small talk does scare me. Yeah, small talk is awkward. Some people are great at it.
Starting point is 02:02:56 I was just with Aaron and Erin in Michigan. And Tyrell, I was taking note of it. He's so good at small talk. I couldn't believe the moves he was pulling because we were interacting with a ton of strangers on that trip. What did he, give us some tips. God, what, just like his enthusiasm about anything they were saying. He was so good at like being
Starting point is 02:03:14 very enthusiastic and like just thinking of questions really quick to ask them about themselves. I was really kind of blown away with it. That's nice. And I was like, that's such a skill. You're good at it. And by the way, not to be braggy, but I think I'm good at it. I just don't like it. Like I can do it and you can do it.
Starting point is 02:03:31 Yeah. But it's not preferred for me. And the worst is when you're in an Uber and then you're stuck there. Uh-huh. Yeah. Captive audience. Uh-huh. So I prefer-
Starting point is 02:03:43 Can't you put on that thing? Yes. Don't talk to me. Yeah, you can, which is when I added that to the driver thing, you made it sound like I was being a brat, but it's so ubiquitous of a thing that I think a lot of people relate to that it's on the app itself. But I love it when—by the way, caveat, I don't feel this way with armchairs. So if you're like, oh, no, I know she doesn't like small talk. That's not how I feel.
Starting point is 02:04:06 I love when armchairs come up. Same, same, same, same. A thousand percent. Really? Yeah, yeah. That's very true. No, this is just like everyone's gathered in a spot and you're standing next to somebody and you have to awkwardly talk to them. I don't like that.
Starting point is 02:04:19 Being in an elevator and you got to awkwardly talk. Yeah. Back to the drivers. Agreed with you for the most part on the Uber drivers, but interestingly, I love it when cab drivers talk to me. In fact, I want cab drivers to talk to me, because they're always from somewhere else. So I want to know where they're from, how long they've been here, do they like it, do they hate it, what did they do at home. This is my dad also. I love it. He loves, he's always talking. It's like a big mystery. You sit down and you're like, how'd you end up here?
Starting point is 02:04:47 Yeah. What was the route that landed us together in this moving vehicle in Manhattan? It's nice. I love it. But Uber, I'm like, I know you live here. You give my dad like two beers. Uh-huh. Inhale.
Starting point is 02:05:00 The cab ride home is just like a nonstop chatter. He and I should take a trip together and have some Kingfisher. No, you're not allowed to have that. You've been on a trip with your dad? Not even that. What about an Indian? Do you think I'm an addict in India? Yeah.
Starting point is 02:05:13 Maybe not. Yes. Go back to the motherland. Something might work. Not your motherland. Your motherland. Yeah. A motherland.
Starting point is 02:05:21 A motherland of some people. Never did have a Kingfisher. Speaking of, you gave me a good compliment. Which one? Ding, ding, ding, the sauna. And it's a ding, ding, ding, my motherland. You said I can take the heat really well. And I can.
Starting point is 02:05:35 Yeah. And it's because of my Indian roots. Because you're not, you do the sauna once every three months. Yeah. And you hang in there for the 20, no problem. I can tell you could go longer. Yeah. One time I was in there with my full clothes.
Starting point is 02:05:49 Do you remember that? Yeah, that was crazy. That was crazy. That was dangerous. Anyway, okay, let's get into just a couple facts. Okay. We talked a lot up top about Altoids. Oh, correct.
Starting point is 02:06:03 Yeah. Yes. And we were praising them because the new, well, we assume the tin hasn't changed. Right. But then we thought, oh, it could be retro. It is. It has changed. It has changed. Over time.
Starting point is 02:06:15 But it still keeps a lot of that essence. Okay. It's similar to the old one. And you knew they were curiously strong, right? Or Mustafa didn't know they were curiously strong. I didn't know that either. But it was created in the 1780s. Whoa. Yeah.
Starting point is 02:06:31 Holy smokes. Is that the oldest company in the world? By the London-based Smith & Company and became part of the Callard & Bowser Company in the 19th century. Oh. Yeah, kind of like a pep talk. Duty pants. Yeah. Well, peppermint's good for the stomach. It is.
Starting point is 02:07:05 I use them for my breath, not my intestines. So maybe I was using them wrong. It's double. Oh, wait. Are you supposed to put them in anally? I've been putting them in my mouth. Oh, no. Oh.
Starting point is 02:07:14 It's a reptile. Okay. That makes more sense. Is this suppository? Curiously suppository. Speaking of, I think, I'm not sure, but I think I might have to do some vaginal ones for the next freezing. Some what? Vaginal suppositories.
Starting point is 02:07:32 And how do those work? You just insert there. Just like a butt suppository. Like an Altoid. Just like you would an Altoid. I'm not 100% certain I will, but I might have to. Well, Rob has some experience with that if you need a hand. That's not funny.
Starting point is 02:07:50 Okay. Sorry. Remember to be gentle. This was a practice round. For September 18th? September 18th. This was a test. And it did not pass.
Starting point is 02:08:00 Okay. I'm glad we run these tests. Yeah. Me too. How long did Michelangelo, I mean, how long did it take? Michelangelo to create David? Yeah, to make the David. Yeah.
Starting point is 02:08:10 Why did I, I got scared all of a sudden that it was not him. You thought the name of the statue was Michelangelo for a second. No, I didn't. Okay. I thought maybe it was Da Vinci. Oh, sure. All of a sudden I got scared. That's easy to mix those.
Starting point is 02:08:24 Okay. Oh, sure. All of a sudden I got scared. That's easy to mix those. Okay. The work was commissioned by the Opera del Duomo for the Cathedral of Florence in 1501 and took roughly three years to complete. Three years. Chisel, chisel, chisel, chisel, chisel.
Starting point is 02:08:36 He was only 26 at the time of the commission. Whoa. Wow. And 29 when it concluded? I guess so. I might have said Leonardo da Vinci in the episode. Did I say Michelangelo? I don't think you even—
Starting point is 02:08:49 I could see me getting that wrong. Well, I said I was reading this book, and they said—they asked Michelangelo, how did he make the David? No, that was a different episode. This one was—you didn't say that part. Oh, okay. But you just brought it up. I've been on a kick of saying that. Well, I just read it in a book.
Starting point is 02:09:00 Well, these young prodigies, just like the new U.S. Open women's. Coco? Coco. Yeah. I love her.
Starting point is 02:09:15 I'm trying to get her. She's 19. 19. She's so, did you see the video of her dad crying? I haven't seen any of it. Oh, my God. I just saw headlines that a 19-year-old won. It's so sweet. Her dad, well, first of all, there's this viral video now of her at the U.S. Open in
Starting point is 02:09:31 maybe 2012, when Serena and Venus were playing, and she's like in the stands and she's dancing. She's not a baby, but she's young. She's a little kid. She's there with her dad and she's dancing. Well, we know exactly how old she was, actually. Yeah. Would you say 2012? Yeah. So 11 years ago and she's 19. She's 8. Yeah, she was 8. Okay. We did it. You did it. And she's
Starting point is 02:09:56 dancing in the stands and then you see this full circle moment. It's so beautiful. You'll probably cry when you see it, I hope. I just love dads, you know? Yeah. This dad took her to the match.
Starting point is 02:10:11 Obviously, he's invested in her. Dream. Yeah. And then got to see it. This is the one of her dad. It was the first time I've ever seen my dad cry. He doesn't want me to tell you all that. He thinks he's so hard, but you know, it's not. So thank you guys. I mean,
Starting point is 02:10:41 you believed in me from the beginning. I've been coming to this tournament. My dad took me to this tournament, sitting right there watching Venus and Serena compete. Oh my God. It's so... We love those stories. Oh God, I love them. Okay. The nuclear powers. Okay. Nine countries hold nuclear weapons. Okay. U.S., Russia, France, China, U.K.
Starting point is 02:11:16 Pakistan, India, Israel, North Korea. Mm-hmm. Overall inventory of nuclear weapons is declining, but the pace of reductions is slowing compared with the past 30 years. Moreover, these reductions are happening only because the U.S. and Russia are still dismantling previously retired warheads. U.S. and Russia overwhelmingly have the most. It's like they're both around 5,000 and the next closest is 400 with China. Wow.
Starting point is 02:11:43 Does everyone get scared when we talk about this that like North Korea is going to listen and then get angry? They only have 30. Well, don't say that. Then they'll feel, like, really defensive. Well, it's good that they only have 30 because they probably don't want to waste any. But they might want more. Like, us, we could waste a bunch. Oh, God.
Starting point is 02:11:58 But I think he's going to visit Putin. I know. I saw that. We're hanging out. Okay, now we're going to watch this other video. You know, if you were to follow a busy doctor as he makes his daily round of calls, you would find yourself having a mighty busy time keeping up with him. Time out for many men of medicine usually means just long enough to enjoy a cigarette. And because they know what a pleasure
Starting point is 02:12:24 it is to smoke a mild, And because they know what a pleasure it is to smoke a mild, good-tasting cigarette, they're particular about the brand they choose. In a repeated national survey, doctors in all branches of medicine, doctors in all parts of the country were asked, what cigarette do you smoke, doctor? Once again, the brand named most was Camel.
Starting point is 02:12:46 Yes, according to this repeated nationwide survey, more doctors smoke Camels than any other cigarette. This makes so much sense. Why not change to Camels for the next 30 days and see what a difference it makes in your smoking enjoyment. We only ask people 30 days. See how Camels agree with your throat.
Starting point is 02:13:02 See how mild and good tasting a cigarette can be. That makes so much sense. I smoke camels. Oh, wow. And this doctor smokes camels. That's right. Smoked.
Starting point is 02:13:16 I mean, I didn't even put that together. Wow. It was probably because I was just a doctor. And we are over-indexed on Camels. I was Camel Lights, but I don't think they existed at the time of that. No, no, no, no. I liked it even milder than they were pitching. Let me just tell you, an original Camel's not that mild. Really? Oh, no, that's a, that's a mouthful. Still never done it. Don't do it. Haven't done it. Don't regret it. Not gonna do it.
Starting point is 02:13:46 Okay. So we talked about, well, for a second you mentioned Quaaludes is the drug that we were able to eradicate. Uh-huh. I'm gonna read. This is from Frontline. Oh. Very trusted brand. Maybe even where I got my info.
Starting point is 02:14:03 My guess is yes. Yeah. I'm gonna read, though. I'm gonna read. Okay. This even where I got my info. My guess is yes. Yeah. I'm going to read, though. I'm going to read. Okay. This is Taylor Swift's commencement speech. Oh, my gosh. Okay.
Starting point is 02:14:13 During the 1970s, Quaalude became a widely abused sleeping pill. The key chemical needed to make Quaalude was methaqualone, first developed in India, ding, ding, ding, in the 1950s as an anti-malarial drug. By the mid-60s, U.S. doctors began prescribing Quaalude as a non-addictive alternative to barbiturates. However, by the late 70s, illegal use of the drug had surged, especially among teenagers. Users would, quote, lude out, combining the drug with alcohol to achieve a drunken, sleepy high.
Starting point is 02:14:45 Overuse could lead to respiratory arrest, delirium, kidney or liver damage, coma, and death. As the abuse reached its peak, it was linked to overdoses, suicide attempts, injuries, and car accidents. The drug supply came from legitimately manufactured pills, diverted into the illegal drug trade, and from counterfeit pills coming from South America and illegal labs within the United States. By 1981, the DEA ranked Quaalude use second only to marijuana and estimated that 80 to 90% of world production went into the illegal drug trade. And yet, within just a few years, the DEA got the problem of Quaalude abuse under control.
Starting point is 02:15:20 By 1984, Quaaludes had all but disappeared from the U.S. marketplace. Gene Haislip, played by Gene Hackman. Professor Gene Hackman. Gene Haislip, the DEA's number three man at the time, was the idea man behind shutting down Quaaludes. He came up with a solution
Starting point is 02:15:40 to go after Quaaludes at their source. The chemical manufacturers of methaqualone powder in West Germany, Austria, Hungary, and China. Haislip traveled around the world, convincing the government of every country with a factory that made the chemical to shut down its trade. Well, it took some time, but in the end,
Starting point is 02:15:56 the Colombians would no longer get their drug powder, Haislip tells Frontline. They didn't know what to do. They gave it up. We eliminated the problem. We beat them. Only time. I think it's the only time.
Starting point is 02:16:07 The factory. So now we know the countries. Right. And I was wrong. What did I say? Switzerland or something? I thought I was only right. At least I got Germany.
Starting point is 02:16:15 That's a neighbor. And Hungary. You did great. You mainly knew. The main thrust of my point was that the DEA did shut it down. Yeah, they did. They did. They did.
Starting point is 02:16:24 They did. And then it is said that the same thing could be done for meth. Do you think now? Now it's all, it's like so proliferated. Well, still, meth is only made in India, I think. Manufactured. What they need to make crystal meth is still very, very complicated. Not them taking Sudafed and making meth out of it, which you can do in a garage. But the precursors to all that are only made in India.
Starting point is 02:16:47 And so, yeah. So, yeah, you definitely can't go to the motherland and have Kingfisher. That place is full of meth. Well, I don't know if it's full of meth. It's full of meth and it's full of Quaaludes. Well. Methaqualone. One of my great regrets is that I never got to try it.
Starting point is 02:17:04 No. That's everyone. Everyone that likes drugs has a fantasy about Quaaludes. Really? That's my age. Did you watch Wolf of Wall Street? That did not look like an attractive drug. From the outside, but I think on the inside it was fun.
Starting point is 02:17:20 That's interesting. For someone who cares a lot about being attractive, I'm kind of surprised that you don't care about that. Well, I wouldn't go out to hit on someone and get luded up, but maybe in the evening or sitting around the couch with the boys, who I didn't care if they saw me drooling or burning my fingers with my cigarette, my Camel Light, because I'm a doctor. Oh.
Starting point is 02:17:46 You know. Wow. My father was very into them. I think maybe just because I- He did Quaaludes? Oh, yeah. He loved Quaaludes. Really?
Starting point is 02:17:52 Everyone in the 80s did Quaaludes. And did you see him look all funny like they did in Wolf of Wall Street? I never saw him, to my knowledge, luded out. Oh, really? No. Just drunk. And, you know, who knows? He did coke, too. So I don't know when he was on what.
Starting point is 02:18:06 Yeah. I was a child. That's it. That's it. Mm-hmm. That was juicy. A lot of theoreticals. A lot of theoreticals.
Starting point is 02:18:15 A lot. I wish you would have caught me. We used to play this game when I washed cars for the 14 years I washed cars. Because it's so boring. Yeah. We washed cars all day long. And we would play this game. And the things were, like, insane like insane you know eat a pile of dog poop just you know I would do fucking anything for 100 bucks I know I mean anything well this was interesting I don't
Starting point is 02:18:37 remember if we've talked about this already but we had an interesting group conversation on a trip this summer with our whole pod because it was right after the submarine incident. And somehow we got on the subject of how much would you need to, maybe it was to do that. To go down in that submarine? Again. Again. And in that submarine? Again. Again. And that varies across the board. Yeah. And it is worth saying, few of us were like, no money.
Starting point is 02:19:14 Like, absolutely no money. Right, it's a huge luxury. Yes. Mm-hmm. And then I overheard one person in the group saying, it's interesting how this breaks down. Who's saying yes and who's saying no obviously correlates to how much money you have. Yeah, it's exactly like whose kids play football? Yeah.
Starting point is 02:19:35 Whose kids do, like it's, yeah, it's all across our society. Money is so interesting. It's so funny because, yeah, it's so like on one hand, of course, none of us would do anything that would potentially take us off the planet for money. I know. It's a bummer. Even though I, most of my life, would have done almost anything for it. But it is, on the surface, it's really a bummer. It is.
Starting point is 02:20:15 That's kind of how I felt. I was like, this is such a bummer that the people that I love. This thing has only existed for 5,000 years. This whole thing, money, didn't even exist 5,000 years ago. Out of 300,000 years. Yeah. I guess it just feels heightened because we're all in the same group. Uh-huh. So then to see it break down like that, and to see some people, people who I love, like, yeah, if you pay me this amount, I'd do that. No, you'll die. You will die. You are not allowed to do that. Yeah. And they would. Yeah.
Starting point is 02:20:47 Money is. There's a lot of dads. Maybe some. I just know dads. There's a lot of dads that would die if it gave their family a ton of money. And they would do that and not realize that the family would never, ever want the money. Exactly. But I know several dads that would. I've heard many talk about it.
Starting point is 02:21:08 Oh, my God. That really breaks my heart. Yeah. It's just so backwards. And obviously, you're not even taking the time to put yourself in that position because you would never want your dad to die. Like, who would want their dad to die? The Menendez brothers. But yeah, with exception. Okay, we're not talking about abusive parents. No. Like, that just makes me so sad. Yeah. It's kind of an Easter egg for something, another sad episode coming up. Oh, yeah. It's like the things you'll do to feel societally loved.
Starting point is 02:21:48 To have status and be worthy of love, yeah. Ugh. Okay, let's talk about St. Louis. Let's clear the pail. Okay, so I'm pooping in public. Listen, I know that Monica's 350. Did you say 300? 750.
Starting point is 02:22:03 No, it was three. Down to 100,000. And I'm 500. Here's the point. Right now, you're 350. I'm 500. Yeah. You can get both of us for 750. That's $100,000 savings.
Starting point is 02:22:17 Okay. If you get both of us to come to the party. Yeah, because that'd be fun. Yeah. Then we do it for free. If I get to go with you, then I'll just go. Don't tell them that. No, remember, they want to pay.
Starting point is 02:22:29 Oh, right. They're dying to pay. Yeah. They have to, actually. It's the law. Okay, legally. Yeah. Okay.
Starting point is 02:22:34 Love you. I'll see you there. They love you.
