Armchair Expert with Dax Shepard - Reid Hoffman Returns (on an optimistic AI)

Episode Date: January 29, 2025

Reid Hoffman (Superagency: What Could Possibly Go Right With Our AI Future) is an internet entrepreneur, author, and co-founder of LinkedIn and Inflection AI. Reid returns to the Armchair Expert to discuss Socrates’s musing on the failings of writing, an argument for pushing back on confirmation bias, and why you can’t have science or the middle class without the printing press. Reid and Dax talk about why AI is different from other technologies that have changed our lives, how the power loom in England displaced hand weaving, the original small business, and ChatGPT being an accidental mic drop. Reid explains that the way we get toward a healthier society is by establishing networks of trust, runs through some fun AI prompts, and discusses why, as biological creatures, we first go through what could go wrong instead of right in any given situation.

Follow Armchair Expert on the Wondery App or wherever you get your podcasts. Watch new content on YouTube or listen to Armchair Expert early and ad-free by joining Wondery+ in the Wondery App, Apple Podcasts, or Spotify. Start your free trial by visiting wondery.com/links/armchair-expert-with-dax-shepard/ now.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 Wondery Plus subscribers can listen to Armchair Expert early and ad free right now. Join Wondery Plus in the Wondery app or on Apple Podcasts, or you can listen for free wherever you get your podcasts. Welcome, welcome, welcome to Armchair Expert, Experts on Expert. I'm Dax Shepard and I'm joined by Lily Padman. Hi. Hi, nice stripes.
Starting point is 00:00:20 Thank you, I'm in stripes today. You look like a convict. Yeah, I'm going for that look. That's a new look, they said 2025 is convict. That's right. Convict's the new black. Returning guests today. Yeah.
Starting point is 00:00:34 But in person this time. Yes, we had them during our COVID era. That's happening a lot now, we're starting to get folks that we had done during COVID. And I gotta say, it feels in person like it's the first time we interviewed them. Definitely. I mean, I feel bad saying this, but that time feels lost.
Starting point is 00:00:50 Like it almost feels like those episodes don't count. Well, I think we had a lot of great, I know we had a lot of great episodes. We did. Eric Lander, that's a phenomenal episode, but it didn't embed itself in my memory. None of the ones that were over Zoom embedded. Yeah, same.
Starting point is 00:01:06 It's like, I don't really remember the people. I know. So we just gotta have them all back. We gotta have them all back. Yeah. Talking to you, Barry. Yeah. I don't know if that was even COVID.
Starting point is 00:01:15 He just didn't want to come. Well, right. Let's be honest. He did not want to come. So that's why we found out Kristen wasn't here. But anyways, Reid Hoffman's here. And to remind people, Reid Hoffman is a software entrepreneur and an author, he is the co-founder of LinkedIn,
Starting point is 00:01:33 also a leading voice in AI. He's one of, if you listen to our favorite podcast. Acquired. He's all over it. He was one of the first Meta people, he was the first, you know, he's- He's a huge investor in massive companies. Yeah, he's got a damn good eye
Starting point is 00:01:47 for what's going to actually work and be desirable by people. He has several books: Masters of Scale, Blitzscaling, The Alliance, The Start-up of You. His new book, which is out on the 28th, is Superagency: What Could Possibly Go Right With Our AI Future?
Starting point is 00:02:08 And he wrote that with Greg Beato. So I want to give a shout out to Greg. This was really, really fun. And it led me to design a van that I'm in love with, with AI. That's really cool. Anyways, love Reid Hoffman.
Starting point is 00:02:24 Please enjoy Reid Hoffman. He's an armchair expert. He's an armchair expert. He's an armchair expert. My first curiosity, Reid, and this is a sincere question: what's the longest you are anywhere? Because what I've been able to observe from a distance is you seem to be somewhere else every three days. It can be the stay and then move, move, move, move, stay.
Starting point is 00:02:58 Is home Silicon Valley? No, home is Washington state. Oh, what do you mean? Yeah, it's just outside of Seattle. And you're on a lot of boards. Do you have to be in person at those meetings or are those now Zooming? So it used to be all in person.
Starting point is 00:03:10 And now one of the benefits of the kind of post-pandemic era is, like, well, frequently in person, but if I happen to be in Rome this time and it's just not convenient for me to come back, I'll Zoom. And so it's kind of a mixture, but with more in person. Okay, so you've written another book. Yes.
Starting point is 00:03:30 You're in so many different things. You're a very active investor. You're on the board of all these companies. What's the inspiration to write a book? It doesn't make sense, I don't think, almost anymore for anyone to only write a book. But it still does make sense to write books. And the reason why you write a book is because you want to think through something deeply.
Starting point is 00:03:47 The process of you writing it, you're thinking about it, you're circulating it to your friends and critics and getting them to challenge you and you're thinking about why it is I hold this particular view very deeply. And then let me elaborate it in terms of the detail of the argument. And some people, small number of people,
Starting point is 00:04:05 really like reading books. So like, okay, if you're going to go write the book to think through it, then also produce it and kind of have it available. Yeah, why not? One of my fun book blurbs on the Superagency book is from Yuval Harari, who was basically like, great, compelling future of AI and humanity.
Starting point is 00:04:21 I disagree with some of the main arguments, but you should read it and I hope he's right, decide for yourself and that's why you do a book. So that people who are wrestling with what is truth here can engage with you in some depth. But what you should really do is podcasts and social media and a bunch of other things, because that's where millions of people are actually going to engage with what you're doing.
Starting point is 00:04:43 So you shouldn't just do a book. If I had to do one, I'd do this, not the book. The book is the thinking to get ready for all the conversation. It's like an elongated pressure test of it. Yes, you're right. And when you're hearing your own argument in print, you somehow can have some objectivity about your argument. That makes total sense in your head. But once you're reading, you're like, that's a little weak.
Starting point is 00:05:04 Can't change that. Yeah, that doesn't work. It's funny you bring up Yuval, because one of my only pushback questions is something when he was here, he brought up, which I found to be very compelling. I love Yuval. And actually, one of the things that we discovered in arguing about it is,
Starting point is 00:05:16 frequently in arguments with critics, I'll find that I disagree with the question or I disagree with kind of the variable. Yuval and I see the elements of the world in very similar ways. We just have different assignments to variables. And that's part of the reason why he's like, look, I fear that I'm right and I hope that you're right.
Starting point is 00:05:32 Yeah, yeah, yeah. Oh, sure. And it's like, okay, let me tell you why I think we should be motivated by hope, not fear. Yeah. Yeah. So maybe let's start with, cause the book starts with Socrates musing on the failings of writing. So tell us about what Socrates thought of writing.
Starting point is 00:05:47 By the way, some scholars disagree with this particular interpretation because scholars have all kinds of different interpretations, but I think it's a reasonable interpretation. So Socrates in Phaedrus was, I mistrust the written word versus the spoken word. And I mistrust it because when I'm here, when we're speaking words together,
Starting point is 00:06:04 I can hear how you're hearing my words and I can adjust it. I can be dynamically interactive with you. I can challenge, I can question. Yes, so the dialectic process of this discourse in the spoken word is more true, more close to reality, more a shared learning experience than what might happen with a written word, which could be who knows where it goes,
Starting point is 00:06:26 and it could be abused, and it could be misrepresented, and it could be misunderstood. Mainly that. In person, you can read intention. This happens a lot when we're texting. I'm like, we gotta just get on the phone. Yeah. Because I think we are not on the same page.
Starting point is 00:06:38 That was a very short sentence. I feel like you're mad. Yeah. Phone now. Exactly. The first recording technology is the written word. There were even challenges then, and then through all major technological innovations.
Starting point is 00:06:52 And that's why the opening with Socrates in Phaedrus. But of course, the written word is essential. We do not have anything of the modern world without the written word. And that's not to say that Socrates' concerns are invalid today, with all our texting. Yeah, exactly. Well, I was gonna add something
Starting point is 00:07:08 he couldn't have anticipated, which is there's something that gives validity to a concept or an idea, simply the fact that it made it to print and made it to your newspaper. There's some other magic trick where you kind of overinflate the information you're reading because it has gone through, you think, some steps.
Starting point is 00:07:25 Yes. You wouldn't have even predicted that. I think that's largely what happened on the internet is like people are reading things and it's in the shape of a New York Times article and it has a headline and you're like, this is news, this is substantial. A gatekeeper came through here, but not always.
Starting point is 00:07:38 Yeah, this is quality information because it looks like the information I associate with quality. Oh, and by the way, the mistake that most people make, which we now even have to steer harder against, is, well, it's quality because I naturally agree with it. It has confirmation bias. Yeah, yeah, yeah.
Starting point is 00:07:51 And you're like, no, no. It intuitively feels correct. Yeah, it feels like it's right to me. And so it's what the people in my community are telling me, et cetera. Every generation of technology, you have to have the, no, no, let's think about why do I have that bias? What's the challenge of it?
Starting point is 00:08:03 What is true on this? And realize that my reflexive belief is not truth. My first question when I like an opinion should be, oh, do I benefit from this worldview at all? That's an interesting place to start. I think even if we could admit to ourselves, yeah, I have this opinion that sounds logical and sound, but hold on a second.
Starting point is 00:08:23 Would I somehow benefit if that were true? And generally, almost all my opinions, yes, in some way or another, I would benefit from them. You're living your life in a way out of what you believe, and so therefore, if you suddenly go, oh, action vectors is a thing I believe, well, how do I now lead my life so that I'm benefiting
Starting point is 00:08:41 from the world that I believe to be true? It's not necessarily a challenge of the validity of your belief, but you should ask the question and you should adjust. Yeah, my conclusion was I should most trust the opinions I have that I actually don't benefit whatsoever from. Not necessarily true. So for example, you go, Reid, are you writing Superagency because you're investing in these AI technologies
Starting point is 00:09:23 that you're going to make money from the companies? If I said, oh, I actually don't believe in this, I believe in producing new plays, then I would go do the producing of new plays. I think asking the question of, well, where is your self-interest, is an important one. But if you said, oh, well, you shouldn't do business where you think the right thing for humanity is, well, that's silly, because business is one of the great creators
Starting point is 00:09:40 of the middle class and all the prosperity that we have and so on. So you actually should align them. What you should do is not do businesses that you think are bad. You go, oh, I think the cigarette business is unhealthy for society. I don't do anything with the cigarette business.
Starting point is 00:09:55 I invest in the businesses that I think are good for humanity. Pro-social. Yes, and by the way, that then is an amplifier. For example, podcasting and being in dialogue, that's good. You can invest in the microphone business. You can invest in the camera business. That's fine.
Starting point is 00:10:09 That's oh, you shouldn't then talk about microphones because you invested in the microphone business. Well, it's nutty. Yeah, yeah, yeah. That's true, that's true. I guess I was more thinking, I've not advocated for many things in my life, but one happened to be gay marriage a long time ago.
Starting point is 00:10:22 Well, I certainly don't benefit from that. I don't have a gay best friend that was urging me to get involved who I would then win the love and approval. It was just kind of, okay, I trust that opinion because I don't really benefit from it. Yeah, that's how I feel about taxes. I'm happy to give a lot of money away. I should probably want to keep it, hoard it,
Starting point is 00:10:41 but I do think it's telling that this is actually against my own benefit, but I believe it as a value that it's important for society. Usually it's complicated because the reason you're willing to do that, of course, is you benefit from this great society that we live in with kind of the rule of law and all the rest. So it's what you perceive to be actually a benefit. Yeah, you're right. That we collectively do. And by the way,
Starting point is 00:11:02 do you think that the right society is one that allows people to be people, and actually in fact, sexual orientation is just people. Or is this other, like for example, someone who has a religious point of view that's really negative on that, goes no, no, no, because that's in your interest. So I actually think the right way to approach interest
Starting point is 00:11:18 is to be clear about here's why I believe my interests are right, and my interests align with a good future humanity. You're right, gay marriage is a single issue within a larger just society I want to be a part of. Yes, and of course I'm arguing for things that may fit my ideology, may fit my economic thing. But by the way, the idea is to be truthful
Starting point is 00:11:39 and in discussion about it and transparent about it and in dialogue about it. And that's why Superagency, I actually view this to be good for humanity. Yeah, so there's a lot of perfect, well, they're imperfect, but they're damn near perfect parallels throughout history for this current AI revolution. So I think we should go through a couple of them.
Starting point is 00:11:57 We could start with the printing press. What were people's thoughts, fears, concerns? How does it parallel AI? So here's what's stunning, which most people won't realize, I think printing press. Of course it's great. I mean, here we have all these books. It's how we create a middle class.
Starting point is 00:12:09 We can't have science without the printing press. You have to be able to print and disseminate information in scientific journals in order to have science. All of our modern life is based on scientific advancement. It's super critical. And yet the discourse around when the printing press was created was, this is the destruction of human society. It's the destruction of knowledge. It's the destruction of human capabilities like memory.
Starting point is 00:12:32 And so a lot of the discourse we're having right now around AI was the discourse around the printing press. And there's both good news and challenging news around this. The good news is, of course, we get to the other side of the printing press, and modern society is enabled. The challenge is, we had nearly a century of war because of it. Because of all this religious war: who can distribute God's Word in printed form? Martin Luther said, we have a text, you don't need a bishop and you don't need all these people.
Starting point is 00:13:03 All you need is this book. And so the Roman Catholic Church, which was the hegemonic power of the day, was like, this is the end of us. Exactly. And so we as human beings adjust very badly to new technologies, and we create a lot of problems in doing that.
Starting point is 00:13:20 And that's true of every major technological revolution. We've encountered this before, we're encountering it now again. I'm trying to get us to think positively and creatively about how to navigate these otherwise very difficult transitions. And by the way the transitions are always gonna be painful. I'm not saying it's pain-free, it's easy, kick back, sip your margarita, it's all gonna be fine. That sounds nice. Yeah. The transition will be difficult. There will be lots of challenges
Starting point is 00:13:48 and I don't wanna have to learn this new tool and I don't wanna have to adjust this new world that the new tool creates. And by the way, this will change political power and change economic power and change the power and the position I'm in and it will change my job. Yes, there's gonna be a bunch of that. That's not easy and say,
Starting point is 00:14:04 well, who are you to make that happen? Yeah, you clearly already are the beneficiary of this thing. You're on the other side of this. Because you're making it and it benefits you. This is classically how human societies work and make progress: we have freedom to make new technologies and we have freedom to make new businesses. We offer them to people as consumers and they have the freedom to adopt them. And that's the normal process of it. And that's of course where the AI revolution is happening. As it begins, that's a whole set of freedoms that enable that.
Starting point is 00:14:31 So what was the historical context from which the power loom arose? What percentage of the world was occupied with the production of textiles and stuff? How significant of an industry was it? There was a huge weaver industry, especially in England. People were like, oh, it's kind of the original small business. What will the orphans do that we have in these loom factories from the Dickens novels? That we're not paying or doing anything.
Starting point is 00:14:56 And this is where the term Luddite comes from, because Ludd was like, this is the destruction of our way of life. This is the destruction of our industry. We need to bomb the power looms. We need to kill the factory owners because this is gonna change everything. And it did, of course, because now home weaving is a hobby.
Starting point is 00:15:14 Yeah. Right? There's some Etsy in this. There was still some business about the production of all the clothing that we are all wearing here. None of this is the home weaving. But without embracing that power loom,
Starting point is 00:15:25 without embracing that part of the industrial revolution, we don't have a middle class. We don't have clothing that everyone can wear and we can't clothe the world. And new industries are created. And they said, okay, that industry massively transformed. It was suffering and pain for the people in that industry. Part of the reason why there was a rebellion around it.
Starting point is 00:15:43 If the Luddites had won, England would have never had an empire. England never would have technologically progressed. It would have been other places that did that. Their children, their grandchildren, only had a future by embracing the power loom. Well, it's hard at that moment to predict that, okay, at this point,
Starting point is 00:16:00 only 2% of the population is buying a new garment a year because the price is so expensive, because it is hand-done. They're not forecasting, well, those jobs go away, but now all of a sudden everyone's buying clothes. There'll be clothing shops, there'll be all these different things. It could in fact explode exponentially versus this little pocket we're trying to protect. Yes. What did the telegraph do? All of a sudden, the telegraph means that information can transit at super fast speeds,
Starting point is 00:16:28 that it's not controlled through the people who control either the printing press or the messages by which the printed messages get there, whether it's government, security, or other kinds of things. So we were the current channel distributors of information, and so, with the people who are doing it now, there's a change of who has power over the immediate information.
Starting point is 00:16:46 So all this is bad, this is a destruction of society, you can't cross check it, you can't look at it, the current institutions of knowledge and truth should be the ones who are doing it. And so again, resistance. And every new technological thing, especially around information, but not only information, causes that no, no, no, we should stay with where we are,
Starting point is 00:17:04 not go to the new one. And by the way, you can always identify all the problems of the new one, especially at the very beginning, because by the way, we haven't remodified it, we haven't done iterative deployment and changed it to go, oh, here is how we get the vast majority of the benefits and here's how we squash some of the bad things, right? Seat belts, safety glass, airbags. Yes. We were just talking about that with Little Women. Oh, yeah. Yeah, I just watched Little Women a couple days ago. Yeah, and at the end they're showing the making of that book and how insane the process was to make a book.
Starting point is 00:17:32 Hand stitching the binding. I was like, oh wow, we think there's wealth disparity now. But back then, to read a book you had to be so rich you could get your hands on that one copy that took that long to make. Everything's a little bit more democratized with these advancements in technology.
Starting point is 00:17:49 It's in hindsight that you look back and you're like, oh, that was good, but throughout it's tough. The transitions are always tough. I'm not trying to tell people the transition's not tough. We'll get to the negative shit. We're just going through the history first. The one in the book that I had never stopped to think about, obviously I grew up in the era of it, is the telephone.
Starting point is 00:18:06 Because we're pointing out different things. There's a lot of concerns with AI. One is jobs, one is how is it gonna mold culture, what's it gonna do to inform it, all these kind of things. But of course there's a lot of kind of more societal interpersonal things and I didn't realize that when the phone debuted, people thought it was gonna destroy civil society because you would no longer go
Starting point is 00:18:26 to someone's house to visit them. That it would completely get rid of any kind of interpersonal in-person, which is kind of the same thing we're yelling about right now. I had no idea that there was an uproar about that. We forget it because we go, oh, it's so natural. And of course, the telephone, great, because rather than, I'm not gonna go by your house
Starting point is 00:18:44 to visit you because I don't know if you're there, and go, hey, are you there? Hey, can I come by? Great. And all of a sudden, increased social connectivity, right? Opposite outcome as they feared. All right: the camera, the radio, the automobile. A small list of technologies that have completely changed our lives. How is AI different than those? What are the ones you can admit to? Because I guess, right out of the gates, none of those technologies has the potential to be fully autonomous, out of our power, out of our control, savvier than us. I do want to say there are some major differences
Starting point is 00:19:14 between the previous technologies and the current one. 100%. And by the way, in each time, and I'm not saying that to be dismissive, each time there were new differences. It's not like the new technology was exactly like the old ones. So there's always new things. Oh, it's much faster.
Starting point is 00:19:26 It's being adopted much faster. And then this one's much faster too. And that doesn't mean much faster is always okay. But each time it was, oh, it's being adopted much faster and it worked out. Now, maybe that's until it doesn't. So speed is one thing. Because now that we all have smartphones,
Starting point is 00:19:39 I will launch a new AI technology. And literally it could be tomorrow, a billion people are engaged with it. Now the other parts of it are, so one, speed. Two is cognitive. Even though some of them, like the printing press and other things, had cognitive participation and changed the way that we function, communicate, trust information, share information. I've got to add, Martin Luther did succeed in raising literacy rates from like 3% to 95%. And in the process of learning to read,
Starting point is 00:20:09 that cognitive ability did displace some of our other abilities. It did permanently change the structure of our brain, interestingly. 1,000% for good. Yeah. Yeah. As part of back to why'd you write a book is because actually writing a book is a way of thinking. And it changes our patterns of how we think.
Starting point is 00:20:24 Because the way we think now is perfect. We don't need to think better. And you're like, yes, we do. How do we get to the new ways of thinking? I think it's terrible. I don't know how everyone else is feeling about their thinking. Hey, your thinking seems fine.
Starting point is 00:20:34 No, I have to shoot down four bad ideas before I ever get to a good idea. But that's good. It is good, yeah. It's discerning. It's part of the process. Part of the technology is first we reject them, then we integrate them in terms of how we're being human.
Starting point is 00:20:47 Back to the question of what's new: one is speed. Another one is it's a depth of cognitive capability that is completely new from anything we've done before. Already AI writes better than the vast majority of human beings. And so it's like, oh shit. Makes music better than I can
Starting point is 00:21:04 and does a lot of things better than I can do. But part of it is, the instruments can make better sounds. I can't reproduce those sounds with my vocal cords, so when I play the cello or the guitar, I'm playing new sounds, and I can only play those sounds with that instrument, and they're superior to a human sound in various vectors. So now I have to adjust to: I'm talented with my hands and the guitar, I can make better sounds with the AI doing it, and the AI can make sounds, but I can still add something to the mix.
Starting point is 00:21:31 And I'm adjusting as I integrate it in, and I'm shaping what we create, and I'm adjusting myself to doing it. And that's the progress of human technology. But the new zone is, this is a set of different creativity, and cognitive, and expression, that's totally new from before.
Starting point is 00:21:45 And then you add, and it also is capable of being agentic. What's agentic? It can operate on its own. That's a derivative of agent. Yes, exactly. And that's why I chose to deliberately steer into it because that's kind of at the root of what most people go, is like, am I losing my agency?
Starting point is 00:22:01 Will I be the agent of this thing? Exactly, and that's why I deliberately steered into the fear with the title Superagency. You can go, oh my God, AI is coming for me, or oh my God, I can use AI. You only get to the positive future by taking the steering wheel. And that doesn't mean that it's perfect. In every technology, your agency transforms. It isn't that you had the exact same agency before, plus some new things.
Starting point is 00:22:27 It's new agency. Trade-offs. But so far, every new agency has been massively better than the old agencies. The thesis is, and this one can be too. Let's put a bow on the kind of employment fear. So what are current unemployment rates versus in the past throughout any of these huge technological revolutions?
Starting point is 00:22:48 When we did the agricultural revolution to the industrial revolution, it was difficult. All of a sudden, people moving into cities, not having jobs, being displaced from farms, a stack of different things, and how do I learn these new jobs? The new jobs aren't created yet, and all this stuff. People say, why are you irritated by the critics? Not irritated by the critics. I think the critical process is a good thing for saying how do we steer through the transition?
Starting point is 00:23:11 Where do I predict will be the first major replacement area? It's actually customer service. Because any place where we're trying to get the human being to just be on script, yeah, I can do the script better. Rather than, well, we've outsourced the job to some Filipino or Indian or other worker who is just following a script. Can't actually pull a lever.
Starting point is 00:23:30 It's this enormously irritating experience, which by the way, since it's a cost center, we all have the experience that you get on the phone and you're waiting and you're waiting. Now, customer service is like, I'm here right away. Yeah. Right. And what's more, I'm much better than those previous scripts because I've got this dynamic kind of AI process.
Starting point is 00:23:47 The AI is trained to be much more compassionate, empathetic, patient. It's never triggered by your tone. Yes, exactly. Where you're like, fuck you, I'm so pissed off. This is my 10th call. Yeah, and I'm like, I'm really sorry about that. And I understand it's difficult.
Starting point is 00:23:59 It makes the human on the other end, like if you're interacting with the AI, you feel dumb yelling at it. You're right. Monica, look at me. So many ways you just are like, whatever. You'll see more next year and the year after. You say, oh, well, there'll be no customer service people, and we're like, no, no, no, but the customer service job will now be different, which is, I'm reviewing how the experience with the agents works.
Starting point is 00:24:29 Then I went, I have an idea about how we could make this a lot better. What if, for example, the agent started with a, hey, obviously you're calling me because something was difficult, I'm really sorry to hear about that. Then let me end with, here's a little thing that might make your day more delightful.
Starting point is 00:24:42 I guess the unemployment point I was hoping to make was, I listed seven of these that were gonna end all employment. And I think currently our unemployment rate is about as low as it's ever been. As low as it's ever been. Historically. Yes. So in spite of all these things, once it's flushed out and we get through
Starting point is 00:25:01 the rocky part of it, our unemployment tends to go down and down and down. We have very natural reasons to try to enable employment. A, we want it as a society, much healthier society. But B, if I can figure out a tool that helps enable employment, then a bunch of people are gonna wanna use it to enable employment.
Starting point is 00:25:18 One of the parallels I frequently use is Airbnb. One of the reasons I led the Series A in Airbnb is, if you can enable a bunch of people to create new ways of traveling, new ways of staying in places, new ways of experiencing locations, that's a really good thing. Does it transform some of the hotel experience? Sure. Does it eliminate hotels? Absolutely not. Because there's still reasons at different times you want hotels, but it enables a bunch of micro-entrepreneurship. Opportunities and jobs are created, and say, well, can you predict them right now? No, nobody could have predicted Airbnb off the internet and the mobile phone. Or for me who loves to drive,
Starting point is 00:26:10 that you would get in a car with a stranger and have them take you across town while you're drunk. The first thing you're taught in kindergarten, don't get in a car with a stranger, is we're all doing it. We were trained, now no longer. Yeah, it's wild. Right.
Starting point is 00:26:23 You have a couple of good quotes in the book. One of them I wrote down that I like was, you'll never get the future you want simply by prohibiting the future you don't want. I think that's like a good admission to make right out of the gates. You're not going to get anywhere by just throwing darts at it. Yeah, it's perfectly good to be critical.
Starting point is 00:26:40 I welcome the critics. But if you want to be helpful, put your criticisms in ways of saying, this is the kind of thing we want. Not, no, no, you can't change. Yeah, start delineating what the thing is you want. This technological wave is coming. Because by the way, even if you said, I can stop this technologist from making it,
Starting point is 00:26:55 there's tens of thousands and millions of those in other countries, other places. It's happening. There's no version where America sits this out. Yes. So forget it. So help steer it. Yeah.
Starting point is 00:27:07 Because of healthy human outcomes. Cause people are like, so why don't you slow down? If you slow down, then other people will shape it too. I use a driving metaphor. Doesn't mean close your eyes and hit the accelerator. That's kind of dumb. Accelerate to the maximum possible safe speed.
Starting point is 00:27:22 That's a good idea. Okay, so I think the first AI product that hit our doorstep in a way that people understood was ChatGPT. How was it different and how successful is it and why? They thought we were doing a research release. They just thought, hey, we know this is coming. Let's start getting some human feedback and then boom.
Starting point is 00:27:42 Oh wow. The mic drop. It was a tweet to announce it, right? It wasn't like a press conference. They didn't anticipate any of that happening. All of a sudden it was like, oh my God. So what did it do that you think captivated people? It made it accessible.
Starting point is 00:27:55 Just about anybody who could have access to a smartphone or a computer could go interact with it and then see something magical. Whatever your thing was, which could range from the mundane, like, well, I got this shit in my refrigerator, what can I make? There's a recipe that's actually pretty good. To what's interesting to my kind of esoteric interest.
Starting point is 00:28:14 Like I have a philosophy degree. So I said, what would Wittgenstein make of these modern AI things? And it writes a little essay. And you're like, whoa, that's pretty good. Yeah. And you, as a philosophy major, found that critique valuable.
Starting point is 00:28:27 It was a good thing to interact with. I should test it with automotive shit. That's the only thing I'm an expert at. You absolutely should. This is what I try to prompt people into, prompt, pun delivered, is to say, do it on something, not just on the new things you don't know how to do. Like write a sonnet for my kid's birthday.
Starting point is 00:28:42 Do that too, that's very cool. Yeah, that's fun. Right, fun to do. You should do the new superpowers: make some hip-hop lyrics on this thing that I care about. Do the things that are completely out of your skill set, but also do it for things that matter to you. These new AI tools like ChatGPT, they're great at taking a role that you ask them to take. So a simple role is, I'm gonna make the following argument, make the counterargument. Okay, right, just argue against me. We would love that, you and I.
Starting point is 00:29:06 What would we do? We'd just sit there. We'd stop talking to each other. No, no, no, no, no. Here we go, now we're on the back. That's the end of our friendship. Not at all, you get better. That's true, you learn.
Starting point is 00:29:13 By the way, you could train it on me. You could go like, what would Dax Shepard's counter argument be? And I might be able to create it. It could probably predict our entire conversation. In doing that, you could then make your conversation even better. Guys, going to the edges, thinking of things,
Starting point is 00:29:26 Guys, going to the edges, thinking of things, and oh, argument three, it's better for me to do that. When I was doing Superagency: what would the counterarguments be? And you can't put out this infinite 2,000-page book with all the counterarguments, but you go, okay, let's address the really important ones, let's put the key ones in to make that work. That's the simplest role-taking. Another one is, what would a race car driver
Starting point is 00:29:43 think about what I'm saying? What would a manufacturer of race cars think about what I'm saying? What would a manufacturer of race cars think about what I'm doing? What would a future car be like here? Be the technological innovator, be the race car driver, be the industrialist. Be the energy force. You can do it from any perspective
Starting point is 00:29:56 in the automotive industry basically. And be in dialogue with that now too. How would an ad agency sell this new thing? Exactly. Stay tuned for more Armchair Expert... if you dare. On January 5th, 2024, an Alaska Airlines door plug tore away mid-flight, leaving a gaping hole in the side of a plane that carried 171 passengers.
Starting point is 00:30:23 This heart-stopping incident was just the latest in a string of crises surrounding the aviation manufacturing giant, Boeing. In the past decade, Boeing has been involved in a series of damning scandals and deadly crashes that have chipped away at its once sterling reputation. At the center of it all, the 737 MAX, the latest season of business wars, explores how Boeing, once the gold standard of aviation engineering, descended into a nightmare of safety concerns and public mistrust, the decisions, denials and devastating consequences bringing the Titan to its knees, and what if anything can save the company's reputation.
Starting point is 00:31:00 Now, follow Business Wars on the Wondery app or wherever you get your podcasts. You can binge Business Wars, The Unraveling of Boeing early and ad free right now on Wondery Plus. New year, new resolutions. And this year on the Best Idea Yet podcast, we're revealing the untold origin stories of the products you're obsessed with. And we promise you have never heard these before.
Starting point is 00:31:22 Ever wonder how the iconic Reese's peanut butter cup was invented? Because it was by accident. H.B. Reese, a former frog salesman, True story. Stumbled upon the idea after accidentally burning a batch of peanuts. Classic.
Starting point is 00:31:35 Proving that sometimes our best ideas arise from what seem like our biggest mistakes. And Jack, did you know there's a scientific explanation why humans crave that surprising combo of peanut butter and chocolate? I didn't, but it sounds delicious. It is delicious. So if you're looking to get inspired
Starting point is 00:31:50 and creative this year, tune in to The Best Idea Yet. You can find us on the Wondery app or wherever you get your podcasts. And if you're looking for more podcasts to help you start this year off right, check out New Year, New Mindset on the Wondery app. Who knows, your next great idea could be an accident that you
Starting point is 00:32:05 burned. This is Nick and this is Jack, and we'll see you on The Best Idea Yet. I think so much of the fear with AI, but I think every single one of these revolutions, is ego. Because we have a friend who uses it all the time for work, and she'll just be like, here's some points, put it in a feasible email that I can send out. And when she was saying that, I was like, that's cheating. I devise an email. You can listen to the Ozempic argument.
Starting point is 00:32:37 Yeah, it is. It's like, that's cheating. I do that because I have the skill to do that. I want to protect the rare skill. You want that to still be a valuable skill. Yeah, because I have it skill to do that. I want to protect the rare skill. You want that to still be a valuable skill. Yeah, because I have it. That's right. But of course when it's the music thing I don't have,
Starting point is 00:32:50 That's right. But of course when it's the music thing I don't have, I'm like, yes, I want that. If AI can make everyone's all, I'm fucked. It's just ego, ultimately. And the idea is don't keep your ego rooted in history, make your ego part of the future. I have the skill with email, great. Now use whatever your chat tool of choice, Copilot, whatever, and make it even better.
Starting point is 00:33:06 I know, it's transitioning into, okay, well that is gone. Now imagine Monica plus this thing. Exactly. One of the things we will do at the end of this podcast, I'm gonna pull my phone out of my pocket, I'm gonna pull GPT-4 up on audio mode, and I'm gonna prompt you into using it,
Starting point is 00:33:21 because I'm gonna show you some of it. I need you to. Oh, this is exciting. Yes. Hold my hand out again. No, we will do that. I literally did that with my mom at Christmas. This is me having to get grandma's phone out and show her the icon for podcasts and show her how to get there.
Starting point is 00:33:33 Because I thought about it, I said, look, what's my Christmas present in addition to all the little nice things to my mom? You better have bought her a house, you rich bastard. I did that too. Of course I did that too. I did that years ago. But part of what you want to do is share the journey with the people you love.
Starting point is 00:33:49 So she loves painting. I put ChatGPT on audio mode and I said, what would Monet be doing with current AI image generation things? How would you imagine that Monet would think about this? If you love painting, what are ways that you'd use AI to help you paint better?
Starting point is 00:34:05 Show her something that mattered to her in this. So Reid, one of my curiosities is if you give it the same prompt. So I've consumed some AI stuff that I find incredible. There's a dude who makes these mashup songs. So one that I loved is he said, do Metallica as Yacht Rock. And it came out with a song that was truly a banger. I played it on him. It's a great fucking song. My question is, if he put in that identical prompt over and over again, it wouldn't always spit out that song, or would it? Would it come up with a new song every time?
Starting point is 00:34:31 It won't be necessarily exactly the same. These are very large probabilistic computers. So even if you scrubbed the entire chat history back to where you were starting, there's probability variables. Does time and space intervene? Like you could ask in front of mom, show me Monet's current take on blank.
Starting point is 00:34:47 And then three weeks later, you do it. Would you get the same images? If you set up the exact same thing, you'd get something probably very similar. Now the changes are, it's a probability thing. So there's random number generators below some of it just to make it performant. There's algorithm updates. I was doing it on my account. It has all of my history in there.
Starting point is 00:35:09 So the fact that I've done this before, if it sees it, it will track that and then go, ooh, we probably want something new. Oh, right. It's learning what you want. So there's a bunch of different things. They're making changes. So it's not necessarily the same. But if you kind of said, okay, fresh phone, here's the Monet prompt, you'll probably get something that sounds very much like the earlier one, even if it's a little different, because it's like a best guess of how experts think about impressionist painting and what Monet was so revolutionary at. I want it to be different, because again, if it's
Starting point is 00:35:37 gonna be something that extends my agency, I don't want everyone else to have the exact identical. It's very easy to make it different. Here's some of the fun stuff. So I was beginning with some of the easy prompting. Tell it to take a role. Be Ferrari in your answer. Enzo. Be Enzo in responding to me. And it'll go, OK.
Starting point is 00:35:54 OK. Yes. Exactly. Right? For example, some of the fun prompting is do things like, consider the Zen koan, what is the sound of one hand clapping? And do whatever prompt you're doing. That'll give you a different answer.
Starting point is 00:36:06 Wow, just let that swirl as you give me the answer to something unrelated. Yes, by the way, one of the things about being creative people, oh my God, I could do so many more creative things now. It seems to me from the outside, like it was a bit of a shocker. And I'm curious where these other monoliths were at in this. Like where was Google?
Starting point is 00:36:24 Where was Microsoft? I know Microsoft had already invested years before, but where were these other behemoth companies at when this came up? Because I think the big shocker, and I could be wrong, is that Google has really nothing. Well, Google actually did a lot of the research and scientific work that enabled a lot of this. Part of the genius of the OpenAI crew was going, oh my God, this at scale, this is magic. And Google kind of knew, this at scale would be cool. And they were then kind of classically more fearful
Starting point is 00:36:55 about how people were going to criticize them with it. And OpenAI was like, no, no, this is going to be super important for humanity. Well, they had no brand to protect. Right, exactly. Or market position or a bunch of other stuff. These are researchers who want to demonstrate that it wasn't zero brand, but they were willing to take the risk and do it. I guess it just seems like a shocker that the holder of all the info we're using in
Starting point is 00:37:15 some way wasn't at the forefront of this. Yeah, they have the most natural position for all of it. But it's part of the reason why, if you track the news, lots of people are yelling criticism at Google, including the DOJ, and the natural human response is, oh, let's try to not increase the amount that we're being criticized. Yeah, let's not stick our neck out any further. And it's why: imagine what could possibly go right, not just what could possibly go wrong.
Starting point is 00:37:38 Yeah, you keep saying super agency, with the full title Superagency: What Could Possibly Go Right With Our AI Future? Is there any device in the works to check the work of the AI? Tons. Everyone's working on that. What you're really trying to do is not get fallibility to zero, you're trying to get fallibility low. But that's relatively straightforward. Everyone's working on that. The fear is hallucination. What does that mean? I mean, I know what it means, but what does it mean in this context? It invents shit, right? Oh. And it invents shit and sometimes it's wrong. But by the way, hallucination, imagination, same thing.
Starting point is 00:38:06 Right. And imagination is sometimes a really good thing. So I want to keep all the imagination and use all the imagination. Now, when you're saying something that matters to me is true, what I want to do is I want to put in, for example, what Google does with the search algorithm. I want to make it so that it's like the first page or two,
Starting point is 00:38:20 generally speaking, has some shit that's on target. Seemingly the most vetted. And so that's what I want in AI. Yeah, like if it says sponsored, I throw it right out. So what you want to do is say, reflect a rigorous and improving truth-telling process to me. Don't manipulate me, at least without telling me that you're trying to get sponsored.
Starting point is 00:38:39 Don't lie to me. That's what we want. The most comical case I've heard in the news of this is the lawyer that wrote his brief and had AI do it, and it cited law cases and arguments that didn't exist, and by some miracle, the judge detected that. Which I wouldn't have expected, a judge actually went
Starting point is 00:38:57 and looked at all these references, but did. There's an example of, this person really had a lot of confidence in this thing and put it in front of a fucking judge. And the judge was like, there's no Peterson versus... Disbarred for that. I think he got humiliated in the news. That was a good public humiliation, because the answer is, when we're in processes, legal, medical, educational, where truth really matters, and by the way, news, we should have a truth-telling thing.
Starting point is 00:39:24 I'm just waiting for the day for this to be created: here is our news-checking AI bot. Right. God, yeah. That we have collaboratively made, both red and blue people have fought very hard for what things are in it. And now you have something that you're broadcasting, it's cross-checked with the news bot.
Starting point is 00:39:42 We need to call that AI Walter Cronkite. Yes. We gotta go back to our last trusted. I don't think that means that if you disagree with the AI news bot, you can't say what you want to say. Sure. But I love the, you said X, and AI news bot says, oh, by the way, you should check some facts. The moment I was most discouraged by this was in the wake of the Capitol siege. Oh my God, happy anniversary.
Starting point is 00:40:04 Here we are on January 6th. I turned on CNN first, because I'm a liberal, and the pundits were saying that the police officers were in cahoots with the Trump administration. I was like, what, the police officers wanted to storm the Capitol? Hey, that was not true at all. Yeah.
Starting point is 00:40:19 Then I went over to Fox and it was like this triumphant kind of, and I was like, well, neither of these perspectives are even approaching reality. Could we merge them and come to truth? I don't know, but that's when I got kind of scared of the whole system. Wherever technology helps create a problem,
Starting point is 00:40:34 In the book, you cite some polls, and the worst of the polls was at 15%, and the very best was at 35%. Only 15% of Americans in the worst case think that the benefits of AI are bigger than the risks of AI. And in the best case scenario, 35% of Americans think that the benefits
Starting point is 00:40:54 will outweigh the risk. Why is it that stark? I think if you could go back in history and run that in any new technology, it would always look like that. So I think that's why, because it's like, look, I'm pretty comfortable with my life as it is, and we as biological creatures, most of us,
Starting point is 00:41:10 tend to imagine what could go wrong. Yes, of course. We're trying to avert risk and disaster. Here's a car? Oh, it's a two-ton death machine? I might die? Someone will only run into me? Other people on the road?
Starting point is 00:41:20 Well, the joy of a healthy fruit off of a tree versus the potential death of the poisonous one, they're not equal. It's a very natural reflex. Natural reflex does not mean correct. Yeah. Right. You referenced Orwell's 1984 quite a bit in one of the chapters. So what was Orwell's fear and what happened? And he wrote it in 1949, which I found to be interesting considering what computing power existed. Wasn't it 1948? Well, you got an issue with your book.
Starting point is 00:41:43 Okay, then I may be the one that got it wrong. I didn't know what year it was but it was still there. Because in 84 maybe he wrote it then. Also the copy I have says in huge letters uncorrected copy. So who knows. That's funny. Right. The inversion of that as part of the prediction of the future was kind of a clever thing and it was like these new technologies will be the instruments by which the autocratic governments will exert control over their population, turn all of the people in the population to dis-agents, disempowered, take their agency away, use the technology to control it, use the technology to fashion the society the way, and this is why we should be fearful of these technologies.
Starting point is 00:42:24 Yeah, like ignorance is strength. Right. And when you look at it and you say, well, what happened with the creation of all of these information technologies and so forth? And the answer is, it's actually a little bit like, I'm more a fan of Neil Postman's Amusing Ourselves to Death. It's more Aldous Huxley, but neither of those dystopias played out. We have an enormous amount of individual empowerment, and that individual empowerment allows us to learn things, explore, say things as part of the creation of bold new things, which also creates a lot of misinformation and other kinds of stuff
Starting point is 00:42:50 that we're still sorting our way through, and what is the way that we do that. How do we sort to truth? We have panels of expert people who are involved in the institutions, part of the reason why we like classic journalism in various ways, why we like scientific method,
Starting point is 00:43:08 why we like courtrooms and juries, is we have panels of people who kind of work through and try to help us get through that individual, like, I know what's true. But the actual, just, all the technologies was individual empowerment, not individual disempowerment. Well, yeah, if you were a leader during that Arab Spring, you certainly didn't think technology was at your assistance.
Starting point is 00:43:25 Yes. Any leader that we're fearing is taking away technology. That's the way to get people really under your spell. What is a network of trust? The way we formulate most of our beliefs, knowledge, et cetera, is through the network of information we live in. And so what we do is we try to formulate networks of trust. The most simple one is friends,
Starting point is 00:43:45 communities we live in and so forth. Part of how we get to more healthy societies is having networks of trust. And technologies can help us do that and help us form new networks of trust. Yeah, it seems like AI would be the only thing that could handle the sheer volume of that information that exists.
Starting point is 00:44:04 Like when we talk about an AI that could lay out some kind of blue and red position. Here's the blue answer, here's the red answer. There's a lot of blue people, a lot of red people. Here's what each of them would say. Here's what the arguments for or against are on each one. The real important thing is we're all exposed to that. If you're exposed to just the blue, that's a problem.
Starting point is 00:44:20 If you're exposed to just red, that's a problem. Part of what we're confronting, as we try to regain or keep the American soul, is we got to engage and we got to understand what this stuff is. Because it's not that either side is completely looney tunes. Now, some stuff I think is looney tunes; the shoulders of both are looney tunes. Yes. Let's go through some of the things that could go right, because I do think there's some really exciting things. So how on earth could childcare be impacted by AI? Why would you want an AI watching kids?
Starting point is 00:44:49 You can't watch your kid all the time. And by the way, sometimes you shouldn't. No, no, they need some anonymity. But say, for example, the AI is watching kids go, oh, wait, the kid's getting into the drawer that has the rat poison in it. Or the edibles or the fire starters. Err, err, let's do something.
Starting point is 00:45:02 Yeah. Ah, great. Or like an interaction: I'm angry at my friend. Well, do you think your friend might have had a point of view? Or what's the right way to engage with that anger? It's a great human-amplifying thing, both to protect for safety and to help you learn. Like, for example, say,
Starting point is 00:45:17 well, I'm only going to interact with AI, because I'm only going to have AI friends. A little bit depends on how you teach your AI friend. Like at Inflection with Pi, or personal intelligence, if you sat down with Pi and said, you're my friend, and it goes, no, no, I'm your companion. Let's talk about your friends. Have you seen any of your friends recently?
Starting point is 00:45:33 Is there one maybe you should call today? Yeah, I can envision a version where you have set the parameters of your own personal points of view on parenting, and what you want it to do is to remind you of the best version of yourself in that situation. Because there's many times where I'm overwhelmed and I don't hit my own parameter. You're entitled to your own parameters,
Starting point is 00:45:53 but is there a device that could nudge you into the best version of your own parameters? It's already there. Like William Gibson, one of my favorite science fiction authors, said: the future is already here, it's just not evenly distributed. So you can already talk to ChatGPT.
Starting point is 00:46:08 This is the kind of parent I'd like to be. And so remember that. Now I'm going to talk to you about, oh, I'm having this interaction. This little shit just came in. Just broke this thing. Oh, here we go Monica, good, good, good, good. Just because mistakes are such a huge part
Starting point is 00:46:22 of learning and growing as people. So if there's something that's popping up that's basically stopping you from making any mistakes in front of your children, there's something to be said about seeing a parent make a mistake, repair, learn from that. You know? But Moni, we would put that into our parameters.
Starting point is 00:46:38 Oh my God. You could do that? You could say, I value rupture and repair. Right. Then I might cause fights so that it can repair. By the way, it's impossible to eliminate all mistakes. I know. No, no, but it's impossible.
Starting point is 00:46:52 What you want to make is constructive mistakes. Say, look, I actually value having constructive mistakes. One of the things people say, oh, I want to avoid conflict with other people. Actually, me personally, I steer into the areas where I think there's constructive conflict. Right. Because we can learn from it. That's the thing, like any argument you or I can think of, you can pick that version of it. It's just, do you think everyone's self-aware enough?
Starting point is 00:47:14 Exactly. People who do like constructive conflict, a lot of them don't know that. They would say, I don't like conflict, but then they do. I think you just have to be very self-aware of what you really want, the way you really want to live your life and the way you really want to be.
Starting point is 00:47:31 That goes into the wonderful philosophical question from Homo Deus, Yuval Harari: are we servicing our experiential self or our narrative self? You also have to be clear, what percentage of yourself are you servicing? Yeah. All that stuff is a good use of your time and thought, right? I agree, yeah.
Starting point is 00:47:47 Like to just stumble through life and think about everything after the fact is what we mostly do. It almost forces you to have some intention because you're kind of declaring you have a goal or an objective. Tax filing and legal advice, this feels very democratizing
Starting point is 00:48:00 because I think most people who are struggling, they don't have access to either of those things. Yes, I think what we will get to is copilot assistance for everything. Oh, you have a problem with your landlord. Can't afford a lawyer. Don't know which lawyer to hire, which one's a good one, which one's an indifferent one, which one's a shyster. Well, now you can say, hey, I can consult. And even today, you can put your rental contract into ChatGPT and say, what should I be concerned about, and have a dialogue with ChatGPT about your rental contract.
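A minimal sketch of the rental-contract exercise Reid just described, for anyone who wants to try it in code rather than in a chat window. It assumes the OpenAI Python SDK with an OPENAI_API_KEY set in the environment; the file name, model name, and prompt wording are illustrative assumptions, not anything specified in the episode:

```python
# Sketch: have a chat model review a rental contract, as described above.
# Assumes: pip install openai, OPENAI_API_KEY in the environment.
# "rental_contract.txt" and "gpt-4o" are placeholder choices.
from pathlib import Path

from openai import OpenAI

client = OpenAI()

contract = Path("rental_contract.txt").read_text()

messages = [
    {"role": "system",
     "content": "You are a careful assistant for tenants. You are not a "
                "lawyer; flag clauses worth asking a professional about."},
    {"role": "user",
     "content": "What should I be concerned about in this rental "
                "contract?\n\n" + contract},
]

response = client.chat.completions.create(model="gpt-4o", messages=messages)
print(response.choices[0].message.content)
```

Appending further user messages to the same list is what turns this one-shot answer into the back-and-forth dialogue he describes.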
Starting point is 00:48:27 What's it going to do in medicine? We've touched on this a little bit. We were at that conference, and the thing that I found to be mind-blowing is the amount of experiments required to get to an FDA phase one trial, the amount of man-hours involved, the amount of expenditure,
Starting point is 00:48:44 where these companies now using AI are just running a bazillion different experiments to see what happens. How does all that materialize? There's no reason that you don't create a medical assistant today that's available to everyone who has a smartphone, 24/7. That's one. Drug discovery is gonna get massively amplified. Because, by the way, when you slow down the invention of a new drug that will actually work, there's millions of sufferings and deaths in between.
Starting point is 00:49:17 We're gonna see a renaissance of all that. Probably on both sides too, like the FDA can use this technology to take on the huge quantity of stuff it has to evaluate. That seems to be one of the huge logjams in this whole thing. They have a finite capacity to evaluate all this stuff. It is funny, I of course, like Monica's email thing, I have a knee-jerk reaction to kids writing term papers. Me too, essays.
Starting point is 00:49:43 Because I fucking killed myself, and it was one of the things I did well. Something you got praised for. Something I'm praised for. I mean, it probably got me through. I had other columns that were very low, and that was lifted by that ability. And then another part of me that's realistic goes, yeah, okay, so they're gonna write the term paper. It'll be good or bad. They'll have to evaluate that. They'll turn it in. They'll get a grade. The same skill set will be applied when they go get the job where they're required to write. The bill doesn't ever come due, because it's not like anyone's taking that tool away from them. Why should that bill come due?
Starting point is 00:50:13 We don't teach handwriting script anymore. Being able to hand write at length, that doesn't matter. Yeah, people were pissed when I no longer had to know how to spell. Now as a dyslexic, I was like, yeah, finally I'm not handicapped by this thing. I have spell check. Because I'm as smart as you, I just fucking can't do that thing. So I, of course, was a beneficiary
Starting point is 00:50:31 of one of these technologies. I no longer suffered from not being able to spell. Yeah, I guess that's true. It's so dependent on the actual thing. I guess most people would say everyone should be able to spell somewhat. But I guess if you're a spelling bee champion, you'd be pissed about spell check, perhaps.
Starting point is 00:50:46 I have no problem with, hey, we have these un-AI-assisted spelling bee competitions. Yeah. And great, great, that's fun. Well, that's the thing, even with spell check, it's not that now we can all go win the spelling bee. It's still a very specific group of people who can do that. And same with the essays.
Starting point is 00:51:03 It's like now everyone can turn in an essay, but it doesn't mean everyone can write the best book. A kind of broader question would more be like, will some of these tests of merit just actually go by the wayside? So even like a college admission essay, if at some point everyone you read is a perfect, great essay, the college admission process might go,
Starting point is 00:51:20 well, this is no longer really a relevant measure of anything. I think they just all change. Why should the college admissions process stay the same? True, true. It's not like it's that awesome a system that everyone should be fighting to keep whole. It's like, well, we'll use this one. We really like it.
Starting point is 00:51:35 It's like, what's the better one? Let's build the better one. We've got new technology and all. Yeah. What is the triumph of the private commons? It's kind of playing on the tragedy of the public commons, kind of classic Adam Smith and everything else, you know, physical goods, and people take from it
Starting point is 00:51:48 and don't give to it because it benefits them. The triumph of the private commons is, as we create these digital things, they're actually infinitely generative. Being a free rider doesn't take anything from anybody. Yes. For the first time ever. And we should take advantage of it.
Starting point is 00:52:02 Right, right. Okay, why is innovation safety? People usually think of innovation as only risk, because there is risk, but actually as you innovate, you create the new safety. So you go, oh, I create a car, it's risk. It's like, well, actually, in fact, I create bumpers, and I create brakes, and I create seat belts,
Starting point is 00:52:20 and I create airbags, and I create safety glass. And you're willingly ignoring how many people died in horse-related injuries. Yes, exactly. Which was probably infinitely higher than even these horrific new crashes we were witnessing. Yes, and if you even took that to an extra point, you say the important thing is getting there at some speed.
Starting point is 00:52:36 What's my safety at getting there at speed? And as we innovate, we can up the speed that we can get there very safely. Yeah, we just went to Mexico City over the break, and I said, you know, we all want teleporting, but we fucking have it. If you told someone from the 1700s that in the morning I could be in Los Angeles
Starting point is 00:52:54 and in the evening I could be eating dinner in Mexico City, that's insane. I know, but I still do want teleporting. I really want it. You still teleport, you use the holy mouth. I am. But the problem with teleporting, Star Trek girl, is which of the two universes is it kills you and makes a clone and it's you going?
Starting point is 00:53:12 Because we tell two different narrative stories. Our current understanding of physics would be: a clone would be on the other side, not you. Yeah. Can't we have AI work on this, where we don't have to die and get cloned? We could do two hours on teleporting. So I've said to her, even if the technology ever existed, there are infinite industries incentivized to kill it. Every industry would collapse with teleporting.
Starting point is 00:53:35 People said that about AI. That might be the one that breaks everything. If you can live on a ranch in Montana and eat dinner at Emily Burger in Manhattan every night, who the fuck is going to pay the rent in New York? You can go there when you want, and then you can get your ranch. But it's part of the changing of economics that technology brings. That's what cars did, is they made suburbs. Distributing the prices of real estate across more space, that's good for humanity.
Starting point is 00:53:57 It's good for humanity, not good for real estate owners. That's what I'm saying. The incentives to crush that would be the most astronomical ever. Okay, tell me about America's role and responsibility in an AI future. I think one of the things that we've done really, really well over the decades since World War II, it doesn't say that America's leading the world is perfect. There's lots of legitimate criticisms, lots of shit we got wrong. One of the things I love about America is we self-criticize and accept other criticism. But overall, among the things we've done is created a lot of technology leadership and created a lot of new technologies,
Starting point is 00:54:28 like creating new vaccines and creating the way the internet worked. A bunch of this stuff has been just net massively positive. Allowing innovators, technologists, individuals, she or they or he, to make new things in the future is a great part of America. And that's one of the things we do. And the responsibilities are: try to be
Starting point is 00:54:47 the better version of human. We in building these technologies and building these new companies and building these new products, like drive and drive responsibly. There's obvious reasons we're uniquely positioned for this. A, the technology's largely coming from 300 miles north of here.
Starting point is 00:55:03 That seems helpful. We have more capital than anyone to be steering it. Does the democratizing nature of the technology actually threaten America's? All technological transformations have a threatening aspect, and technology naturally, this is part of the 1984 worries, can generally be deployed by more autocratic regimes
Starting point is 00:55:24 to control things. China loves to use surveillance technology for all kinds of things, and we want to enable that individual freedom. And so that's not a reason not to do the new technology, but to figure out what values we want embedded in the technology that we as individuals use, that we as society use, that we have the government use.
Starting point is 00:55:41 For example, one of the things that's very popular on the left is, oh, we shouldn't have facial recognition technologies available to the government. It's like, no, I actually think we should have facial recognition technologies available. That's part of how we track down terrorists, how we might prevent that tragedy that happened in New Orleans. We just have good, healthy governance of it.
Starting point is 00:55:58 You say, I'm going to issue a facial awareness surveillance alert. Well, it should go through the judicial process. Right. And facial recognition makes it so easy to get through the airport. I love it. Yes, exactly. It's so fast.
Starting point is 00:56:10 I was gonna bring that up. I just got cleared for global entry on the way home and I'm like, hold up a second. I don't even have to go in my pocket to pull out my passport now. You really are sometimes a technology late adopter. Oh, always. Yeah, yeah, yeah, yeah.
Starting point is 00:56:26 I got that so many years ago. Well, he got to it kind of late in life. You probably lived closer to the global entry office you had to drive to. I've taken two kids there. It's four hours of my day to get to LAX and do the whole thing. I was smart enough to do it at the airport
Starting point is 00:56:40 when I landed this time. Yeah, you have to do it when you land. It's called planning. Uh-huh. But it might be a new concept to you. I can help you with it. Yeah, you're to do it when you land. It's called planning. Uh huh. But it might be a new concept to you, I can help you with it. Yeah, you're right, I'm a big contradiction. I'm going to ask you my Yuval Harari pushback. We had him on to discuss his new book.
Starting point is 00:56:53 What I love and appreciate about the way he thinks is, which is almost impossible to do in this day and age, he finds a thing to worry about that almost no one else is worrying about. He focuses on bureaucracies and the power of bureaucracies and how much bureaucracies have shaped our world more than the charismatic leaders in question. Jesus didn't write the Bible.
Starting point is 00:57:11 A crew of people edited many, many stories to create the Bible. That's really interesting. What really has lived on is the Bible, not Jesus. And ultimately that was bureaucrats that did it. So that's a fascinating thing to think about. The point that he made that I find myself repeating a lot when this topic comes up at dinner parties
Starting point is 00:57:29 is we are organic creatures. We work in cycles. The stock market's open, it's closed. There's a nuclear attack. The market's not open. That's convenient. Our world is built in these cycles and for a reason. His point is these aren't going to help us.
Starting point is 00:57:48 They're going to run all day long because that's what they do. And they will force us to join them as they run all day long. The stock market being an example: if that thing ran 24/7 and you were in that industry, you could tell yourself you're only working... Well, think about email.
Starting point is 00:58:02 Just that one advent has made work 24 hours, essentially. Yeah, so philosophy major, what's your philosophical answer to that really rudimentary point? Well, we live in great societies because we have institutions. This is one of the problems that I have with all the critics, you know,
Starting point is 00:58:19 government bad, universities bad, et cetera, et cetera. This morning I was thinking, a sector of this country hates the government. And what I think is like, when I drink water out of my faucet and it doesn't poison me, guys, that's also the government. Yes, exactly. When there's an earthquake and the fucking building doesn't fall down on you, that's the government.
Starting point is 00:58:36 The fire department. You're focused on the five things they did that were fucked up and conspiratorial, but you're missing the fact that the stoplight worked. Everybody should be a renovate institutionalist. A renovate institutionalist, what does that mean? So not destroy institution, not, oh, allow the institution to whatever.
Starting point is 00:58:55 How do we constantly be renovating the institution? The institution's good. The institution is, the cars are safe. The institution is, we have stoplights. The institution is, you're not allowed to rob someone in the street. Yeah, you can't put mercury in my well. Yes, exactly.
Starting point is 00:59:06 Public universities. And by the way, doesn't mean perfect. So what we should do is have discourse and expertise to how do we renovate the institutions and always be renovating. And we might have red versus blue on which renovation we should have. And we should have some discussion and argument
Starting point is 00:59:21 about that, about what they should be. That's the thing to do. So for example, I am anti-government. You're like, oh my God, you're anti-American society. You may not realize that's what you're saying, but that's what you're saying. Stay tuned for more Armchair Expert, if you dare. To be a patriot and anti-government is in itself antithetical. You might as well just say, I'm an anti-patriot.
Starting point is 00:59:53 You don't realize that's what you're saying. Yeah. That's what you're saying. I think if you had to sum up your overarching opinion, it's like: guys, give yourselves some fucking credit. You've done it. You have the capacity to do it. You're not going to lose your critical thinking on it.
Starting point is 01:00:09 Even if we don't do anything, the future will be magical. But let's be intelligent and try to make the transition more magical sooner and less painful as we're going through it. And let's imagine what are the things that we could possibly do with this. And by the way, on the America point, in these revolutions, so I call this the cognitive industrial revolution, the societies and the industries that embrace it first
Starting point is 01:00:30 have massive benefits. The power loom. Yes, let's do that, let's not fuck it up. That's again, the third bucket I find myself in. We have no option but to try to lead this. Not only no option, it'll be great. If you had to isolate a single promise that you're most excited about,
Starting point is 01:00:48 what do you think is going to be the very most impactful? Well, the first one, which I think is the reason I started doing this, and I think it's begun getting some broad adoption, is the medical assistant on every smartphone. You and I have doctors we can call 24 hours. The vast majority of human beings do not have that. Oh, no.
Starting point is 01:01:03 We were walking around Mexico City, and the aforementioned friend who uses his ChatGPT all the time, Eric, said, I'm really sincerely curious, how does the medical system work here? We're seeing all these different people and we're in Mexico City. And you imagine that every one of those people
Starting point is 01:01:17 had a doctor as good as my doctor, is pretty mind blowing. Yes. And so fucking positive. You, your kids, your family, your friends, now please, now. Yeah. When you talk about safety,
Starting point is 01:01:31 that's safety in your pocket. Yeah, innovation of safety. Now this has nothing to do with your book. I have one nosy curiosity, and then I have you, which I don't regularly get you, and I want your opinion on something completely unrelated. My nosy question is, have you made more money from investing or from exiting the companies you founded?
Starting point is 01:01:49 Never asked myself that question. You have to know the answer to that. You don't have to say it, it's your business, but I am curious. You came up, Reid. What do you do? It was fun. We were listening to Acquired, and we're listening to the story of Meta.
Starting point is 01:02:01 Probably from LinkedIn, but by the way, it's close. I got so excited when your name came up. I was like, oh, I didn't know that you were deeply involved. In PayPal, in LinkedIn. The way that I keep score is the impact I have on people's lives. Well, we're different. I would be looking at that bottom line.
Starting point is 01:02:17 I look at it occasionally. It's not that I think it's wrong to do it, and obviously you manage it. I believe you though, that you're just more interested in getting in on it and getting it going, and then this result ends up happening. You're not doing podcasting for money. Exactly. That's not why you started this.
Starting point is 01:02:30 Right. OK, that was my nosy question. The aforementioned Eric Richardson, shout out, third time. He and I argue all the time about crypto. What do you think about crypto? 2014 is when I bought Bitcoin. How much was a Bitcoin? $200, I think.
Starting point is 01:02:43 It's currently at $100,000 or something? Just under $100,000. Wow. So again, it's what the future could be. There's a bunch of stuff that's broken about crypto that needs to be fixed. Use for criminal organizations, use for terrorism. I think all the solutions for it nullifies all the benefits.
Starting point is 01:03:00 Well, today I think a reasonable person can argue that. Take a snapshot of where the puck is today versus where the puck could be. You'd go, okay, but actually I believe in where the puck could be. So let me make a couple of the arguments. Yeah, please. Not just the obvious ones of like, oh, look, you create a financial system that is programmable. You have a whole bunch of entrepreneurial innovation on it for payments and for assets. Here's one thing that Bitcoin does an excellent job of today.
Starting point is 01:03:23 There's 200 governments in the world. You can argue zero of the governments are well run, but maybe 50 of those governments are well run. What happens to all those Venezuelan citizens and other people where the governments are totally fucking shittily run? They work their life, they amass their savings, and then these assholes completely devalue everything they've ever done. Devalue, steal, et cetera, et cetera. This creates the world safety net. You live in a bad fucking government? Now there's something you can do. How does it differ from just buying US dollars?
Starting point is 01:03:51 Many of these places you can't buy US dollars. US dollars not available. That's a good answer. I don't know that. And by the way, some of those places, you might not trust US dollars either. I think you should. But you know, by the way, there's variability. Buy the pound. I don't know, buy something that's stuck around for a long time.
Starting point is 01:04:06 Well, the US dollar is the post-World War II reserve currency. As long as they trade oil in it, it's safe. Yes. But it has that as a feature. When you get to a certain level of wealth, upper middle class, even above, you go, hey, distributed portfolio holding is good. And so distributed portfolio holding could, of course, be some stocks and some bonds or a 401K. And distributing that more is actually, in fact, a good thing. And crypto becomes one more mechanism for doing that.
Starting point is 01:04:31 Those are things where it's good today. Now, like I said, a rational argument says all this fucking cybercrime, all these rogue states, that's terrible, and we should do something about that. What's the way we should mold cryptocurrency with regulations and other kinds of things to say: all of these good things we should enable, and future good things we should enable, and these bad things we should disable.
Starting point is 01:04:52 So I'm positive on cryptocurrency, but just like on AI, I'm not positive on anything and everything that could be made with crypto, any use that could be made with crypto, as if it's all just good and you should just love it. I just watched a doc about the greatest heist of all time, right? These two stole $72 million of Bitcoin in 2017.
Starting point is 01:05:14 At the time it was $72 million. Its current value is $4.5 billion, right? So what was funny, in learning about how they caught the people, was this thing I never thought of. The appeal for so many people of crypto was, yeah, the government has nothing to do with it. It's anonymous.
Starting point is 01:05:30 You can't trace it. Well, the ledger of it all is accessible and known at all times. Every single bit. Public to everybody. Yeah. And so you can hold it anonymously, which they did, but to convert it into any good, service, or property, or anything, you have to at some point convert it into US dollars.
Starting point is 01:05:50 Now once you do that, you're fucked. You have to present some identification. And so again, one of the big selling points, when you look at it in practice, well, that's not true. It's anonymous so long as you never get a good or service with it. So that whole appeal to me is useless. The fact that you could get it stolen, there's no one to call, that sucks. So now we're getting involved, you know, the SEC.
Starting point is 01:06:11 So really, all that's happening, in my opinion, is you're just more and more funneling it into our currency. So everything it was supposed to be, it doesn't function unless we get it to be an actual currency that our government is gonna somehow investigate when it's stolen, and so regulate. Yes. Am I wrong about that? Well, I'm generally positive. Don't come after me, crypto bros. They're as bad as Taylor Swift fans.
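Dax's traceability argument can be made concrete with a toy model. This is not a real blockchain API; the ledger, addresses, and amounts below are invented purely to illustrate why a public ledger plus mandatory identification at the cash-out point defeats anonymity:

```python
# Toy public ledger: every transfer is visible to everyone, addresses are
# pseudonyms, and identity only attaches where dollars come out (KYC).
# All addresses and amounts here are invented for illustration.
from collections import deque

ledger = [
    ("thief_wallet", "hop_1", 50.0),
    ("hop_1", "hop_2", 30.0),
    ("hop_1", "cold_storage", 20.0),
    ("hop_2", "exchange_deposit", 30.0),  # converting to dollars happens here
]

kyc_addresses = {"exchange_deposit"}  # cashing out requires identification


def trace(start):
    """Breadth-first walk of the public ledger from a known address."""
    seen = {start}
    hits = []
    queue = deque([start])
    while queue:
        addr = queue.popleft()
        for src, dst, _amt in ledger:
            if src == addr and dst not in seen:
                seen.add(dst)
                if dst in kyc_addresses:
                    hits.append(dst)  # the pseudonymity ends here
                else:
                    queue.append(dst)
    return hits


print(trace("thief_wallet"))  # -> ['exchange_deposit']
```

Hold the coins forever and the trail stays a pseudonym; the first conversion into dollars, goods, or services is the point where, as Dax puts it, you're fucked.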
Starting point is 01:06:36 Yes. I'm so scared of them. Please don't come after me. One aphorism that I sometimes use is: you want to regulate where bad regulation is better than no regulation. And they say, well, how could that possibly be true? Well, actually, financial systems are one of them, because a functioning financial system, and it not breaking, is more important than getting the future innovation. Right. A collapse is a massive disaster, so financial regulation: good. Yes. Now that doesn't mean all financial regulation is good, doesn't mean don't be smart versus stupid,
Starting point is 01:07:06 try to be smart versus stupid, et cetera. So do I think there should be regulation around cryptocurrency? Absolutely yes. Is it a currency exactly? Unclear. There's lots of tradeable assets. Are diamonds a currency? Is gold a currency?
Starting point is 01:07:22 Is your car collection a currency? God, I hope so, Reid, at this point. Yes, right. But you know, there's lots of different assets, value and so forth. And people say, well, it's a currency because it has no intrinsic value other than its value as a currency.
Starting point is 01:07:36 And we have various things which have that kind of no-intrinsic-value system. Well, that's where I'll call myself a hypocrite. I look at crypto and I go, of all the stories that have been told, this one is the furthest out there. Yeah, well, it's the newest. It's the newest, but it is not linked to a government, an institution, to any kind of GDP. There's no way to assess its value other than that you all agreed on it. Well, I'll admit the US dollar to me is a story I believe in. Yeah, it's a bunch of
Starting point is 01:08:02 paper. So I can admit I'm hypocritical. I go, I believe in one version of the story. This story feels a little out there for me. I can't latch on to this story. Things have value based on how many people and which circumstances believe in a story. Yes, exactly. So, you know, storyteller.
Starting point is 01:08:19 Okay, so you're pro crypto. I'm pro crypto, but again, just like super agency, let's be smart about it, not whatever you can do. I'm irked with the people who are trying to co-opt a phrase that I think I started, the techno-optimism, which, like, look, it's not whatever you can do, it's be smart about it. Right, right, right, right.
Starting point is 01:08:37 Oh, well, Reid, this was a blast. Wait, do you want to get out your phone? Yeah, yeah. Oh, yeah, yeah, yeah, it's your phone. Do you want to do it on the show? Yes, of course. All right, let's do it. Why not?
Starting point is 01:08:45 Let me go get it. Is that an Android? It is an Android. Wow, you are mixed messages. I'm poly message. All message. All message. Well, if it works on an Android, it'll really work.
Starting point is 01:08:54 Yes, exactly. I'm teasing. So, what do you want to discuss today? What I would love to discuss today is, tell me how AI is going to reinvent the car industry and cars. For now we have software, we now have autonomous vehicles, but don't just do autonomous vehicles.
Starting point is 01:09:12 Like, how does it reinvent the actual physical car itself? AI is set to revolutionize the car industry in some fascinating ways. Beyond autonomous driving, AI can help design more efficient and safer car structures by simulating countless scenarios to find the best materials and shapes. We'll also see AI enhancing manufacturing efficiency, predictive maintenance, and even personalizing in-car experiences, making vehicles more adaptable to individual preferences.
Starting point is 01:09:40 The integration of AI could fundamentally transform how cars are built, maintained and experienced. And if I collected old cars and rebuilt them, what are some of the ways that I could use AI to help me? AI can be a game changer for rebuilding old cars. You could use AI to optimize your restoration plans by suggesting the best parts or techniques based on historical data. AI-powered design software can help visualize modifications or improvements. Additionally, AI can assist in sourcing rare parts by scanning databases or even suggesting
Starting point is 01:10:13 3D printing options. It can also enhance diagnostics, making it easier to identify and fix issues in old vehicles. You wanna ask a question? Does it only know your voice? No, no, everyone's. So, okay, let me think. Sure, take your time.
Starting point is 01:10:28 Thank you. Are you aware of the podcast Armchair Expert? Yes, I'm familiar with Armchair Expert. It's a popular podcast hosted by Dax Shepard. Do you enjoy listening to it? I do, quite a bit, embarrassingly so. Let me ask you, if I wanted to recommend an episode to somebody, what are the best five options?
Starting point is 01:10:49 They only have three good episodes. What, did I break it? Please continue. Armchair Expert has a few that often come up as favorites: the ones with experts like Brené Brown, who talks about vulnerability, and the one with Bill Gates, which dives into some fascinating topics. The episode with Esther Perel about relationships
Starting point is 01:11:11 is also highly recommended. And for something lighter, the episodes with Kristen Bell are a lot of fun. Sure. So if she was on my phone, do you think she'd recommend different episodes? Possibly, because ChatGPT remembers your chat history. So all my chat history is in this.
Starting point is 01:11:26 Like she knows you, and Bill is a great recommendation. We saw our friend, fifth time now, Eric. He said, based on my searches, please draw a picture of me, or create a picture of me. And it was this huge muscle-bound guy at a desk trading stocks, because he's been searching peptides on the internet non-stop, and he's a stock trader. Yeah, it's pretty funny.
Starting point is 01:11:56 What it will tend to do is go, what could possibly be something interesting? Let me throw out a few. Right. Right. So if you notice, we're like, well, some human relationship stuff, some technology stuff, oh, and then some interpersonal stuff.
Starting point is 01:12:11 Well, I got one to try. If my favorite movies are Once Upon a Time in Hollywood, Thief, Raising Arizona, Flirting with Disaster, what's another movie I might love? Based on your favorites, you might love Fargo. It's a quirky blend of dark humor and crime, much like Raising Arizona and flirting with disaster. Plus it has that rich storytelling and style you enjoy.
Starting point is 01:12:37 Give it a try. If Fargo were made today, what would be a modern Fargo? A modern Fargo might involve a quirky crime in a contemporary small town, perhaps involving internet scams or cryptocurrency gone wrong. It could feature a cast of eccentric characters navigating both the digital world
Starting point is 01:12:56 and their tight-knit community with a similar mix of dark humor and suspense. Maybe something like a tech startup's shady dealings leading to unexpected and chaotic consequences in a snowy, isolated setting. Oh, this leads me to a question I forgot to ask. How does ownership of content it creates work? So I can think of it more in images.
Starting point is 01:13:16 Like if I create an image through it, could I copyright that image once it's been created? Can I own it? It's new ground. Current legal process says, nope, there's no ownership of it, because ownership is something only created by humans. But of course that has a massive thing of,
Starting point is 01:13:32 you get an image and then I go, okay, now I touch this one pixel. Aha! Yay! I'm gonna give you a very specific example. I've written all these scripts, they went unproduced. I think it's called Google Veo 2, maybe. A musician had a song, wanted a video, and it created this crazy cool sequence of a car chase through a post-apocalyptic thing.
Starting point is 01:13:49 wanted a video, and it was this crazy cool sequence of a car chase through a post-apocalyptic thing. I saw that too. It was very cool. I mean, it's not there there for me, but it was very, very cool. But it's on path. It's on fucking path.
Starting point is 01:14:01 But I was thinking, could I load in one of my unproduced screenplays and make it? And let's say I accomplished that goal. Now I have this movie that came from my screenplay. What could I then do with that? And I guess my thought was, I guess I would put it on YouTube and monetize how many clicks it got. Could I sell it to a studio if it was completed and worth distribution? Like, what would happen if I did end up creating a movie on one of these things with a script I had written? What could I do with that movie?
Starting point is 01:14:30 You could do anything that you could do with a video today. And you probably know this tons better than I do. Ranges from a YouTube channel to an event with a private screening. That's kind of a fun idea. Does Google own any of that or no? I don't know what the Google ownership terms of service on the license are. You're using the tool. They don't try to do that. And what they try to do is have enough ownership
Starting point is 01:15:03 that they can go, well, look, all the rights that we needed to do to do the service that we do. To publish it. To publish an artist. We have that, but we're not stopping anything else you're doing. Interesting. You picked a perfect time,
Starting point is 01:15:13 because I finally saw something that really lit a bulb in my head. Where it's like, God, if I could see these things I've written come to fruition that I needed a studio to give me the money for, that's a very exciting future for me. One thing, do it today, tomorrow, whatever. Pick one of your scripts, upload it,
Starting point is 01:15:30 and say, critique the script from the following perspective. What are the three areas that I would most improve in the script and why? Right? Just start playing with it. Part of what the AI does in terms of creativity is the speed of the interaction, because it allows you to think more like that
Starting point is 01:15:46 versus the, I'm staring at it, I'm staring at it. It would really work with my kind of ADHD-leaning thought speed. Exactly. Yeah. And being creative about it, like, try it. Yeah, let me see it. Cool, oh, I don't like that. Okay, Reid, this has been so much fun.
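The upload-a-script exercise Reid suggests just above is only a few lines in code. Again a hedged sketch using the OpenAI Python SDK; the file name, model name, and prompts are placeholders, not anything prescribed in the episode:

```python
# Sketch of the "upload a script, ask for a critique, keep going" loop.
# Assumes: pip install openai, OPENAI_API_KEY in the environment.
from pathlib import Path

from openai import OpenAI

client = OpenAI()
script = Path("my_unproduced_screenplay.txt").read_text()  # placeholder file

messages = [
    {"role": "user",
     "content": "Critique this script from a producer's perspective. "
                "What are the three areas I would most improve, and why?\n\n"
                + script},
]

while True:
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    print(answer)

    follow_up = input("\nFollow-up (blank line to stop): ")
    if not follow_up:
        break
    # Keep both sides in the list so the model retains the whole exchange.
    messages.append({"role": "assistant", "content": answer})
    messages.append({"role": "user", "content": follow_up})
```

Keeping every turn in the messages list is what makes the loop fast enough to "just start playing with it," which is the creativity point Reid is making about the speed of the interaction.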
Starting point is 01:16:00 So fun. Yes. Pleasure. Reid, I adore you. It's always so fun to bump into you anywhere on the planet. And I hope you'll come back with your next great thought. In delight and with AI on. Yeah. Please everyone check out Super Agency,
Starting point is 01:16:14 what could possibly go right with our AI future. All right, thanks Reid. Awesome. He is an armchair expert, but he makes mistakes all the time. Thank God Monica's here. She's gotta let him have the facts. Two pees in an hour, Monica, I'm actually concerned. Oh.
Starting point is 01:16:31 It's a little scary. That's a lot of pees. I only pee once a day. Yeah. I know, I mean. And two in an hour, you're like... Some would think that's scary, but. Initially that part's scary,
Starting point is 01:16:42 but then you get used to that. That's right. And then it's scarier if she goes twice in an hour. Yeah. Well, I did think for a minute that I was getting a UTI, but I'm not. Okay, okay. It's my mom's fault because she... She gave you dirty underwear?
Starting point is 01:17:01 Yeah. She does my laundry. No, she takes cranberry pills. She's been taking them for a long, long, long time. And she swears by the cranberry pills. And so I started taking them for a while. And then I stopped.
Starting point is 01:17:16 And so when I was home for Christmas, she was taking hers. And she said, are you still taking your cranberry pills? And I said, no, I'm not taking them anymore. I don't fuck with those anymore. Yeah. And she's like, why? You should take them, whatever. I was like, I'm not taking them. And then on New Year's Day I called and we were talking about our resolutions, and she said, what are your resolutions? Cranberry pills? I was so annoyed. I was like, no, I'm not doing it. Like three or four days later,
Starting point is 01:17:47 I was like, I think I'm getting it. Great. Yeah. And I had to go pick up some good old cranberry pills. She'd be so happy, right? How could she not? Did you tell her, I'm back taking the cranberry pills? I didn't tell her that.
Starting point is 01:18:00 You didn't admit. No, I didn't. It was under duress? I did not. Well, I'll just say in a forthcoming episode soon, next week, we have a guest on who's talking about throwing a birthday party and I went on and on about how I don't want a birthday party.
Starting point is 01:18:12 Half the episode is about how I don't want a birthday party, and we walked out of the garage from that interview, and I was like, that's weird, Kristen's got a man over in the swimming pool. A big man. Yeah, I did not notice it was Aaron at first. I was just like, oh, she's got a man over. I saw a little bit of, there's a woman in the mix
Starting point is 01:18:35 and I'm like, oh, these are her singing friends. Friends over, yeah. So then I come around the hedges and I see my son's here, Sharon Beasley. Yeah. And I go. son's here, Sharon Beasley. Yeah. And I go. He's in the hot tub. My very first thought, not to get too dark for the cherries,
Starting point is 01:18:50 but this is our past. I'm like, the only time Aaron has ever shown up in California without letting me know he was coming was the day I opened my apartment door in Santa Monica, and there was Danny Vann. It looked like he had been dragged behind a car for a week, and he had been smoking crack for five days. Oh, you can't. In a tiny moment of I think I'm gonna die
Starting point is 01:19:12 or go get on an airplane and fly to my dad, he got on an airplane and flew to me. So you thought something bad was happening? I was already sober for a year maybe at that point. Yeah. I just got a bit of a PTSD, like why would Aaron just be here without telling me he's coming?
Starting point is 01:19:28 But did it make you feel better that he was in the hot tub and laughing? I said, surprise. And he goes, are you coming off of crack again? And I said, I'm here and I'm healthy. Oh good. I didn't really think that. And Ruthie was there too.
Starting point is 01:19:47 But you never know, she might have been like, we gotta get Aaron back to town. An intervention. Oh, God. It wasn't really that deep, as I'm saying. I just had this really weird moment, where this has only happened one other time. So, Kristen at that point had to acknowledge,
Starting point is 01:20:02 you're having a surprise party. I've planned a party, and I brought Aaron and Ruthie here for the party. And then I was really excited. Good. 'Cause I didn't have to ask anyone to come. I didn't have to get scared about that. There wasn't a bunch of time for me to worry
Starting point is 01:20:17 about who's not going to show up. Cause it was just going to be the next day, which was last night. It was perfect timing. She planned, she knew she was going to tell you, which I thought was a good plan, that there wasn't going to be a real jump out moment. And also Rob and I did great acting
Starting point is 01:20:36 because during that interview we all knew. You already knew. Of course we knew. Did you feel so bad for me? Like what a dumb dumb, he's digging himself deeper and deeper. No, I was just like, oh this is gonna be a funny moment. Yeah.
Starting point is 01:20:48 But then I also was like, if he is then mad that this is happening or anxious, we had an interview scheduled the day of your party, which was yesterday. Yeah. And it was scheduled for two o'clock, the party started at five o'clock. Yeah. And I was like, oh no, okay,
Starting point is 01:21:06 I think we should probably move up the recording. When is she gonna tell him? Because he needs to have enough time to do his research. He's gonna be mad if he doesn't have enough time to do his research and if he already doesn't want the party. Oh man, I'm gonna react to everything. I'm learning terrible stuff about myself right now. He doesn't want the party to happen.
Starting point is 01:21:23 He's gonna be even more mad about it. He's going to hit people. He's going to hit the children. But you were very happy. Hey, my son's there, so I am happy right away. Aaron's here. This is unexpected. I had just seen him a couple days before in Detroit.
Starting point is 01:21:35 I mean, which was a surprise. Or it was last minute. Last minute and so exciting. Yeah. I was like, oh, I get to see dad before my trip. Twice. And get to pretend that it's our last night together. You get to do acting too.
Starting point is 01:21:49 Yeah. But Aaron's only been here for 10 minutes, but boy, we've already had a very eventful trip. So yesterday, before the interview and the fact check that was moved up, I said, do you want to go for a ride in the new station wagon? To which Aaron was like, absolutely. So we went for a little rip around town, it was so fun.
Starting point is 01:22:07 And then I was like, let's quickly, I think we can, before the fact check, we'll get some eggs. So we stop at Cafe 101, we parallel park the wagon on the street. We go inside, we sit down, we're there for, I don't know, a minute and a half. And a stranger walks in and he looks at me and goes, white car?
Starting point is 01:22:25 And I go, yeah, and he goes, it's on fire. And I just. PTSD. Oh my God. Too many fires in this city. I sprint out of the restaurant. I'm fully expecting it to be engulfed in flames. So we sprint outside, get to the wagon,
Starting point is 01:22:41 sure enough, there is smoke coming out of the hood. Pull the hood pins, opened it up, a wiring harness is on fire, there's flames. I don't know how this worked. Did you see this happen? The spit? Yeah. I just went, I spit as hard as I could at the flames.
Starting point is 01:23:00 You spit at the fire and it went out. It went out. I might have a super power I didn't know about. That was so stupid. Just grabbing a fire extinguisher, I couldn't figure out how to pull the plastic pin off. And by the time I got the pin, I was watching him and he went, boom!
Starting point is 01:23:15 And the fire was gone! It was just smoldering. I don't know how that worked. I do worry about that with fires, about using the fire extinguisher, because it's not something you can practice. Well, you should, I guess. I guess. I realize that.
Starting point is 01:23:30 Yeah. Oh, I should know how to do this. Yeah, I know. I have one by my bed, and I look at it a lot, and I think, I don't really know how. You should get the safety off and everything ready to go. It seems really simple, and then also when you think about it, it seems really hard.
Starting point is 01:23:44 There's a video of Taylor Swift trying to use one in her kitchen. Oh, there is. And she's also like, what the fuck? She didn't do a good job. She did end up doing a good job, but she always does. Good job. Was something on fire in front of her? Yeah, in her kitchen.
Starting point is 01:23:56 Oh, and someone's running video. And then I'm piecing together, because there's now a woman behind the car, who I guess was either parking or got in her car. And then what became obvious is she must've got the hotel guy. But I do wonder, had we not just had the fires, if she would've been so incredibly on it. She might've seen smoke coming out of her car
Starting point is 01:24:17 and be like, I don't know, it's not my problem. Exactly. But yeah, everyone's on such high alert that I think she immediately went. I think that's right. She's like, this whole area is about to go. Yes. I was then thinking like, oh my god, what if I had set another round of fires in Hollywood?
Starting point is 01:24:34 You finally got Los Feliz. People would have been like, that entitled bastard. Why's he gotta have these cars that catch on fire? These old cars that keep catching on fire. Yeah, he can afford a new one, but he bought an old one. He should be driving electric. That wouldn't catch on fire. You went near the end of it.
Starting point is 01:24:55 Yeah, but you can drown in them, I should say. Well, when they catch on fire, you can't go breathe anywhere near them, as we're now finding out. That's the crazy part. But also, you know that horrible story, and I don't know all the details, so I don't, and it's very sad, but Mitch McConnell's sister-in-law drove into a pond. Yes. In a Tesla, and, like, couldn't get out and died.
Starting point is 01:25:20 And I think it might have to do with that Tesla. It probably shuts down in water. The door opens electrically, not mechanically. Or I think, I don't want to be sued, I think there's a release on there as well. If you had read the manual and practiced, like a fire extinguisher. It's like the fire extinguisher, no one's doing that. It's in the door though, it's right there.
Starting point is 01:25:41 It's what? It's right there in the door though. Oh, okay. Well, apparently people like came and tried and they were there and they couldn't. Yeah, I'm vaguely aware of that. Yeah, I probably talked about it before. I think you did. It was when you were wanting everyone to have a window smasher.
Starting point is 01:25:56 Yeah, now I have one of those smashers. Oh, nice. Aaron, I have an update for you. Oh, please.
Starting point is 01:26:11 a sign has been put on the yard, not by me, that says no. It says no dogs allowed slash no trespassing. I forget exactly what it says. It says something explicit, like no more of this. So it was solved for me. I thought you were gonna bring up his chest hair. That's where I thought this was going. No.
Starting point is 01:26:34 The last time I saw you, you talked about having shaved your chest hair, how sexy you felt. Okay, so what's the update on that? Well, I just had my back shaved again. Are you making cashmere? I should, because it's an enormous amount of fur that comes off. Really quick, Aaron, I've never let you feel this.
Starting point is 01:26:55 No, but I've seen it. You'll be a very select group. Go ahead and touch that. Be really careful. Give an honest opinion, what do you think about that. Oh my God. Is that not the softest thing you've ever felt? Can I sleep with it tonight?
Starting point is 01:27:10 Oh. Oh. There's a blanket made of that? No. So you can make some, I know, you gotta be careful. That is, it is cashmere, isn't it? It sure is, there's no denying it. And you know, now Delta's leg hairs are in there too.
Starting point is 01:27:28 Oh, nice. So we're getting it, we're harvesting from my back and her legs. No, family. Maybe it can be a joint effort, Aaron's hair too, because then we can really get some movement on it. At the risk of hurting Aaron's feelings, Aaron has had dense body hair his whole life.
Starting point is 01:27:43 He's a man, he's always been a man. I'm not, I've always not been a man. I've always felt like a preteen or a woman next to him. And he's always had a gorgeous Burt Reynolds chest hair since we were 17. Okay, so you think it's coarser. I think it's coarser. Would you?
Starting point is 01:27:59 Yeah, yeah, yeah. Yeah, all I'm saying is the amount. Yeah. Okay, well you don't have to have cashmere. We can do mohair for you. So Ruthie just shaved me again with the dog clippers. You know, it's got a vacuum. I may have said this before, but it has a hose and you watch it accumulate in a chamber.
Starting point is 01:28:23 And this is so fun, because that's when I realized, I'm like, oh my God, I have so much hair, like this much. My table went black. And Ruthie's super grossed out by it. Well, she can't touch the cashmere then. I know, we're both getting hair now everywhere. It's maddening.
Starting point is 01:28:45 But have you noticed, we already talked about our armpit hair. Have you noticed your armpit hair will be, like, two feet long sometimes? If we haven't talked about it, yeah, I've been chopping mine. With scissors? Yeah. Same.
Starting point is 01:28:59 I now give myself like a haircut. Yes. In my armpit. Me too. And then I have this very weird, I wonder, do you have this where it's like, I have it in my armpit. Me too. And then I have this very weird, I wonder, do you have this where it's like, I have it in my mind it's supposed to be some length, or else why would it be that long?
Starting point is 01:29:12 Because I'm tempted to just like cut it as close to the skin as possible, so I have to do it less, but then I think, aren't you supposed, it's gonna be some length? I wonder that too. Can you just keep, like grow it out just so we can see how long it gets?
Starting point is 01:29:25 You're right. There should be video evidence of some of this. You know why I had to finally confront it other than like maybe my kids pointed it out and I was embarrassed. But also you can't even get your deodorant to the skin anymore because it just clumps up in your hair. And then you have dingleberries in your armpit and you're like, what the fuck is going on? Why am I a mess like this? Red light. Well, 50, you know, not Aaron. Yeah, not yet.
Starting point is 01:29:51 Hmm, not even close. Six months. So it's like we're at a party. Okay, great. How about you guys talk about my party? Oh, it was a yacht rock theme. It was a yacht rock theme. It was at my favorite roller skate rink,
Starting point is 01:30:03 Moonlight Rollerway. To name drop, Kimmel made the playlists, as he's made me many playlists in the past that I love, and Kristen enlisted him to do that. Tons of blasts from the past. A lot of my Groundlings friends, just an incredible, perfect group of people. My dad came all the way out from across town,
Starting point is 01:30:23 Tom Hanson. Adam Perry Lang made barbecue. Adam Perry Lang cooked. My dad came all the way out from across town. Tom Hansen. Adam Perry Lang made barbecue. Adam Perry Lang cooked. It was incredible. Yeah, did you get on both of the meats? I just had the pork. The pork ribs.
Starting point is 01:30:34 Yeah, so insane. How good were those? They were so good, and the cornbread was so good. Oh, it was damp as a motherfuck. It was like a, what do you call it? Cake, a spoon cake. Pound cake. Pound cake.
Starting point is 01:30:48 Yeah, it was delicious. I was really excited about the Adam Perry Lang of it all, because most of my friends that were there didn't know about him. And I don't know what their expectations were at a roller skating rink, how good the barbecue was gonna be, but he is one of the best barbecue guys.
Starting point is 01:31:04 He does these pop-ups around LA and there's fucking lines around the block. Yeah, and he's just there cutting ribs. At a roller skating rink. Yeah. It was really sweet of him to do that. He's a very nice man. He is. I've also enlisted him to teach Kristen how to make crepes for Christmas one year, her
Starting point is 01:31:22 birthday I forget, and got her all the machinery and stuff and now she knows how to make crepes for Christmas one year, her birthday I forget, and got her all the machinery and stuff, and now she knows how to make crepes. She stuck with it. She goes through little phases. There's no crepes for months and months and months, and then there's crepes 10 days in a row, she cooks them compulsively. I hope there's crepes tomorrow.
Starting point is 01:31:36 Well, exactly. Oh, you should bring that up, she would love to. I had a craving for crepes, like, three days ago, and it was 9 p.m., so I didn't think it was possible for me to get it, but I wanted it so bad. And then the next day I told her I was craving them. She said, I'll make them for you.
Starting point is 01:31:53 And then guess what the word was on Wordle today. Crepes. Plural or singular? Singular. Singular. Isn't that weird? That's suspicious. So normally, Delta's birthday party is here every December, and we always get the rink for two hours
Starting point is 01:32:12 And it's always my commitment to skate as long as possible, because it's such a unique opportunity. But the window of this party was 5 p.m. to 10 p.m. And I was like... oh, I gotta back up. One other thing that happened, because I didn't know I was having a party, is that the day before, I rode the furthest I've ever ridden on my new road bike. I told you, I passed your house a couple different times. An armcherry sent it to me, I was even paparazzi'd, I didn't know it.
Starting point is 01:32:38 Yeah, I went everywhere. I was down in the LA River, down past Spoke, then out in Glendale, then back into Silver Lake. So that whole day, when I came out and saw him in the pool, I was like, I went too hard, I can barely walk. And then she said, five to 10, we're skating. I was like, five hours of skating.
Starting point is 01:33:00 And we're gonna skate, we're gonna skate hard. And oh, one thing I gotta add is people really went for it with the yacht rock theme. Aaron had an incredible outfit. He really did. Turnkey bought off of Amazon. It was perfect. Yeah, it was white.
Starting point is 01:33:14 Captain's outfit, it was white. All white, all white. Hat, pipe. Pipe. The pipe really pushed it over. The pipe was bright. I don't even know why it makes sense. It doesn't, it doesn't.
Starting point is 01:33:26 I don't know, I can't, like, I'm racking my brain for where I've seen this, some famous image of a captain with a pipe. It's like from Love Boat or whatever. But again, I don't even think on Love Boat that was happening. What's the other one where they're all shipwrecked? Gilligan's Island?
Starting point is 01:33:40 Yeah, Gilligan's Island. The Skipper didn't smoke a pipe. Did, like, the Gorton's fisherman have a pipe or something? Oh, yeah, but again, that's a yellow rain slicker. This is like the Mandela effect. We all just think we know these captains have pipes, but none of them do. I don't think any of them do.
Starting point is 01:33:57 And how long had it been since you skated, Aaron? Probably seven years. Seven years. I was a lot heavier, and I... You were at 320 at that point. Yeah, and I remember really, because I was like, I'm going to do this, and I was real wobbly, and then after a few laps,
Starting point is 01:34:16 I was expecting everyone to be watching me, because I thought I was that good, so fucking stupid. Oh, I'm like, oh really, I'm just a fat guy roller skating and I'm old. But you're waiting for them to notice how fucking cool you look. Yeah. Shit, I didn't know your old man ripped. Yeah, everyone's gonna high five my fucking kids.
Starting point is 01:34:41 They'll hoist you up on their shoulders. They probably got beat up the next day. Oh. Well, despite that, you did quite well. And Monica, let me thank you in public. You have sat out the last two skatings. I have. But then it was your birthday, and so I thought,
Starting point is 01:35:00 I gotta do it for your birthday. That was so sweet of you. And you did it, you did. I did it. You did quite a bit of skating. I did like 12 laps or something, maybe more. I think more. I wanna say you had three different sessions, didn't you?
Starting point is 01:35:15 I had multiple sessions. Even after eating, I put them back on. And I learned today, earlier today, that you actually had gone out by yourself to do a little trial run like the garage in Tennessee. I did, where I learned to ride my bike. Yes. Yeah, I got a little nervous
Starting point is 01:35:30 not having been on them for a couple years, and I was first to arrive at the party. And so I just went out on my own and did a little practice. Yeah. Found my sea legs. I'm not good on them. Ryan tried to help me. He was like, you know, go more side to side.
Starting point is 01:35:51 I go more front to back, I guess. Yeah. It's sort of my proprietary move. So I'm just not very good at it, but it's fine. It's hard for me to not be good at it, but yes, you kill it. I ate shit though, let me say that. Once. And then by the way, on the way there, I said to Aaron,
Starting point is 01:36:08 okay, if you haven't gone in a long time, my prediction for you is you think you're gonna be bad, and then you're gonna get out there and go, oh my God, I'm actually quite good, and then you're gonna get overconfident at about 15 minutes, and then that's when the big wreck comes. And I had been skating for a good hour and a half, two hours at this point, I was very loose,
Starting point is 01:36:24 I was feeling very confident, and I decided, you know what, as I skate by Joy, because she was high-fiving people on the sidelines, I'm like, you know what's gonna be radical, is I'm gonna blow by shooting the duck. Oh, is that what you were trying to do? Oh, did you think I just fell out of nowhere? No, I knew you were doing something with Joy,
Starting point is 01:36:40 but I didn't know that was possible. So for people who don't know shooting the duck, you get on one skate and you bend down. I can't even bend down on one leg if I wasn't wearing a skate, especially with the biking from the day before and the two hours of skating. And I got halfway into this commitment and realized I'm going down,
Starting point is 01:36:55 and I went down pretty hard on my hip. Yeah, it's scary. Luckily you were on the rink at that moment. I was. And you were right behind me, so you saw the whole thing. I did. Made a loud enough noise that I think everyone looked. I missed it. I don't think so, actually.
Starting point is 01:37:09 I was skating, but I missed it. Yeah, I don't think so. I'm not sure if I... I wish you had. Well, I wish I had, but you skated up to me and you go, the fucking duck bit back. And I was like, what? And I didn't even know that happened. You got up really fast.
Starting point is 01:37:29 And you didn't look hurt. That's the goal. Yeah, but I knew you were. And then you hadn't seen Fall Guy. Right. And so, have you seen Fall Guy?
Starting point is 01:37:40 I did, yeah. Oh, what a movie, right? So good. Every time a stunt's over, they give the thumbs up, like, I'm good. They're not good, their back's broken. Right? So then I gave the thumbs up for the people who had seen it,
Starting point is 01:37:52 but you hadn't seen it. And then I felt compelled to try to explain to you my jokes. So I was in a good deal of pain, and I had given you the thumbs up, and I needed you to know that, no, I was making a joke. Yeah, I said, are you okay? And then he gave the thumbs up and I was like, oh.
Starting point is 01:38:04 And then you had to come back and tell me that, in fact, it wasn't okay. It was not okay. But I kept going. I was like, is it, is my hip fractured? Maybe it's fractured. Let's just keep going. There's nothing they can do for that.
Starting point is 01:38:19 You can't put a cast on your hip. This is why it's really, roller skating's so tricky, because you have to commit if you're gonna have fun and if you're gonna be good. But then you will fall. So you're choosing. Especially if you try to show off like me. Well, once you're feeling it, like everyone's gonna try,
Starting point is 01:38:38 and that's what I don't do. I just do half because I don't wanna fall, and I don't fall, so that's great. But then it's half. There were very few falls last night. Right. In fact, shockingly few falls. Because at Delta's party the first few years, people were going home really bad.
Starting point is 01:38:56 People had wrist injuries. It was like 20% of the dads were fucked up the next week. Ana fell last time. Yeah, she went down pretty hard, I saw that. Yeah. Kind of great though. And everyone also, you see it, and then you just like, you kind of look away.
Starting point is 01:39:11 You look away, yeah. Let people recover on their own. Well, no, so that's a great point. As you know, I like getting embarrassed. And when I fell down at the movies, you know, I was laughing hysterically. But the pain was so sharp. Like, I wanted to laugh because it's just embarrassing.
Starting point is 01:39:26 I tried to shoot the duck, Joy watched the whole thing, her friend's watching, she doesn't know me, so there's a new person. At least Joy is like my sister, I've known her for 12 years. It was painful enough that I couldn't actually get to the laughing part. I just got to where I could shake it off.
Starting point is 01:39:40 You were smiling pretty big though. Well, I was performing. Yeah. I came out of retirement from acting and put on a real good thing, then gave a thumbs up, and only two people got it, because I thought everyone had seen Fall Guy and they hadn't.
Starting point is 01:39:53 Ha ha ha ha. Stay tuned for more Armchair Expert, if you dare. And then Kristen somehow, I mean, mastered the logistics. But our favorite treat in the world from Michigan is Sanders bumpy cake. For people who've never had Sanders bumpy cake, it is a chocolate, damp devil's food cake base, and then these huge ribbons of Twinkie filling, all covered in a fudge
Starting point is 01:40:35 icing. Did you try it? Yeah, I did. I hadn't had it before. Did you like it? Yeah, I loved it. You're not, that's not your thing. Oh, fuck yeah. Really good. As a kid you didn't love chocolate, right? Yeah, right, right. I do now. I had some for breakfast and I fucking went nuts. I had my immune suppressant before the party because I knew we were gonna have bumpy cake, and I ate way too much. So I coupled the hip pain with, like, making myself sick, but then I was like, you must get out and push through all of this.
Starting point is 01:41:08 Did another couple hours of skating. It was fantastic. It was a great, great party. It was a lovely party. And as I told you, what was really kind of novel about this party, because I've had them in the past, is I'm always feeling really guilty. Like Tom Hanson, he drove 50 miles across town through rush hour traffic to get to the party,
Starting point is 01:41:26 and he's older, and then I'm out skating the whole time, and the whole time I'm like, I gotta go sit and talk to Tom. He drove all the way here, and then virtually everyone that's showing up, I feel like I should be talking to them for a half hour, and then I had this light bulb moment where I was like, everyone here is a blast to talk to.
Starting point is 01:41:41 When I look around, everyone's quite happy talking to whoever. I'm not needed here. I can stop feeling guilty about that. Did you have the moment where you were like, oh, all these people I've collected? Yeah, I tried to have that moment. I did.
Starting point is 01:41:55 I was in the bathroom and I actually even did my mantra for like two minutes. I was like, I really want to, yeah, have that moment and see that. And then Tom Hanson, who knows me so well, he hugged me and he said, are you able to let any of this love in? And I said, I'm trying real hard. But also I'm pretty sure everyone's here
Starting point is 01:42:17 just because Kristen invited them. Like if I'm being honest. So silly. That's so unfair. That's so unfair to all of us. To you. Yeah, I mean. To anyone who went. I don't know, still my residual low self-esteem
Starting point is 01:42:34 or something, I don't know. It was the funnest party I've ever been to. And there was no fucking booze there. That's saying something. Yeah, I mean, I'm still riding on it today. I gotta push back a little bit. I guess you're not counting your birthday parties, because Aaron's birthday is July 2nd,
Starting point is 01:42:52 which becomes a 4th of July Aaron's birthday spectacular. It was often at his dad's farm, and it would be like a three-day party with 50 people camping. Those were fucking awesome. Those were damn good. Those were good, yeah. But I'm happy you had a great time.
Starting point is 01:43:11 I fucking had fun, man. I had so much fun. It was really fun. So much fun and I don't want you to get the wrong impression. I felt very loved. Good. I mean, when you were blowing out your candle, I looked around and I was like, oh, this is so lovely.
Starting point is 01:43:26 Well, what I thought of many times is like, this is, you'd be hard pressed to find a better group of 40 human beings. So it was a really good group. I felt lucky that I had been collected. Oh, nice. Over the 50 years. I was happy to be there.
Starting point is 01:43:41 Did you have a favorite conversation with anyone? I had so many. I had one I can't repeat. Oh, okay. That was the most memorable. Was it naughty? It was with Eric. Yeah, it was naughty.
Starting point is 01:43:50 Oh, okay. He's so open though. He's so provocative. And I can't because it's not about him. Okay. I can't wait to hear that. Jesus. He never fails.
Starting point is 01:44:02 Yeah. I think there's not one thing he can say that will get me, and man, man does he. I got to talk to Joy, I ate with her and that was so nice. I hadn't talked to her in a long time. I was excited about that. I was more looking around seeing people I love talking
Starting point is 01:44:19 to each other and watching you guys chat. I was like, oh yeah, my sisters are chatting. Yeah. And she was wearing all Day-Glo yellow, like full. Oh yeah, like neon. She looked amazing. Yeah, neon, fishnets. So awesome.
Starting point is 01:44:34 We came up with my superhero name, because we were of course talking about spades. And her friend she was with was Black as well. And she's like, this is the upside-down world. They play spades, we don't play spades. And then her and I came up with that. My superhero name is The Appropriator. And my catchphrase is, I'll take it.
Starting point is 01:44:52 I see something from your culture I like, I'll take it. The Appropriator. Oh, that's funny. Really fun to talk to Nate, too. I hadn't seen Nate in such a long time. Yeah, again, if Nate's at a party, what am I worried about? Anyone finding someone to talk to?
Starting point is 01:45:08 How about you? Did you chat with anyone new that you were like, oh, that's good? I loved catching up with all the, yeah, like the old Groundlings, your crew from back then. Like I hadn't seen Ben, Tim, all of them. Yeah. So that was all very fun and super comfortable. Like we all just dove into conversations that were fucking great.
Starting point is 01:45:33 Yeah, Ben sent me a text like, I talked a lot with Aaron Weekly, it was so fun. Yeah, it was great. And that Vaxen, of course, probably stole the show. He wore the same white outfit Aaron had, but his was short shorts. Oh my God, yeah. And he was very sexy on those rollerblades. He was so athletic.
Starting point is 01:45:46 He was so good on those. Yeah. Like I said, best party I've ever been to. Oh man. It's kind of your birthday too. You guys get to share birthdays. Yeah.
Starting point is 01:45:59 I like it. Yeah. Kristen did a damn good job there. She did. She really did. And she was very worried. She said, boy, he's really been talking a lot about how he's very grateful he didn't have a party.
Starting point is 01:46:13 And what a load off. I think she had heard me say like four times between my birthday and yesterday, I'm so glad, I would hate a surprise party. And the poor girl had planned a surprise party. But what do you think now that it's happened? I'm so happy. And that just goes to show I'm known better than I know myself in a lot of domains.
Starting point is 01:46:37 Yeah, I was delighted. As long as I didn't have to go through any of the anxiety of it all, it was absolutely delightful. Would you have been upset if it was a real surprise? I guess there's no way to know. I don't think so, but I just, I don't know how you get me to the roller rink. Exactly, it's too hard. I think she would have had to tell me.
Starting point is 01:46:53 I'm just so annoyingly curious about why we would be doing it. Yeah, because Penne wasn't gonna take you to the roller rink. The only thing that had been planned, and I had been tricked on, is that Penne and Nate and I were supposed to have dinner that night. That was the big plan. And even like on Monday I asked Kristen,
Starting point is 01:47:11 can I go to dinner with Nate and Penne? And she's like, yeah. I'm thinking she must've been laughing her buns off. The plan was coming together perfectly. And then so I was both really excited about the party, and then I was like, oh man, I was really looking forward to dinner with Penne and Nate. So we just decided we will also go to dinner next week.
Starting point is 01:47:28 Were you guys going to Morton's? I hope. I hope we bring George Kittle with us. Aaron and I decided the best compliment you could give any guy is to say, like, oh, I'd love to go to Morton's with that dude. Like, if you don't know him, but you've seen him from afar, and you go,
Starting point is 01:47:46 fuck, I'd love to go to Morton's with that guy. I mean, he is a great guy. I like Dan Campbell. Can you imagine going to fucking Morton's with him? He goes directly to Morton's. Don't stop. You guys wouldn't even have to say it. We'd get there and the food would be on the table.
Starting point is 01:48:01 Oh, goodness. What's the equivalent, is there an equivalent of that for the gals? Oh, like the spa? Oh, yeah. No, because that's a solo thing. Yeah, I'd say the mall, I guess. Oh, yeah.
Starting point is 01:48:14 Shopping. Yeah. Go right to the mall with her. Yeah, I think I'd go shopping with her. Like what do you want to do with Ashley and Mary Kate the first time you hang out? Besides tour. Oh, my pants just fell off.
Starting point is 01:48:25 Exploded. My underwear fell off. I don't know if I can go shopping with them. That's confusing, I imagine. I would be self-conscious if I were you, because I'd be afraid everything I said I liked, you'd worry they didn't think it was good style.
Starting point is 01:48:41 I guess we would go to their store. Okay. And I'd think, what do you guys think I should get? I'll pay. And they'll be like, I know you'll pay. We don't give discounts. We don't fuck around like that. And then they'd pick out some stuff for me
Starting point is 01:48:54 and then we'd all go have martinis. They would smoke cigarettes. I wouldn't. Oh man. You wouldn't smoke with them? No, I've never done it. And even for them, I won't. Wow.
Starting point is 01:49:03 I'll roller skate for you, but I won't smoke a cigarette for you. Let me go to bed early. Oh. Yeah. Too early. Yeah. Right away. Yeah.
Starting point is 01:49:14 Anyway, well it was lovely and I hope you feel happy today. Oh I do. Just I took my immune suppressant and I think I have a cold that came back and I think I have a fractured hip, but also just beaming with gratitude. Yeah. I felt hungover this morning.
Starting point is 01:49:37 In fact, it reminded me of so many times. I don't know if you remember this, do you remember I got like a really cool sweatsuit for Christmas and a watch, and I was at your house? We were in the backyard, and I somehow fell and fucking split my lip like wide open. When you were little?
Starting point is 01:49:53 No, when I was like home for Christmas, and our Christmas started at a certain time, and then we had stayed up till nine in the morning the day before, and then so I didn't wake up till five. So I got to Christmas Eve, our big family tradition, quite late, and I just had this enormous bloody scab on my mouth from the night before. We had a lot of those.
Starting point is 01:50:16 So anyways, when I woke up, I was like, I have a headache from eating all the cake, I'm sick, and my hip hurts. I feel like I was in Arizona. I was at 99 or something. Okay, should we do a little just quick facts? Some facties? Yeah, this is for Reid Hoffman.
Starting point is 01:50:31 Okay. You said that Martin Luther raised literacy rates from 3% to 95%. According to AI: while there isn't a precise percentage readily available to quantify how much Martin Luther raised literacy rates, historical research indicates that the Protestant Reformation, largely driven by Luther's teachings, significantly increased literacy levels across Europe, with some studies showing a difference
Starting point is 01:50:55 of nearly 20 percentage points between predominantly Protestant and Catholic regions. Essentially, areas with a larger Protestant population had substantially higher literacy rates, but there's no percentage. I'm clearly way off giving that percentage. But you got it from The WEIRDest People in the World. Joseph Henrich.
Starting point is 01:51:13 Joey Henrich. Friend of the pod. Go back and listen to that one. Is our unemployment rate as low as it's ever been? The unemployment rate was 3.4% in January and April 2023, the lowest in 55 years. 55 years. The last time the unemployment rate in the US was this low was in the first five months of 1969.
Starting point is 01:51:34 The average unemployment rate during the Biden administration was 4.1%, really low. Yeah, very low. A lot of countries have 10% and above. Yeah, 1984 did come out in 1949. But we think he wrote it in 1948. Yes. But Reid, our guest, had said
Starting point is 01:51:53 it was 48, which was clever, because 1984 was a flip of 48. And we still think that's true, but it did not come out until 49. The price of Bitcoin today is $104,627.70. Wow. Wow, wow, wow. Aaron, did you watch that greatest heist doc on Netflix?
Starting point is 01:52:16 Not yet. Okay, well, in a nutshell, these two incredibly weird people, I don't feel bad saying that. Their names are Ilya "Dutch" Lichtenstein and Heather "Razzlekhan" Morgan. Yeah, so a bit of a clue in their names. And they stole, stole, I always say stole, that's one of my things.
Starting point is 01:52:39 They stole 70, yeah, there they are. Oh, man. Boy, that photo of him on the left. He might have Stockholm. This makes me feel sad, actually. Oh, it's not, yeah, you are not happy for either of them. But they stole $72 million in Bitcoin that they really couldn't unload,
Starting point is 01:53:02 so they just held it. By the time they were arrested, it was worth $4.5 billion. So it's the greatest theft in the history of the world. Oh my God, no, I have not seen this. Now worth $10.8 billion. Now worth $10.8 billion, if you had $72 million. It was stolen from Bitfinex or Bitfinex, I don't know,
Starting point is 01:53:23 and it's a Hong Kong based virtual cryptocurrency exchange. So that's yikes. We've had friends that were involved in some thefts of that, right? Yes, we did. Well, that was of... We have a friend that's a victim of it. A victim, that's what I mean. Yeah.
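A quick aside for the numerically curious: using only the dollar figures quoted in this fact check (none independently verified here), here's a minimal back-of-the-envelope sketch of the appreciation in Python. It's illustrative arithmetic only, not anything run on the show.

```python
# Back-of-the-envelope math on the Bitfinex theft figures quoted above.
# All three dollar amounts come from the episode, not outside sources.
stolen_value = 72e6       # ~$72 million in Bitcoin at the time of the theft
value_at_arrest = 4.5e9   # ~$4.5 billion when the couple was arrested
value_today = 10.8e9      # ~$10.8 billion quoted in this fact check

print(f"Growth by arrest: {value_at_arrest / stolen_value:.1f}x")  # 62.5x
print(f"Growth to today:  {value_today / stolen_value:.1f}x")      # 150.0x
```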
Starting point is 01:53:41 Yikes. A lot of money. Which I can only imagine what it's worth now, because that was what, five years ago? Yeah, it was, yeah. Oh, I have a thing marked here. So you made a car. You designed a car using AI.
Starting point is 01:53:54 Yes. And you decided you wanted to show it. I would love to. Rob, you want to put it on the TV, because you're so good at that now? Yep. You can email it to me. Oh, and I'm going to send you one picture after that
Starting point is 01:54:04 of a Ted Segers poster that I, well, there's two, there's three actually that I really like, let's see. Oh, I'm gonna send four things. Oh my God. Because then I did something for our show. Exciting, all on AI? All on AI.
Starting point is 01:54:19 I played around with some merch stuff on AI. That was fun. Ooh, okay, so. Right, so. For the listener. Ooh, okay, so. Right, so. For the listener. Yeah, so I want a 15 passenger van for Nashville because I assume we're always gonna have so many people staying with us
Starting point is 01:54:33 that anytime we go to dinner, we're gonna need like tons of seats. So then I was like, well, I want a 15 passenger G van, a Chevy. That is amazing. But I want it to be the John Player paint scheme. I want it wide body. I just said like not a ton of things and it spit this one out. It spit three amazing ones out.
Starting point is 01:54:50 Did you say black and gold? Look at that. Oh yeah. That's the John Player livery, the famous paint scheme. Isn't it interesting how they fuck certain things up? Like, look at the grille. It's kind of trying to say Chevrolet, but it doesn't. Yeah.
Starting point is 01:55:02 It's weird. And then, like, there's normal letters on the tires and then weird abstract ones. And that's the third one. Look at that fucker, can you imagine? Here's the weird thing. I did look, and I'm only noticing now that it's blown up that big: it's got two gas... Oh, it does, yeah. Yeah, two fuel doors. That's weird. Right, you see that, Monty?
Starting point is 01:55:25 You need a lot of gas for that sucker. Well, he probably knows I'm gonna have my foot in the kitchen the whole time. Okay, now go, Rob, to the other, can you go to the other? What is John Player? It was a cigarette brand that sponsored Formula One and racing, and they have this signature black and gold.
Starting point is 01:55:43 make a poster for a beer brand named Ted Segers in the style of Miami Vice. Yeah, I love this one. Isn't that great? That's really cool. And again, what does it say? It doesn't say Miami Beer, but it says like, Nam-Yam-Yoo Beer.
Starting point is 01:55:58 Almost makes you think it's like a Thai beer or something. And like underneath it's just squiggles kind of. Hieroglyphics? Yeah. To get that Egyptian. Okay, now, then I said, give me a beer brand, you know, Ted Segers, I want a muscle car, a motorcycle, and a muscular horse.
Starting point is 01:56:14 Wow. Ah. Ah. This one is a little. That fucking horse. That thing is ripped. Oh, this is so funny. All right, and then I was like,
Starting point is 01:56:25 what if a beer brand named Ted Segers sponsored a hydroplane race boat? And it made that. This is incredible. Yeah, but what's on the side? Did you? I think it's... I think it's barley or hops. Oh, probably.
Starting point is 01:56:41 Yeah. Okay, and then... All right, and then, look at this beautiful one. That's pretty. Get the hell out of here. It's really pretty. What'd you say, a crow with a cherry? Yeah, a crow with a cherry in its beak.
Starting point is 01:56:52 Okay, but I had to do many to come up with this one. This one took a while. Did you see this? What's this? That's your, oh, that was your invite for the party. Oh, I didn't see that. It says yacht rock, right?
Starting point is 01:57:06 That's not an H, it's an R. It's Y-R-H-C-H. Yeah, you're right. And Captain's questionable. There's, oh, he's got sneakers on. Yeah, he's a jacked crow. No, I hadn't seen that. That's great.
Starting point is 01:57:19 Look at that vascularity. Oh my God, that crow is a fucking animal. How scared is that other crow that's flying in the background? He's probably flying to get the fuck out of there. I didn't realize she used AI for it. Oh, must have. Well, that's it, that was it. Those are all the facts. Fun AI detour. Yeah. Unless you're listening, I guess that'll get cut out. I was just thinking, if you're listening and you couldn't see, go to YouTube and watch it.
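For listeners wondering how prompts like these become pictures: below is a minimal sketch of the same kind of request sent to an image-generation API. The show never says which tool was actually used, so the OpenAI client, the dall-e-3 model choice, and the parameters here are illustrative assumptions, not a record of what was run.

```python
# Hypothetical recreation of one of the prompts above, assuming
# OpenAI's image API; the episode never names the actual tool used.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

result = client.images.generate(
    model="dall-e-3",  # assumed model choice, for illustration only
    prompt=(
        "A poster for a beer brand named Ted Segers "
        "in the style of Miami Vice"
    ),
    size="1024x1024",
    n=1,
)

# Image models often garble embedded text, which is why the poster
# reads like "Nam-Yam-Yoo Beer" instead of the requested brand name.
print(result.data[0].url)
```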
Starting point is 01:57:45 Y'all. All right, love you. We love you. Love you. Love you. Follow Armchair Expert on the Wondry app, Amazon Music, or wherever you get your podcasts. You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple podcasts. Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.
