Tangle - Suspension of the Rules - Prolific podcaster Andy Mills joins the show today to talk about The Last Invention and more.

Episode Date: December 12, 2025

Today on Suspension of the Rules, Isaac, Ari, and Kmele are joined by Andy Mills from the prolific podcast The Last Invention for possibly one of the more important conversations we can all be having right now. They talk about the current state of artificial intelligence development, what the future holds, and what we should potentially do about it. After that conversation, the guys chat further about the EU v. X controversy, some breaking news about an oil tanker we seized off the coast of Venezuela, and the latest from the Trump administration about checking non-citizens' social media backgrounds before entering the country. Lastly, a good grievance section where Kmele drops the ball again and forgets the prompt. Ad-free podcasts are here! To listen to this podcast ad-free, and to enjoy our subscriber-only premium content, go to ReadTangle.com to sign up! You can subscribe to Tangle by clicking here or drop something in our tip jar by clicking here. Our Executive Editor and Founder is Isaac Saul. Our Executive Producer is Jon Lall. This podcast was hosted by Isaac Saul and edited and engineered by Dewey Thomas. Music for the podcast was produced by Jon Lall. Our newsletter is edited by Managing Editor Ari Weitzman, Senior Editor Will Kaback, Lindsey Knuth, Bailey Saul, and Audrey Moorehead. Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:00 Coming up, we've got Andy Mills, one of the most prolific podcasters of the last couple of decades, joining us to talk about The Last Invention. And then Kmele, Ari, and I discuss the EU v. X controversy, the latest from the Trump administration on checking the social media of people coming into the country, as well as some breaking news about an oil tanker we've apparently seized off the coast of Venezuela, and some grievances where Kmele once again forgets the prompt. It's a good one. Good morning, good afternoon, and good evening, and welcome to the Suspension of the Rules podcast. Last week, a place where none of my co-hosts showed up. This week, a place where they're both here. And we have a guest joining us, which I'm really excited about. As many of you know who have been listening to the show for any period of time longer than a month, a little while ago we had a partnership
Starting point is 00:01:04 with Andy Mills and Matt Bowles, who are, you know, in my view, I think in most people's view, two of the most prolific podcast creators of the last two decades. They were responsible for more hit shows than I can list here. And we were honored to work with them
Starting point is 00:01:19 to get a written piece up about their podcast, The Last Invention. Phenomenal show, in my view, about the current state of artificial intelligence development and what the future holds and what we should do about it. So joining us today is Andy Mills to talk about the last invention and have an existential crisis altogether.
Starting point is 00:01:49 Andy Mills, welcome to the show. Thanks so much for being here. Well, thanks for having me, Isaac and Ari and Camille. Is it Camille? Is it how are you saying? Sure. Cameli. Cameli is the way people say it often and the uninformed. The Canadians. Andy, you have released to me the, I mean, I'm not, I don't want to glaze you in the first two minutes of the show. You can glaze.
Starting point is 00:02:19 Glaze away. I'll glaze. All right, go ahead. Some uncomfortable slang to start. Yeah, yeah. Podcast of the year in my view. I don't say that just because Tangle got the chance of partner. with you guys in the early days and do our little written version of it, which I think at this
Starting point is 00:02:33 point, a lot of our readers and listeners are probably familiar with. Listening to it, I feel like it is above the target of one of the most important conversations that we're having right now as a country and as a planet. I mean, we're seeing, with all the news just coming out this week about Trump and how he's handling some of the AI chip sales to China and what's allowed and what's not. I mean, it's clear this is like a global phenomenon, global issue. I think maybe to start, I'd be curious just to hear from you how you're feeling about, like literally how you're feeling about the AI space as it stands today. And I know it's a broad question, but I'm wondering, like, you get to the other side of this story. You've produced this podcast. You have whatever it is,
Starting point is 00:03:20 10 episodes out. I'm sure there'll be some bonus and follow-ups. Like, are you feeling angsty? Are you feeling like confident? Are you feeling like, oh my God, people don't understand what a big deal this is? Like, where is your headspace now coming out of the other side of the tunnel? Well, as you might expect, it's definitely been a reporter's journey because when I first got interested in this, it sounded so crazy even to me. And it was years ago that I first heard about this seemingly sci-fi plot that smart people were saying, true, that the people who are creating AI weren't aiming to build a product, but were aiming to build something much more like a new intelligent species. And some of them believed maybe something
Starting point is 00:04:08 closer to like an intelligent God. And as I learned that this wasn't a fringe view, and then as I watched more and more investment go into AI, I thought, okay, I want to report this out. And I'm definitely coming from a place of like skepticism and fear. We're kind of in the emotional, they were the emotional companions to the early reporting. And also just like any reporter who you have a story that you feel like isn't getting enough attention. There's a part of you that thinks,
Starting point is 00:04:42 oh, what an amazing opportunity to tell this story that I don't feel enough people know about. But there's another part of you that thinks, am I going to look crazy telling this story in these terms, taking seriously these big, bold claims? And there was a moment early on in the reporting where when I would tell my friends what I was working on, like even just like last March, some of them would be like, Andy, I don't know if that's a good move. They almost treating it as if I were saying, I'm taking alien abductions really seriously. And I have this whole thing where I've met these people who really think they were abducted by aliens.
Starting point is 00:05:17 There's kind of a knee-jerk reaction against that kind of reporting. It felt similar to that. So how I'm feeling now is much more calm. I feel encouraged by the fact that even just since last spring, I think this has become a much more mainstream conversation. I think people are really, I'm finding and encouraging how many people aren't just taking the knee-jerk reaction and saying, oh, this is crazy or, oh, this is terrifying, or, oh, this is awesome. it does feel as if here in this moment where this technology is still emerging, and there's a lot of
Starting point is 00:05:55 questions, people are paying attention and they're asking those questions. And I feel like that's about as good as it can get in our field, you know, that we are trying to report information, tell people's stories, bring people different perspectives, and they engage in it and engage with those views in good faith as they go to inform their own. I know that that's not everyone, but I am seeing that is an enormous amount of the public. In the show and in some of the writing that we did and like our companion piece to the podcast, you kind of divide people up into these three different groups.
Starting point is 00:06:33 You know, there's the accelerationist, the people who really want to push AI forward, AGI forward, for various reasons, whether it's, you know, just to beat China and be the ones to have it or greed, selfishness, money. There's the doomers, the people who are really scared about the future of AI, they think we need to pump the brakes, immediately need to regulate, outlaw, stop this from happening.
Starting point is 00:06:56 And then there's the scouts who are kind of the, I don't want to say they're in the middle or like the moderates, but there are people who I think are in almost a wait-and-see and or are really cognizant and sympathetic to both the doomers and the accelerationists and are kind of looking for some sort of middle ground, maybe to advance slowly or advance with heavy regulations. I'm wondering on the other side of doing the show
Starting point is 00:07:22 how you would describe yourself now. Like, where would you put yourself in that framework that you have? I think I started off a little bit more Dumer Scouty just because I have friends that are. I think it's just, I should be open. One of my friends is Sam Harris, and he's featured in the podcast, and he was one of the people who, 10 years ago,
Starting point is 00:07:46 got me thinking deeper about this. And so I was coming to it with a lot of that domery scout mindset. And I hadn't really heard the case for accelerating it. And I don't think that the people who are making the case that we should accelerate this technology, I don't think that they're often making a very good pitch. But the more time I got to spend with them, the more I came to see that their view is really interesting.
Starting point is 00:08:11 And I think worth seriously considering. So I find myself right now, I think all three camps have a good case to be to make. I think that they have evidence on their side. And I think like now is the time to take a beat, hear where they're coming from, and slowly develop your own view. So I'm kind of comfortably not in a camp at this moment.
Starting point is 00:08:36 I will say this, though, that part of me that was super skeptical, I'm less skeptical now about the possibility we create AGI and that we do it within decades. That's something that I didn't believe a year ago that I do now. I think it is likely that we are on the verge of this hinge moment in human history. And I don't know if that's three, five, ten years. No one does. But it doesn't appear to me that it's like a thousand years away.
Starting point is 00:09:11 It appears as if the time has come for us to take this really seriously because it does seem at least somewhat likely. So I'm a believer, I guess, is the camp I'm in. I am inside of that world. Interesting. Okay, this is a great opportunity then. I would say I'm a doomer after listening to the show. Interesting.
Starting point is 00:09:36 I think I was a dumer going into the show, and I think I'm a dumber coming out of it. And I think I think I went into the scout world at some point, and I never touch the accelerationist world. And like, I would say to define that clearly to me, I am in a position around, I'm like, I don't want any more than what we have right now. I think this is enough. Like I chat GPT is cool. Grock is okay, I guess. Like being able to put, you know, MRIs through AI or like do math really quickly and, you know, get better diagnoses. All that stuff's great. I'm like, we can do that now. I'm good. Let's just stop right here. This is totally good for. for me, which I would define as doomer, like, slam the brakes and we're okay with what we have.
Starting point is 00:10:23 Right. Don't make AGI. Don't make something more like a species. Don't make something that could completely automate all labor. You're saying, don't do that? Yeah, I'm saying don't do that. But I would like to hear, I mean, you just said, like, you went and you're a little bit skeptical in different ways and, like, fully acknowledging the fact that you're like, I don't know
Starting point is 00:10:45 how I'd place myself in the camp. Yeah. I do know this. I can make the case for any camp. That's what I want. I want you to steal man the accelerationist camp for me. Let me hear what you think are like the best arguments for that view. Oh, this is fun. So if you're looking at the accelerationists who are at the head of, say, a company like Anthropic or Open AI, they don't like to say that they're accelerationists because in their mind, Their motivation to go as fast as we can and not to over-regulate it is because they believe that the technology is very likely to be built by someone and soon, and they think that if it's made irresponsibly, it could have catastrophic effects forever, that there would be, that this hinge moment in human history would be one of catastrophe if the wrong, people build it. And so they think the only way to stop that outcome is to make sure that they're the ones that build it, that the good guys make the good AGI before anyone makes the bad AGI. And I don't think that that's a crazy logic if you can buy onto other aspects of their belief system, which I know that Camille doesn't exactly. But that's one case. The people like Peter Thiel or Mark Andresen, who I think are more comfortable with this
Starting point is 00:12:16 identity as accelerationists. One of the cases that they're making is that we always get nervous when a transformative new technology comes out. And there's this instinct inside of us to pull back, to get afraid, and that when we let that fear guide us, it doesn't lead to good outcomes. And you almost always end up in a situation where you were holding back human progress that you could have experienced in earlier times. think that this is possibly so transformative that it's impossible for us to even imagine
Starting point is 00:12:55 what is on the other side of AGI, and that to let our fears push it back 20 years, 30 years, 40 years is something that we're going to regret whenever we get there. I find that to be somewhat compelling, and especially if you buy into the idea that what it will be like to live in this AGI world. We maybe don't know all the details. But it is a world where there is more intelligence. And that is a, getting more intelligence has been the source of a lot of human progress. That just think about, like, why would you not want more intelligence working at Tangle? You know, over at Tangle, wouldn't you be happy to welcome in, like, a limitless amount of more
Starting point is 00:13:39 intelligence, more intelligence to make decisions, more intelligence to develop new products, It's more intelligence to investigate the world and its mysteries. And I find that to be compelling. And I also say this. The accelerationists, they have some good arguments when it comes to what the Dumers are afraid of. I won't get into them now, but I do find that they have some good arguments that they are getting, the Dumers, quote, unquote, are getting ahead of themselves. They're a little bit over their schemes. that it's almost they have this philosophical belief
Starting point is 00:14:13 that is causing a lot of their fear and I agree that the evidence isn't as compelling as some of the Dumeers make it sound like it is. I wonder if there's a response here where there's less of a Dumer critique of the accelerationist point of view and more of a skeptical critique of the point of view. So saying, look, both of you,
Starting point is 00:14:39 not quite the same as a scout argument that's saying we need to know more before we make decisions, but saying both the doomers and the accelerationists are in a way very optimistic about the trajectory of this technology
Starting point is 00:14:53 because they both are buying into the possibility that it's species creation globally transformational things where it's very possible that Isaac's dream scenario that we're done here is just what happens. I know that
Starting point is 00:15:08 I have this calendar event on my Google calendar for next August when the first threshold of a prediction from Sam Altman or somebody similar saying AI is going to be better than humans at everything. That's the first milestone. Then early 2027 is the next one when there's another prediction about that. And I just don't truly believe that that will happen. I think it's going to be very tough to put it in a really succinct way. it's going to be very tough to build something to replicate a word that we can't define already. We don't know what intelligence is.
Starting point is 00:15:43 We don't know how it works in humans. And to say that we're going to be able to replicate it in a way that surpasses. The thing that we don't understand doesn't seem totally like it's something that is a foregone conclusion to me. Well, I'm just a skeptical person in nature, Ari, so I'm with you, 100%.
Starting point is 00:16:01 I feel the same way. I'm most comfortable playing the part of the guy who's skeptical about any claim, little alone a claim as dramatic as the claims that these people are making. And so when I say, like, I'm less skeptical now, all I'm saying is that I have been convinced that it's possible and even likely to come
Starting point is 00:16:22 in a way that two years ago, me, would not believe. You know, like, I'm in a strikingly different place now than I was a couple years ago, but it's not that I've become certain, It's just that I'm taking the possibility and the likelihood a lot more seriously. But what you're pointing to with these predictions that are being made, I think that there's a little bit of a... Mark's like showmanship element to it. Yes. And I think the smart thing to keep in mind is that we need to be really skeptical about the specific claims coming from people at the head of companies that are raising money.
Starting point is 00:17:01 And it's not because they're bad people. It's just the incentives are, there for them to really hype this up and to get specific enough to get people to give money now. You know, oh, you don't want to miss out on investing in my company now because we think we're going to hit this milestone. And it could be by 2027. I think we should for sure have a lot higher bar of skepticism in the situation like that. And I'm even with you that it's likely that the key that they believe they've found to unlock this AGI, this recipe of things. And the transformers, the LLMs, the, you know, scaling up the data and scaling up the compute, like that probably isn't going to be enough.
Starting point is 00:17:48 Like they're probably going to have to innovate more and more and more. And there's a lot of people out there who understand the technology much better than I do, who would say that. You know, I interviewed two of the godfathers of AI in the series, Jeffrey Hinton and Yahshua Benjio. And their stories are really striking because they have been working on this technology in Hinton's case since the 1970s. They dedicated their whole lives to it. And now they're going around saying, we're not prepared, and this is coming a lot faster.
Starting point is 00:18:24 And Hinton, he doesn't buy the idea. He's going around doing this, right? Going around telling people, we need to get ready, this thing seems poised to overtake human intelligence and it will have the ability to rule the world and if we make it irresponsibly we could cause our own demise. But when he hears Sam Altman say five years, he's still like no way. He's friends with Demas Hasabas, who runs Deep Mind, which is owned by Google. It's where Jeff Hinton used to work. And he has a friendly disagreement with Demas when Demas says it could be five or ten years away. But Hinton would say, I used to think it was like 70 years away at best. And now
Starting point is 00:19:04 I think more like 20. Similar to the way I hear people talk about fusion, though. Sorry, not to cut you off. No, go ahead. It could be one of those things where the timelines always going to be both further and sooner than we think, and then we put in some progress towards solving a problem. But the problem space is so large
Starting point is 00:19:21 that any progress we make just reveals what other problems there are on the horizon. Yeah. And then the horizon keeps getting pushed back. Yeah. I mean, it's a hard thing for, me because I've trained myself as like a reporter to never try and predict the future, never tell people what the outcome of an election is going to be. It served to be well. That's intelligence.
Starting point is 00:19:45 To not get into that prediction market game. But so I'm a little bit uncomfortable just by the very nature of it. But I do think that there's enough evidence out there and that enough of these insiders are sincerely concerned about this, that we should take it really seriously. And we should form our own views, and we shouldn't allow a knee-jerk skepticism that I sometimes hear from people, this idea that this is all just an attempt to make money, but this is like, oh, this is all just a bunch of rich billionaires who don't care about human beings and who, you know, just want to make more money. My reporting just shows that that's, it's a more interesting story than that. They may be completely wrong, but even if that was that was.
Starting point is 00:20:34 the case. Even if it turns out in 10, 20 years, you listened back to the last invention and none of their predictions came true. I will be really proud of this piece of work because it is an honest reflection of what people think and believe right now. And what people think and believe is what shapes the world. Like, there's nothing more powerful in the world than what we believe. And like, looking at the stock market and how much investments going into this, that is belief. That is people saying, I believe that they could pull this off. And that's an interesting and important story. And I think that it would be wrong to just say, eh, you know, those guys.
Starting point is 00:21:17 And also, it could be catastrophic if it turns out that they are right. I'm curious, Andy, in the course of this conversation, and this happened, this came up when we worked with you guys on this initial piece when the episode first came out, this distinction between AI and AGI. which I think, you know, get used interchangeably conflated in various ways when they're really different, or my understanding at least is like in the technical sense to people understand this stuff, there is an important distinction. I'm wondering for our audience if you could just kind of contextualize how we should think about those two things and like the, you know, chat GPT and what you're using on your computer and like an AI robot to get customer service versus.
Starting point is 00:22:04 like the kind of technology that the doomers are worried about or the kind of technology that like the accelerationists are trying to barrel us towards like what's the what's the distinction there i mean the shorthand fun way if you just like want a way to remember the difference between a i aGI and as i is just think algorithm species god right like this is like this is like AI, we've been interacting with it all the time. AI is going to recommend the podcast we are recording right now to people who don't know who we are but might be interested in this when they go to YouTube or whatever, right?
Starting point is 00:22:48 That's AI to a certain extent. And the chat bot that we're engaging with right now, that is an AI product that is created, that's made instead of out of the algorithms that have been choosing your social media feeds, it's been created out of this LLM model, all these large language models. And it's an amazing product.
Starting point is 00:23:08 It's really remarkable. It's shocked. Its capabilities have shocked the very people who created it, right? It's a hell of a story as a reporter to cover the LLMs and the chatbots. But they, in some ways, are a product. The thing, the AGI, this benchmark
Starting point is 00:23:25 that they're shooting for, this is something that the name AGI even, it was popularized by one of the founders of Deep Mind, this guy, leg. And the way that they're thinking of it is that don't think of it as a revolutionary new tech product. When we get to AGI, we are going to have something that's so profound. I mean, a lot of them use language like the singularity. They'll talk about the event horizon. We just cannot know what's beyond that. That's the kind of language that you'll hear some of them use
Starting point is 00:24:04 talking about AGI. And the way I like to think of it is just an AGI, instead of it just being a smart calculator or, like you were saying earlier, it makes a cool image or it's like this cool product. The AGI is essentially like having an incredibly intelligent person who never sleeps, who can learn anything. And having them 24-7 or having thousands of them 24-7, working on any problem that you want them to work on.
Starting point is 00:24:37 And so at AGI level, that's when it's not that it replaces some secretaries. It's not that it becomes a different thing when you call customer service, right? That's like a product, that AI service. When they hit AGI, they think that it will be a beginning, the beginning of a new era where we no longer have a labor market as it's existed since the Industrial Revolution. because everything just becomes how do we take the work that is currently being done by humans
Starting point is 00:25:07 who, you know, like to take off holidays and sometimes get home over and don't do a good job? How do we off-shoot everything a human is doing to the AGI? And that won't happen in a day. Like, that's going to take time, but that will be the beginning of a new era
Starting point is 00:25:23 is that AGI is like a really, really smart person inside of this system. And people think that it will be kind of like a new species because of that. And then ASI is the third one, artificial superintelligence. It's one that, you know, has a lot more skeptics. But this is the belief that if we were to create that AGI, it would, one of the jobs it would take, essentially, is the job of making the next version of AGI.
Starting point is 00:25:52 And once the AGIs start building more and more and more intelligent systems, they think they'll be what they often call an intelligence explosion. where it may be weeks, it may be months, it may be years, it may be decades, but they believe that AGI would essentially become a artificial superintelligence and that it may, at that point, be beyond our understanding. We may not even understand the language it uses, it's interests, you know, it's understanding of the universe. And that's definitely feels a little bit more sci-fi, even to some of the people who
Starting point is 00:26:30 really buy the AGI, but a surprising amount of them believe in it. And Facebook's AI lab is called the super intelligence lab. Like that's what they are saying that's, we want to brand ourselves on that kind of an ambition. And so it is something to take seriously, even if it sounds a little wacky to us. I mean, I think one of the fundamental challenges that I have with the way that the conversation proceeds around this issue is that. the fact that humans have to be in the loop today for most AI applications with the LM certainly, my expectation is that whether we're talking AGI
Starting point is 00:27:10 or even super intelligence, you'll have competing models in a similar sort of way that we have competing models today, probably more of them. And again, you'd still want humans in the loop. To the extent that continues to be the case, it seems to me that the really optimistic perspective, and I would actually categorize
Starting point is 00:27:30 like the doomers, the scouts, and the accelerationist as a kind of optimist because they all imagine that this extraordinary accomplishment is achievable. In most cases, in a matter of years and not decades or centuries from their consensus opinion. Right. And in some ways they have more in common with each other than they have with the average person. But that's exactly why I wanted to make them front and center in this podcast is because that's who's making this technology
Starting point is 00:28:02 are the people who really believe in it and they're having a debate while the rest of us are often being like oh, this chatbot is leading to a cheating scandal in our high school. That's an important thing to know but let's not get so distracted by the cheating in high school thing that we aren't aware of the conversations and debates happening
Starting point is 00:28:23 among the people who are making it, whether they're right or wrong, you know. But you're saying, so maybe I'll form this into a question. You are as convinced or perhaps more convinced that they're likely to achieve their agreed upon end, even though there are plenty of people who are skeptical that even the current approaches are the ways
Starting point is 00:28:45 that they'll get to AGI. There's a lot of people who doubt that the LOMs are actually the right step forward in order to actually achieve these ends, in which case there's a whole other technology that needs to be discovered. You're perhaps already at the limits of what the LLMs can potentially do.
Starting point is 00:29:02 Well, I've interviewed a few prominent skeptics, including this guy, Gary Marcus, and we're going to air that interview here in a couple weeks on the feed as like a follow-up continuing coverage on the story. But even Gary Marcus, who doesn't think LLMs are all that impressive and doesn't think at all that an LLM is going to lead to AGI, when I asked him if he believed that AGI was impossible, He said, absolutely.
Starting point is 00:29:30 And I said, do you think it's likely to come in the next 10 years? He was like, no, I mean, maybe 20. And I was like, hang on, maybe 20. 20's still really soon, sir. It's really soon. So even the biggest skeptic out there who's like going around trashing the LLMs and saying we're not going to get there by LLMs alone, he agrees that there have been massive, unexpected leaps forward in the field
Starting point is 00:29:55 and that this goal they have of creating HGI. is likely enough that he wants to continue to invest in how we would bring it about. So once again, Isaac, you are the more of the doomerish person here. My question for maybe you and for Camille is just the part of it that I now struggle with, like, my last vestige of skepticism is along the lines that it's going to lead to human extinction. and I did full disclosure on this podcast,
Starting point is 00:30:28 I used to be a fundamentalist Christian and it was a beautiful and a terrible thing to be, you know? And since leaving that fundamentalism, life has been more interesting and complicated and I'm so committed to not being a fundamentalist anything anymore. But I have to like, I have to reckon with the fact that I have a knee-jerk reaction against apocalyptic talk Because I used to be so convinced in a specific kind of apocalypse that was near that I do have this like, ugh, apocalypse, come on, guys.
Starting point is 00:31:05 You know, like, I'm back here again. What the hell? So I struggle with that. Is your doomers, doom reviews, is it about the ASI that takes over? Or is it more, like I talked to Ezra Klein about this yesterday, also another follow-up episode that we're working on to come out with the series. Ezra's just really worried about and all this Huxley like dystopian future that is coming in the next five to ten years if we continue
Starting point is 00:31:34 to just zombie walk our way into incorporating AI in our society, the way that we somewhat have zombie walked our way into social media. And I find that to be a far more compelling case to be worried about because I already feel like
Starting point is 00:31:50 I'm in an apocalyptic sci-fi movie when I go out in public, when I go to a concert. And I see these, it looks like insanity, the people who are staring into their phones appearing to be in some kind of hypnotic state and engaging in like short attention stealing videos that if you ask them, you know, 24 hours later,
Starting point is 00:32:13 hey, what video did you watch at 3 o'clock yesterday? They don't even know what they saw. They went into like a fugue state, right? And I can see that if we pump a bunch of AI products into that environment, we might find ourselves in a world where we are distracted, we are entertained,
Starting point is 00:32:30 and we are handing over the most valuable parts of the human experience to a machine in exchange for, like, dopamine and, you know, erections, you know, because porn's a huge part of it. Yeah.
Starting point is 00:32:44 No, I have an answer to that. I mean, so I'll say, I'll frame this, I'll say this up top first. I mean, I was in high school, like, reading Ray Kurzweil, the singularity, whatever. Amazing. Like, that was formative for me, on my views about this stuff. And so I was very captivated by him and the people that kind of, I mean, I think he is like a singular figure in terms of the futurism, and the people who are writing and talking and thinking about what's coming.
Starting point is 00:33:19 And, you know, a lot of the stuff that he talked about back then, 20, 30 years ago, whatever, has come to fruition in certain ways, but a lot of it has been further off than he expected. And so I think I was sort of in the mania. I wasn't a fundamentalist Christian, but I was, like, I was subscribed to that. Like, this is the smartest guy in the world and he's telling us what's coming, and we are, nobody gives a shit or is paying attention. And then I sort of saw a lot of, like, him be fundamentally wrong about some things. Right, nanotechnology especially, right?
Starting point is 00:34:00 He was so big on. Yeah, which, right, which, like, gave me this skepticism about kind of just this whole field of, like, futurism thought. So that's, like, something prior that I came into this with. I would say I am maybe not a doomer because I do have some skepticism. Like, the anti, the non-doomer quality that I have is some skepticism about how close we are to this stuff. Like, um, I mean, I've said this before, and I fully recognize the stuff you said, like, ChatGPT is the product and it's not the technology and whatever, but it's like, I use this stuff fairly regularly and it still sucks so much that I'm just like,
Starting point is 00:34:42 like, I don't feel threatened by it quite yet. I feel like I can wrap my head around it. I can fool it. I can get it to do what I want. And I know we're in the early stages, but I think, I can't quite put my finger on it, but there's sort of this, like, future I imagine where the AI is kind of smoking the AI, and it's just like, you know, sitting in the garage with its car on, inhaling its own exhaust. And there's something broken about that ecosystem. Like, I just don't know how that's going to produce this incredible product. We still feel, the humans still feel so core to everything that it's doing. So I just don't really understand how that part of it's going to play out. And I subscribe a little bit to Ezra's view also. Like, you know, I can't remember, there's a movie that Matt Damon is in.
Starting point is 00:35:39 I know the one. I'll look it up. One second. Yeah, there's a famous scene where he's like, he has a parking ticket and he's in line and he's talking to the robot and he's trying to like... Alicia. He has some sort of... Yeah, Elysium, yeah. And he's like...
Starting point is 00:35:52 And the bot is just stuck in this loop and it's like the future where robots are the police and he's like, no, like, he had this very human explanation that the robot can't wrap its head around for why he made the mistake he made and he can't get it to see it. And it's like, that to me is the dystopian future. Like that is... Which is sort of in line with Ezra's view.
Starting point is 00:36:10 Like, I imagine that world where we have these things that are just kind of shitty at what they do, and we're just stuck with them, and that's awful and infuriating, and we can't get it to do the thing we want. Um, what your show introduced me to from the doomer perspective, which was a view that I did not hold, or like an idea that had not occurred to me, I mean, it introduced me to a million ideas, but the one that really stuck with me was the doomer perspective of: if there is a five percent chance that advancing this technology ends with humanity being destroyed, what the fuck are we doing? Like, stop. And I totally agree with that. Like, I don't want to take a one-in-twenty gamble on that. Like, I'm a recreational gambler, I'll say.
Starting point is 00:36:58 My FanDuel account is active on Sundays during football season. I've hit some one-in-twenty bets before. Like, I have no interest in those stakes on everything dies and we get destroyed by some artificial intelligence. So that was an argument where, the first time I heard it, I was like, oh, yeah, definitely with those guys. So I think that's kind of the whole of my doomer perspective, if I had to sum it up. Yeah, I mean, it is a compelling case they make.
Starting point is 00:37:30 In the podcast, Nate Soares says that you wouldn't feel satisfied with a bridge builder in your city who says, I'm going to invest a ton of money in this bridge project. But here's the thing: it's got a 5% to 10% chance that it kills the entire population of your city. You're like, I'm going to look for a different
Starting point is 00:37:52 bridge builder. There are better builders of bridges. I do think, though, that the accelerationists, they have this, and this gets to the accelerationists who don't like to be called accelerationists, the Dario Amodeis. Like it or not,
Starting point is 00:38:09 like no one asked, no one took a vote before they made the printing press. No one's going to take a vote here. It's one of those things where it's like, yes, but I don't want to do it. This is who we are. This is what we do.
Starting point is 00:38:27 We make nuclear bombs. That's a crazy thing to have made. And we made it. And I'm with you. I don't know if they can make AGI. I don't know if they can, right? I'm just saying I am far more convinced that they might be able to do it,
Starting point is 00:38:48 convinced enough that I'm trying to advocate: people, take it seriously enough to form your views and join the debate, because it could end up being the most consequential debate of our entire lives. They think it might be the most consequential debate in human history, and it shouldn't just be had by people in technology. You know, I think we all should join. I mean, there's this thing in the discourse about how dangerous it would be for China to be the ones who make AGI and why we have to go really fast. And there's a part of me that thinks, I see the logic in that completely, given the human rights abuses and all the like from the Chinese government.
Starting point is 00:39:31 It's frightening to think that they would have something with the power that these people believe AGI would have. But the whole thing about us building the AGI here in a country with civil liberties and freedoms is that we should use those freedoms, one of which is engaging in public debates and fighting and getting into the muck and getting our voices heard. I think that's what I'm advocating for. It's like, let's do that thing that free societies do: have the debate out. And to do that, you have to take it seriously. You can't just take a posture of, no way this is going to work. And I think, in some ways, that's the thing I'm advocating for with the podcast. It's not for any position in the three camps, but it is a strong advocacy to take it seriously and join in on the debate. All right, I have one last
Starting point is 00:40:28 question for you before we let you go. And, uh, I don't know, maybe we should all take a bong rip before I ask this question. It's a little bit of a... It's a little... Yeah, it's a little existential. I mean, and actually your comment about the fundamentalist Christian background, which obviously I know about you,
Starting point is 00:40:45 but it sort of, I think, teed this one up a little bit. I had this moment listening to your show where, I don't want to say I had an existential crisis, but I'll say I had some very
Starting point is 00:41:01 like, eight-dimensional existential thoughts that I had trouble wrapping my head around. And one of them was like, are we just building God, and this is our purpose, and this is what it's all about, and this is the thing that we're headed towards? And there's this, you just touched on it, like, there's this inevitability,
Starting point is 00:41:28 nobody's going to take a vote, we are going to do this. We are, like, hardwire-programmed to just keep advancing and keep advancing. And I know it sounds sort of silly on the surface, and I'm just like, it's a half-baked idea, I'm just sort of scratching at it. Like I said, bong rip. But I'm just like, why are we so drawn to this? And is there something, like, really deep inside of us that knows that this existence,
Starting point is 00:41:56 the things we're seeing? You know, it made me start thinking, like, oh my God, is this all a simulation? And we're just, like, we're working towards cracking the code, and that's the whole thing. And so my question to you is, you know, that's my sort of existential thought crisis I had a little bit listening to the show. I'm curious if you had any of those, like, if that came up for you in the course of reporting, where you're just, like, sitting on your back porch staring off at the horizon, like, holy shit, like this is blowing my mind a little bit. And I don't know if there's any, like, meat on the bone there for you. It's just, it feels like you could really go down a weird rabbit hole here.
Starting point is 00:42:35 Oh, yeah. At this moment, I could go down several different rabbit holes, especially because, I mean, as a fundamentalist Christian, you don't know how crazy, you don't know how crazy evolution sounded to me. It doesn't sound right. It's like, it's not intuitive that this complicated thing with all of its ambitions and feelings and anxiety is, you're like, well, it comes from the slow evolution over millions of years from the single-celled organism. I just didn't buy it. It's like there's no way that's real.
Starting point is 00:43:07 And sometimes when I hear people say now, there's no way we could make this thing. And there's definitely no way it could ever have consciousness. It does remind me a little bit of like, well, I used to not believe in evolution at all, you know. And I made fun of people who did, you know, when I was like a 12-year-old fundy Christian. So I definitely wonder sometimes.
Starting point is 00:43:28 Is that how you got fundy Christian? Is that like a slang term for you guys that we can use? Yeah, yeah, former fundies. You don't have any former fundies? No, I never heard that. The former fundies are a good bunch. I like them. We hang out with each other.
Starting point is 00:43:41 But I also think you're pointing to the fact that there are these kinds of new mythologies in technology forming around this. Like, one of them that a surprisingly large amount of people subscribe to is what's often called the worthy successor theory. And that's the idea, to your point, Isaac, some people, and a larger amount of them every year, are becoming convinced that we are a stop on the way to something better, that we are one stop on an evolution. And some of them think that our purpose has now been revealed to us. We are supposed to create an artificial new being that will do better than us, that will be better than us.
Starting point is 00:44:23 And they call it the worthy successor, something worthy of replacing us as we slowly die out, and it takes on not just ownership of this world but, slowly, the universe. And they think that that will be our legacy, that we will be remembered as the creators of the thing, that it was the good, that it would be the best thing we ever make. And then there's this other group of people, this is less popularly subscribed to, but they believe in what's called the great filter. Are you guys familiar with this theory, the great filter? No. Yes. This is a belief that some people have when it comes to the question of, like, where are all the aliens? Why aren't there, there are all these planets we're discovering, where are all the other
Starting point is 00:45:05 intelligent life forms? And some people believe that we are going to, like, what this AGI moment is, that we're running into a temptation that every other intelligent life form eventually got to. And if you choose to make the AGI, you choose to end your intelligent civilization and set your planet back. You know, they think, it's a wild theory, but it's interesting to think about. I thought you were going to say, if you make the AGI, you, like, break through and all of a sudden we'll be able to communicate with all the other intelligent life forms out there. No, that's Her.
Starting point is 00:45:40 Yeah. Yeah. And then the last thing I'll say is that you talked about the simulation. It's not surprising that some of the early believers in superintelligence are the people who popularized this idea that we are currently living in a simulation. Nick Bostrom, whose book Superintelligence came out in 2014, he's essentially the reason that a lot of people in Silicon Valley have come to take seriously the idea that we're in a simulation now. So there's an enormous amount of overlap if you really start getting down deep into this.
Starting point is 00:46:13 And it can sound a little out there. I mean, Nick Bostrom, one of the things that he thinks about a lot is what rights we need to be preparing to give AGI, because he doesn't want to see us enslave it. And then he's thinking also about, if our superintelligence is created, it may want to communicate
Starting point is 00:46:36 with superintelligences in other universes. This is where you end up going if you go deeper and deeper into this world. So Isaac, if you want to take that bong rip, that's, you know, that's the ride
Starting point is 00:46:52 right there. I mean, yeah, well, uh... Go ahead, Kmele. No, I was just going to say, they're all fascinating conversations. Yeah. And there's a probability that at some point we will have some sort of successor. I don't know, what is our relationship to our predecessors? Some of them are very different from us. Some of them are different species than us.
Starting point is 00:47:10 If we accept evolutionary theory. That crazy old theory. This gaggle of heathens here. Then there's something that will come after us. And maybe it is the software that we will build for ourselves. But it also seems like, I just, I don't have to accept those ideas to take this conversation seriously. I do know that there is going to be increasingly more sophisticated
Starting point is 00:47:30 and powerful technology that will make people super-empowered individuals and give them the capacity and the capability to do all manner of things that were just unimaginable for a single person to do before. I think part of the challenge, though, and the difficulty for Bostrom et al., and I certainly felt this way about the simulation theory in general, even multiverse theory, is I don't really know what the point is. It doesn't quite explain anything. It's just simulations on top of simulations. Okay, so what's at the bottom of that? More simulations? I don't know. I feel like there's a dimension of that in the AGI debates as well. And, you know, even the probability that there might be some species-level extinction event. There's so many
Starting point is 00:48:18 technologies that we are screwing around with now that one could imagine a story of it leading to a species-level extinction event. And I think the one kind of green shoot I would offer you, Isaac, and maybe it's very little comfort, is that all of these probability statements are fake. Like, none of them can be taken seriously.
Starting point is 00:48:38 2%, 5%, 20%, 50%, like, this is make-believe. It is weird when they get into the numbers. I'm always like, hang on, what about the Doomsday Clock, right? Yeah, it's actually... Until midnight. Yeah, 52.65, actually, is what it is.
Starting point is 00:48:52 That's fine. The Doomsday Clock is a really good metaphor. It's not, it's not like precise. But they have thought about it a lot, right? It isn't that they're just throwing it out there, but it is also the Doomsday Clock. Right, and we have to live with the fact that we're always
Starting point is 00:49:08 under the specter of death. Well, that's how I like to end my podcast. Yeah, yeah. So. All right. Well, Andy Mills, the show is The Last Invention. You can find a write-up
Starting point is 00:49:29 of it on our website, readtangle.com. Andy is a friend of the show. I highly, highly recommend the podcast. It's been an awesome listen. Andy, thanks for giving us an hour of your time today, man. Really appreciate it. Well, thanks for having me, fellas. We'll be right back after this quick break. All right, gentlemen, we've got Andy out of the studio. Smart guy, man. I mean, you know, he was saying he doesn't envy us being in the day-to-day grind of the news, which I respect. I'm not sure I envy being down the rabbit hole of artificial intelligence, which seems exceedingly scary
Starting point is 00:50:19 and somewhat dark to me, though he's handling it incredibly well. That guy is sharp and seeing things clearly right now in a space where I don't really know what to make of anything. Mills is an exceptionally good journalist. He's a great reporter, excellent storyteller, and always tends to be pretty level-headed. So to have him be inspired to take a fresh look at this emerging, important conversation, one that is being chewed over in a bunch of strange ways,
Starting point is 00:50:45 was eminently refreshing. I remain somewhat skeptical of certain dimensions of the conversation, and I think I am pretty bullish on the tech broadly. But I liked hearing the doomers' perspectives, even if I disagree with you people, Ari and Isaac, since I suspect you're both kind of doomers of a sort. Ari, would you call yourself a doomer? I don't think I would.
Starting point is 00:51:13 I think I'm more doubtful about the horizon between now and general intelligence. My background is just that I had a sojourn in an academic area that was trying to measure intelligence in people. And I know how difficult that is. So I'm really skeptical that, while we're trying to solve the problem of measuring intelligence in people, we're going to invent it to a higher degree in something else. I actually think the interesting thing, the superintelligence that Andy's saying is far off and sounds science-fictiony, I think it's almost the foregone conclusion after you get to general intelligence. I think once you create something that can mimic human intelligence and you create it, as he said, a thousand times, and it doesn't sleep and it can learn anything and get smarter,
Starting point is 00:52:03 how does that not accelerate to an intelligence we can't define? Like, that's almost like a logical foregone conclusion to me. I think, um, I don't know if it's species-level existential, we're-going-to-die stuff; I think all of our problems are kind of structural. And I thought his point, that we have a lot of technologies around us that could be existential anyway, is well-made. And I think, as I know Kmele, you and I have discussed,
Starting point is 00:52:35 the whole title of The Last Invention is a reference to an Asimov short called The Last Question, where humanity creates a superintelligence and they ask it the question of how do you reverse entropy. And, spoilers for the short story, but it can't answer the question; it says it has insufficient data
Starting point is 00:52:55 for millennia and then millions of years, until all that's left is some ephemeral AI and some ephemeral human consciousness, and the AI goes, okay, I get it now, and then creates the universe anew, and we end up creating God. I don't, like, that's kind of a,
Starting point is 00:53:11 to your bong rip question, Isaac, kind of a story that's been in the back of my head forever. Now that I have bookshelves up, I have this pair of books from Octavia Butler about, like, a future where humans are sort of in an apocalypse of their own making, and one of the main characters wants to create a religion
Starting point is 00:53:33 where they're like, we need to go to space while we're also trying to survive and eat and grow crops for each other, and calls the religion Earthseed. I do think there's, like, a part of our existence that's like, we have to create something larger than us, and we have to try to take on this larger problem of, like, how the universe is going to die someday.
Starting point is 00:53:54 We have to prevent it and we can't do it ourselves. We have to create a superintelligence. This seems like the timeline for me. Like, if I'm anything in those camps that he was discussing, it's just I completely doubt this horizon of general intelligence being 5, 10, 20 years out. I definitely don't think that's going to happen, in my opinion. But it is really compelling to think about those arguments,
Starting point is 00:54:14 the way they connect to these broader ideas that are challenging and pressing to think about. Ari, I should have known that you were reading Octavia Butler, but, man. Four years ago, read it. But yeah. Huge fan, dude. God is Change, my brother.
Starting point is 00:54:28 She has, I mean, one of the most banger lines of all time that just totally changed my perspective on existence and the universe. I love her stuff. There's a lot of news to talk about. So before we fall back into this rabbit hole of dystopian futures and Octavia Butler, which we should really do sometime, we'll put that on the book club list. We got a...
Starting point is 00:54:53 Let's talk about a dystopian present. Yeah, I'll tell you about a dystopian present. I spent today just absolutely hammering a European Union regulatory enforcement agency that has fined X for a bunch of things that I find a little bit far-reaching, in the government-overreach category, in the larger context of what I think is a very obvious backslide of free speech in Europe. And there's a lot of things happening in places like Germany, the United Kingdom, and France that I find deeply alarming. It makes me really grateful to live in the United States and not in Europe. No offense to our European listeners,
Starting point is 00:55:43 trust me, we got plenty of our own problems. But on this specific speech issue, there's stuff happening there that I find really scary. And then we publish this newsletter. I record the podcast. It goes out. And then I get on to Twitter a couple hours after the newsletter is out. And I encounter a headline, which actually dropped before I recorded the show and before the newsletter was out, which I regret, because if I had seen it before,
Starting point is 00:56:14 I would have addressed it. I'm going to read this headline from NBC News. Foreign tourists could be required to disclose five years of social media histories under Trump administration plans. The Customs and Border Protection proposal would apply to travelers from more than 40 countries. And in the article, it says,
Starting point is 00:56:33 the Trump administration plans to require travelers from more than 40 countries to provide their social media histories from the last five years in order to enter the United States. The data would be mandatory for new entrants to the U.S. who hail from 42 countries
Starting point is 00:56:45 part of the Visa Waiver Program. This is according to the notice from CBP. I don't... This is insane to me. I suppose there's a... I mean, I presume there's a national security underpinning to the justification here. It's like, I spent all morning just hammering this idea that I'm glad we're us and not
Starting point is 00:57:14 them. And then there's this measure that this current administration is considering that to me feels almost equally as draconian as some of the stuff that we're seeing in Europe, where, you know, people are being arrested for insulting politicians on the internet. People are having, you know, broad hate speech laws applied to their conduct online and going to jail for what I think are essentially mean, or sometimes racist or bigoted, tweets, but things that people should not be going to jail for. I'm curious how you guys feel about this, if you've been tracking this story, what you make of this headline. I felt a little bit
Starting point is 00:57:53 foolish when I saw it, like, oh, I just did this whole song and dance about how great things are here compared to Europe and how glad I am that we're us and they are them. And then I ran headfirst into this example of, like, oh, we're also doing a shit job of handling this right now. Maybe you ran headfirst, but you had a helmet on. And that helmet's name was Senior Editor Will Kaback, who went out on a limb for you in a staff dissent and said, I don't really think that it's all that feasible that the United States is going to pass laws that restrict speech in a way that's similar to or inspired by what Europe's doing. And, yeah, poor Will. I mean, we've given him
Starting point is 00:58:32 plenty of credit before when he said things that became immediately prescient. In this case, it happens to us all. Happens to the best of us. But the way that I read your take was that this is something not that, like, the U.S. does great and we're thumping our chest about it, but actually had within it this nugget of warning of, we need to be on the lookout for the way free speech laws are being applied on the European continent, because it's possible that they could inspire things in the U.S., which Will is pushing back on. And I have, of course, arguments in other areas to discuss
Starting point is 00:59:08 with you on this take. But in this case, in this notion about U.S. speech restrictions, it does seem like it was a little prescient. It is a little different. Obviously, it's not going to apply to citizens. It has nothing to do with you being arrested for speech and curtailing rights that are detailed in founding documents. But it does restrict freedom of movement for people that wish to enter the country and, in a way, invites questions that are quite similar about what the government has the right to restrict and police, and what kind of speech are we drawing a circle around and saying anything outside of this is not acceptable, and also we're giving the circle-drawing
Starting point is 00:59:54 authority to a larger government. And that is a similar thing, and I think, if there's anything that comes out of your piece, like, this is the thing I'm screaming about, it's the fact that you don't want governments to do this, and here's why. And that, if anything, makes it seem somewhat apt and prescient. So there's your helmet for you as you go headfirst into that wall. Yeah, that's fair. I mean, Isaac, you used the phrase, um, backsliding
Starting point is 01:00:33 between the United States and Europe and always have been when it comes to various things related to speech, certainly even culturally. They're acclimated to and even expect and perhaps celebrate the degree to which they have different approaches to dealing with bigoted speech and censorship
Starting point is 01:00:53 with respect to their relationship with the state. So in that respect, I think the American project is very different. And we have this kind of foundational, dispositional approach to free speech that makes us very wary of encroachments upon people's liberty in that area. Even the sensitivity with respect to people entering the country having to submit their social media histories to extra scrutiny is something that Americans, like, hear and they scratch their heads.
Starting point is 01:01:28 And I think that's true for Americans of pretty much all backgrounds. It also seems true to me that, one, this just seems like it would be hard to do, and scrubbing it in any sort of efficient way seems kind of ridiculous. To take off my civil libertarian hat for a moment and ask: God, is that really the best way to actually protect the homeland from a potentially dangerous lunatic? That's the first question. The second thing is, you don't really need specialized permission to do some of this scrubbing and searching if, in fact, people are applying to come here for visas legally. You could just do this stuff on your own.
Starting point is 01:02:05 There's a lot of this information is public. If someone is actually likely to be a threat and their online behavior is going to be indicative of that, there are probably ways to find that sort of stuff out anyways. So this has the whiff of the sort of, sort of thing that someone in the Trump administration is kind of actively thinking about and perhaps hasn't really given much thought to how to operationalize. But at the same time is also indicative of how the Trump administration has been
Starting point is 01:02:34 kind of schizophrenic on free speech issues. I distinctly remember J.D. Vance going over to Europe giving this rousing speech and having extended aside about Europe's insufficiently robust protections of free speech And at the same time, this is an administration that has gone after journalists in various instances, has filed lawsuits against various media organizations for the things that they're publishing when they're remotely skeptical of what the president is doing. Up to today when he is posting things suggesting that it is seditious for media organizations to ask questions about his health, for example. So, you know, I think we've become somewhat accustomed to that from the head. I guess my question is, like, is it schizophrenic or is there, are they just giving lip service to this and everything they're doing on the action side is not really, you know, free speech oriented?
Starting point is 01:03:34 I mean, they certainly talk a lot about it, but I don't know that I've seen any sort of policy, legislative, executive order, oriented stuff that is made. free speech in America more robust. To the degree that they've improved free speech, it's like, you know, saying offensive things is more normalized now and people have less, you know, knee-jerk, cancel culture-type reactions to it. Which, by the way, I think is mostly good.
Starting point is 01:04:10 I think we should still have some taboos. But, like, I think like... But it's cultural. But it's cultural. And, like, it's also gone too far in the other direction in a lot of instances, where, like, you know, the USDA, like, Department of Homeland Security Twitter account is posting, like, immigrants crying as they get handcuffed and deported, as like, yeah, we own the libs, where it's like, oh, this is gross.
Starting point is 01:04:37 Like, I don't want this either. Like, I don't like the, like, Latinx crap, but I also hate this, too. Like, you know, just, can't we just not do either of these things? You know, so I don't know. I think Trump, low-key, may be barreling towards this kind of Bush-era, post-9/11 security state expansion. I mean, all of this is underpinned by, you know, the National Guard shooting that was done by an Afghan national, by the Somali fraud stuff. I mean, these are the sorts of things that they're using to justify this stuff.
Starting point is 01:05:27 Like, he is doing this to keep a very specific kind of person, immigrant, out of the United States. It has all the flavors of the kind of post-9/11 mania, except instead of the 9/11 event, we have the mass migration of the Biden era as, like, the justification for this. And, yeah, I mean, I think an underrated part of this, I don't know if they'll go through with this. To your point, I don't know how, functionally, I mean, can you imagine going through the airport if, like, the person in front of you has to wait five hours
Starting point is 01:06:06 for them to go through five years of social media? Like, I don't understand how this is going to work. Yeah. But generally, I'd expect you to apply for your visa, et cetera, before you get to the airport. So hopefully that won't stall things. Right. But what it seems like they're saying is this will apply to people even who aren't coming here on visas.
Starting point is 01:06:25 I mean, that's what the NBC report is saying. It's like, even tourists who are just coming here and don't require visas to enter the country are going to face this kind of scrutiny. I'll tell you a quick story, and then just to tee up, like, I think maybe an underrated point of all this. I think I actually might have told this story on the podcast. But I was in New York City, I don't know, maybe six months ago, eight months ago. And I went and got my haircut by a barber in Times Square. As, you know, any self-respecting New Yorker will tell you, Times Square is a hellscape.
Starting point is 01:07:03 If you live in New York, you never want to be there. We had a big conference there for Tangle, so I was in Times Square, and I got my haircut. This guy had had a barbershop in Times Square for like 20 years or something. And he said to me when I was there, he said, the last couple months have been the worst business we've had since I opened this barbershop. There used to be lines out the door, because a lot of the people who come here, because they're staying in Times Square, are tourists in the city. It's like 20% of my customer base and they've
Starting point is 01:07:32 totally disappeared. And all the shops up and down this strip in this area that are selling mostly to tourists in Times Square, you know, foreign tourists and American tourists, are just like, you know, business sucks right now. And a lot of it is because foreign tourists, back then, were already really scared of coming to the United States. And I think a very underrated part of this story that's going to be important is that our economy is built on many things.
Starting point is 01:08:02 And one of the things is tourism. A lot of American cities live and die on foreigners coming to those places, or sometimes other Americans, but oftentimes foreigners coming to those places for the experience. You know, whether it's Disneyland or New York City or Los Angeles or whatever else, like, there are foreign tourists. I live in Philadelphia, which I don't think a lot of people would even think of as, like, a major touristy destination. And there are foreign tourists here all the time to go to the Liberty Bell and go see the Constitution.
Starting point is 01:08:37 It's popular. It's like a thing to do. And if I was living in another country, I mean, I don't even have to imagine living in another country: if London was like, yeah, in order to get into England, you have to show me five years of your social media on your phone, I would be like, double middle finger, no thanks, I'm going to Paris, you know. So if this is something they actually pursue, I imagine it's going to have a huge, huge impact on the tourist industry in the United States.
Starting point is 01:09:08 I don't know how it couldn't. Like, I don't know what kind of person would come here and submit themselves to this sort of thing just for a vacation to New York City or New Orleans or whatever else. So what do we think? And it's not happening in a vacuum either. Yeah. I mean, it all makes a ton of sense
Starting point is 01:09:28 to ask how does it work? What are the ramifications going to be? And then when the detailed implementation is difficult and the ramifications are farther reaching than expected, like what's the backstep? Like, do we think the administration's
Starting point is 01:09:44 going to go, well, what we meant was... and then say this is just going to apply to visa applications, but still, we want you to be careful. Is there going to be a clarification of this? I can read you some of the reporting that we have. So, in addition to social media histories, Customs and Border Protection would add
Starting point is 01:10:00 other new data collection fields, including email addresses, telephone numbers used in the last five years, as well as the addresses and names of family members. The Department of Homeland Security spokesman said the proposal is not final. It's not clear how applicants would be required to provide their social media.
Starting point is 01:10:17 The U.S. public now has 60 days to comment on the proposal, the Federal Register notice reads. So this is something that was submitted. And this was a detail that's also important: the U.S. is hosting the FIFA World Cup next year. Oh, right. And then the Olympics are coming up. Yeah, which are going to draw fans from around the world, including from the UK and other countries where visitors do not require visas.
Starting point is 01:10:38 A Trump administration official told NBC News that while World Cup ticket holders may be fast-tracked, they would still be subject to the same requirements as other travelers. So, I mean, the administration's acknowledging this plan exists. Their spokespeople from the DHS are saying it exists. They haven't finalized it yet. But, I mean, it seems like they're at least considering doing this, which is just mind-boggling to me. But do we think the recipient of the first-ever FIFA World Peace Prize is actually going to keep people out of the country over this?
Starting point is 01:11:14 The answer to that is absolutely, he certainly is, right? He's already making those efforts. I mentioned a moment ago that it's not happening in a vacuum. And as you were doing your thoughtful analysis, Isaac, of just the implications of this in terms of other people's, foreign appraisals of who and what the United States is and whether or not it's a place that they kind of long to visit anymore, it's impossible for me not to think about
Starting point is 01:11:40 the conflict in Ukraine and the degree to which the Trump administration is applying pretty intense pressure right now. I'm hopeful that they can get the Ukrainians to agree to a peace deal that they're less than enthusiastic about. And at the same time, the administration is hurling pretty searing criticism at their European partners. And that is almost certainly going to play into this dynamic in exactly the same sort of way, where it just changes attitudes about the United States, and not just about one administration. I think it probably takes more than an election cycle to sweep that away and make people begin
Starting point is 01:12:23 to feel differently, even in the event that someone wins who is kind of repudiating all of the prior administration's policies. I will say, though, that my expectation would be that if there were obvious horrible ramifications from a policy standpoint, in much the same way that with the tariff regime we were told early on, oh, this is it, it's going to be here to stay, and immediately, like, the dealmaking is happening behind the scenes, and they do everything they can to kind of delay, delay, delay, I would suspect that they will do something similar here. So them instituting a policy that they make a lot of noise about, only to walk away from that policy very shortly thereafter, not only can I imagine it, that's kind of what they do. We don't talk
Starting point is 01:13:07 about DOGE anymore. And early on, it was the only thing that we could talk about for about a month and a half. Yeah. I mean, this is like one of the eternal frustrations of covering this administration: you know, I know supporters of the president might listen to the first few minutes of this and imagine that there's, like, you know, some anti-Trump bias or partisanship. And it's like, I don't know what else to say. I literally drew lines in the sand, without knowing this story existed,
Starting point is 01:13:47 about the kinds of speech conduct that were acceptable and not acceptable. And then I read this story, and it's like, oh, this is very obviously on the wrong side of my principles. And I'd like to imagine that there are enough thoughtful people inside the administration that when this public comment period is over, whatever, they made their splashy headline, they're scaring off the people they don't want coming here who are going to have horrible things on their social media. I'd like to imagine that those people will win out, but I don't know. They've tried to do some of this already. I mean, we've had plenty of stories already about foreigners who have come to the United States and had their social media
Starting point is 01:14:33 scraped and been held in, you know, scraped isn't the right word, Ari will correct me on that, but had their social media examined and were held in detention for, you know, days or weeks at the airport or at some immigration facility because they had posted some mean stuff about the president of the United States on social media. I mean, this stuff's happening already. So I don't know. It worries me for sure. And I think it's, you know, I think it's representative of
Starting point is 01:15:08 just a general erosion of kind of the free speech culture globally. You know, we talk today in the newsletter and the podcast a lot about Europe. And I had a sentence or two alluding to like what was happening here and that there are concerns I have about the free speech culture in America. But I think both the left and the right now in the last five years have proven themselves perfectly capable of cravenly abandoning any sort of like
Starting point is 01:15:45 honest free speech principles when they have the power when they're wielding the sword and we're living in an era now where the right has the power and they are predictably like abandoning a lot of the free speech principles
Starting point is 01:16:01 that we heard them espousing when they were in the minority and it's really scary to me I mean, it is, I don't want to be cliche as like the journalists, you know, beating the drum for the, as like a speech warrior. But I do think that we need to, it's almost like I feel like we need to reteach or, you know, reinforce the idea in our country that people should be allowed to say whatever they want to a large degree publicly. without fear of government, overreach, and prosecution. And these are like very specific, you know, this isn't like nobody's accountable for their words.
Starting point is 01:16:48 That's not what this is. It's just like when we get to an era, when we get to a place where the government is saying, if you want to come to this country, we get to look at your social media and decide whether or not you're allowed here, that to me feels like a really dark, scary, dystopian place, even if you're somebody who's worried about national security or concerned
Starting point is 01:17:12 about the kinds of people who are coming here from foreign countries. It's just like, you crack that door open, and sooner or later it's everybody. I mean, we just had this guy, I think it was in Tennessee, I want to say, who was in jail for like 37 days for this post he made about Charlie Kirk on Twitter. I mean, it's happening to some degree here already in a way that I think is fairly alarming and scary, and there's, you know, not enough people kind of fighting the good fight. Kmele, I'm preaching to the choir here. I mean, you're on the board of FIRE, so, you know, I know you're in the trenches, but it's worrisome to me for sure. We'll be right back after this quick break.
Starting point is 01:18:14 I mean, can I throw out a question here? And I'm not even so much playing devil's advocate. People who support the administration, members of the administration, would almost certainly defend this policy in much the same way they've defended their approach to immigration. They would just say that people who would like to visit the country, people who are in the country on visas,
Starting point is 01:18:35 or certainly people who are here illegally, simply don't have the same catalog of rights that actual citizens do. In which case, scrutinizing their speech as the government, and policing their behavior as a result, and insisting, well, you're not allowed to come here because of the things that you said, or you have been singled out for special scrutiny on account of the things that you've said, is not a violation of the First Amendment.
Starting point is 01:19:03 This may be a violation of certain principles that we all hold dear, and certainly that I would advocate for forcefully. But there's also a dimension of this that's kind of practical, right? I mean, you're always on the lookout for prospective threats. You care about people's associations back home. And to the extent that someone does have dodgy associations, being on the lookout for that, and leveraging whatever tools are at your disposal as part of the national security apparatus to prevent bad people from coming here and doing bad things, that's kind of the job.
Starting point is 01:19:37 So the question becomes whether or not, if I'm going to put the best possible gloss on this, and I want to be fair to them, whether or not the administration is really taking seriously, I think the gauntlet that you've just kind of laid down there, that taking this step could very will be a first step towards degrading the rights of American citizens in a real practical and profound way. And if they aren't willing to engage with that question and to talk openly about how they insist that they will, in fact, safeguard people's freedoms, then that becomes a real problem and something that we should be talking. I'd have a lot more about. So hopefully they'll get some good questions about this before maybe, well, I'll go, I won't go there, but maybe
Starting point is 01:20:25 Yeah. I mean, I'd be curious here what Ari says about this, but I'll give my response really as briefly as I can first. I mean, A, I think it falls into a similar line as like the mass deportation stuff and the lack of due process where it's like if you say this is okay to do to people who are here illegally, then you're saying it's okay to do to everybody. Because. there's no, I mean, it's less of an issue here, I guess, but you open the door, it becomes very easy for an American to leave the country to travel back from Europe and just to be shuffled into some line where he has to open his phone and share his social media stuff. The other thing is, like, I would maybe be a little bit more receptive to this
Starting point is 01:21:18 if there were really clear lines being drawn. Like if the administration was saying, we're going to look through your phone and if we find that you've made any terroristic threats towards the United States in the last five years or your social media account
Starting point is 01:21:34 shows associations with designated terrorist groups in America then we're not going to let you in. If this was some clear thing like a parameter that's like black and white, did they have this or not? And if they did, we're not laying them in, that's fine.
Starting point is 01:21:49 What concerns me here is like, it's really easy to just imagine, say, a Swedish journalist who wants to come to the United States to do a story about Donald Trump. And I posted some critical things on social media about Donald Trump. And then we're saying, we're okay with the government, not allowing that person in. I'm not good with that. Like, that to me feels like a rejection of the fundamental principles that make the country what it is. The same way that, like, if I went to England, and like 10 years ago I had posted some nasty tweet
Starting point is 01:22:25 about the British PM. I don't think I should not be allowed to you know, my cousin gets married in London and I want to go to the wedding but I'm not allowed in because I said something mean about like we recognize this is like this is gross. There's nothing this isn't in like a free democratic society
Starting point is 01:22:41 we shouldn't accept that kind of thing. So that's how I would I'd worry about, I like think about the edge cases but I appreciate the point. I mean I appreciate the framing, like, and the pushback of there's a legitimate interest in monitoring who's coming into the country. And I obviously support that interest. I, you know, I've talked a lot about immigration on the show and in the newsletter. And I'm not like a, you know, quote-unquote,
Starting point is 01:23:09 open borders guy. I mean, I think we should, we need to be really cognizant of the people who are coming through, and I think there are national security interests at play. But the social media stuff to me seems like a bit of an overreach. I don't know, Ari, what do you think? I feel like there's definitional problems all the way down, no matter how you look at it. I think the three of us, by virtue of what we do, are all pretty open to debate offensive language
Starting point is 01:23:38 and statements in public that should be permissible legally, even if enforced by some cultural norm or standard that there's repercussions for. I personally am somebody who is very much against hate speech laws. Even as somebody whose, like, you know, extended family has been victim of violence that's perpetrated because of hate speech, I think laws against them always beg the question of, what do we mean?
Starting point is 01:24:05 How do you define hate speech? And I think that is, you offered this invitation to broaden the lens to Europe and internationally, globally, earlier, Isaac, and Europe is trying to regulate hate speech by fining X, which is one of the reasons why we're talking about this in the first place, because we covered that in the newsletter this week. And it invites the question of, okay, how do you define that, and who's defining it, and who do you enforce it against? And if you use that same set of questions on a U.S. policy of, okay, so you are traveling to the U.S. and we are looking at speech on your phone, anything you've posted from the last five years, who are we
Starting point is 01:24:49 looking at, how are we defining the people that we are trying to limit, and who's defining it? Because what if somebody comes into this airport line, hypothetically, and we're told this person's not American? How can we prove that? What about somebody who had a visa, or is in an immigration process already, getting denied because they left the country for, like, their family's wedding and is coming back? Is that person allowed? Who gets to define it? And then also, to your point of, like, you would feel better if there were some delineation about, okay, we want to screen for terroristic threats. Same question.
Starting point is 01:25:24 How do you define that? Clearly, this administration is very loose with how they use the word terrorist. Like, we're defining people we're targeting in boats off of Venezuela as narco-terrorists, and even if they are actually drug dealers, it's a bit of a stretch to say that's a terrorist. And then, if those are the people that we're letting define that term, then what other terms are they going to be able to define? It seems to me that, like, the best ethic to continue to defend is that we should be really, really conservative with the speech that we are limiting and giving any kind of legal repercussions to. Direct threats, doxing of personal addresses, even that, like, there's some blurry lines too. We talk about the Elon Musk jet Twitter account and stuff, but direct threats of personal violence and haranguing,
Starting point is 01:26:09 constant threats, stalking online. These are things for which there are clear delineations for me. I'm sure that somebody could come up with one or two more. But apart from that, the definitions around who is enforcing and defining these things is important. And even the idea of who it's being enforced against is important. Just as we're talking about, should Europe really be regulating an American company? Should America really be defining what people are allowed to say,
Starting point is 01:26:46 depending on where they're born, U.S. or not? Like, there's a bit of a no-true-Scotsman fallacy there. Like, if you're a true American, you can say what you want, but who's a true American? And are they defined based on what they're saying? Because if you're critical of the administration, you're not a true American anymore. We can sue you, you're fake news.
Starting point is 01:27:02 It starts to get slippery right away. All right. I know we're a little bit short on time because we had Andy in the studio today, and we've got a few minutes here. I want to get to one, quickly get to one other breaking news story that came up right before we hopped on the show to get your guys' initial reaction to it. And I'm sure we'll learn more in the coming hours. But apparently the United States just seized an oil tanker off the coast of Venezuela.
Starting point is 01:27:32 President Trump said this on Wednesday to a gaggle of reporters. He said, we've just seized a tanker on the coast of Venezuela, a large tanker, very large, largest one ever, actually. Wait, did he say that? Yeah, I can't. Of course he did. Yeah. Oh, God. I didn't hear it.
Starting point is 01:27:51 I didn't read it, but I mean, of course he did. Yes. Yeah. Asked what will happen with the oil, Trump said, we keep it, I guess. Why was it seized? Is there an explanation yet? It's too big.
Starting point is 01:28:06 The seizure, this is Reuters, saying the seizure could signal intensifying efforts to go after Venezuela's oil, the country's main source of revenue. I know this is serious. I'm sorry to laugh. I mean, he is the funniest president of all time. I mean, it's like how I imagine it if I ran the country, you know, like, well, what's going to happen?
Starting point is 01:28:27 And I'm like, I don't know. We keep it, I guess. I haven't really read into that. I guess it's ours. Marco, what do you think? Yeah, yeah. God. But it's significant because, as Reuters says, quote, it's the first
Starting point is 01:28:42 known action against an oil tanker since Trump ordered a massive U.S. military buildup in the region and carried out strikes against suspected drug vessels, operations that have raised concerns among Democratic lawmakers and legal experts. The U.S. Coast Guard conducted the operation. They did not name the tanker, which country's flag it was flying, or exactly where the interdiction took place. So we don't have a ton of information yet. I mean, it certainly seems like an escalation. I, you know, if you're in the MAGA world and you're watching this with concern that Trump's barreling towards, you know, a kind of Middle East-style catastrophe by starting a war in South America, I would say, like, seizing an oil tanker feels a little bit too
Starting point is 01:29:32 on the nose of just, are we going in there for natural resources? Is that really what this is all about? But, you know, I said this last week, I talked about it on the podcast a little bit, we talked about it on Suspension of the Rules, we have a video coming out on YouTube about this probably next week: I just feel like we're about to go to war with Venezuela. Like, it has the 2022 Russia-Ukraine vibes of just, like, okay, so we're building up the troops in this region, we're sending a bunch of our biggest ships out there, we have stories in the New York Times
Starting point is 01:30:12 about the CIA saying there's going to be a land invasion at some point, and we're trying to overthrow Maduro, and now we're just picking off oil tankers off the coast of Venezuela. Like, is there any reason to think what it looks like is happening isn't happening?
Starting point is 01:30:28 I mean, I just don't know if we just have our heads in the sand on this right now. I think one of the first questions I have for this is, truly, how unusual is it? The statement that I saw CNBC release from Pam Bondi said that the tanker was sanctioned by the U.S. for years,
Starting point is 01:30:51 quote, due to its involvement in an illicit oil shipping network supporting foreign terrorist organizations. And I'm going to try to avoid the bait about talking about terrorist organizations and how that can justify any response, because I'm constantly right about that. But if I look back for other instances of this happening: U.S. seizes tanker used to deliver oil to North Korea, July 2021.
Starting point is 01:31:16 Iran tanker, U.S. issues warrant to seize Grace 1 supertanker, August 2019. It's not to say this is a thing that happens extremely normally. Obviously, tensions are really fraught right now with Venezuela, but I don't know if it's something that's like, oh yeah, this is happening, now we're going in, we're taking the resources. Right.
Starting point is 01:31:41 I think it's something where, yes, we're going to pay more attention in the moment, but we can still go through the process of broadening our scope and slowing down and looking back and seeing if there's actually something to this, because there very well may be something to justify taking this tanker in this instance. Yeah, I would say the same. I mean, we just have to wait until we see more reporting.
Starting point is 01:32:00 It is the strangest thing to see the president proactively, preemptively, essentially comment on it, which is what it seems like happened here. They're having a different meeting, and he announces the seizure of this tanker as though it's something, oh great, we got one. So now everyone is just kind of scratching their heads, somewhat mystified by what's going on. And I saw the snippet of the reporting, Ari, about that, um, the history of this particular tanker, and also that it had been involved with, like, the Iranian oil trade. So, yeah, there may be a story here that helps to explain exactly what's going on.
Starting point is 01:32:37 But I do think that the context here is really important. And the fact that the administration is making this announcement in the context of talking about Venezuela is an indication of what they want you to think about, and the fact that they see this as a kind of further ratcheting up of the pressure that's being applied to Maduro to achieve whatever outcome they're looking to achieve there. So it's not inappropriate to speculate about what the implications might be, but I think you're right to qualify some of that concern. If there was going to be, well, not if there was going to be, I certainly don't know what's likely to happen, but there hasn't been a ground invasion yet.
Starting point is 01:33:14 There certainly could have been. So there's at least some reason why they're applying the brakes. I appreciate you guys' restraint. I think that we're going to war with Venezuela. I was going to say, we're kind of already at war with Venezuela. I mean, we're all pretty alarmed here. I don't think it's worth, like, us saying, like, oh, it's actually not a big deal. Yeah, I don't know how much worse it's going to get is perhaps my perspective on this.
Starting point is 01:33:39 All right. All right, well, gentlemen, I know we've got to wrap here, but we can't get out of here without a little grievances. So, Jon, hit the music, my friend. Let's get into it. The airing of grievances. Between you and me, I think your country is placing a lot of importance on shoe removal. The grievance for me this week, I think, is just simply, I dropped a shelf on my foot, and it hurt a good deal. I'm putting up these shelves behind me.
Starting point is 01:34:15 I'm trying to make our executive producer Jon Lall happy, and he's been bothering me, correctly, for months to try to make my background more compelling visually, so we can stream these things on video platforms. And I complied and put these really nice live-edge maple shelves up, which my wife and I stained and mounted ourselves. And as I was trying to hang some of the metals on one of the support braces, I didn't have the shelf nailed down, and I was trying to just, like, shimmy it in without taking the stuff off the shelf first, which was extremely dumb.
Starting point is 01:34:50 And then it just, like, slipped, and all the books started to slide off, and I was like, oh, shoot, and tried to, like, brace the shelf from falling, and it fell. And to save the floor, what I did, heroically, was I stuck my foot out to catch the edge of the shelf as it came down. I knew it was like 40 pounds, so I got the brunt of it right on my foot
Starting point is 01:35:11 and just, like, made an alarm noise, and then had to just, like, do the thing that you do when you're in pain where you're just in silence, just thinking about it for, like, a minute. My wife comes up, she's like, are you all right? And I'm like, pretty sure. Yep.
Starting point is 01:35:30 I think so. I'm just going to need a minute. And then, like, all right, it's all right. Nothing's broken, barely any bruising. But boy, that was dumb. As my dad likes to say, it's better to learn from somebody else's mistakes than your own. And I would just advise that if you're adjusting your shelves, take the things off of them first. Yeah, that's good.
Starting point is 01:35:53 That's good. I have to go next, because mine is so similar to yours. Interesting. I fell down my stairs this week. Oh, an old man or a young child thing to do. Yeah, and thank God I wasn't carrying my baby or anything. But I was in New York this weekend for this, like,
Starting point is 01:36:16 big PoNY frisbee alumni weekend. Pride of New York is this ultimate frisbee team I played for and coached for many years, PoNY, and we had this big alumni weekend. I partied all weekend and saw all these old friends, and I lost my voice, and it was super fun, and I didn't sleep a wink, and I was just running around the city like a golden retriever. So I was home on Sunday night, exhausted,
Starting point is 01:36:36 just, like, pushing through some work, and I was really tired. Like, you know the kind of tired where you, like, start to lose your faculties a little bit? And I was trying to take the garbage out and move, like, my book bag and my computer and a microphone and my water bottle all down into my office on the lower floor of my house, all at once. And I was going through the baby gate.
Starting point is 01:36:59 And, like, God, baby gates are definitely becoming a grievance for me. And, I don't know exactly what happened, but something happened, like, the garbage got caught, like, maybe, like, the string or something got caught in, like, the baby gate. And as I stepped down on the first step, just, like, things got kind of weird, and I, like, lost my balance, and I was trying to hold things, and I reached to grab the baby gate, like, as I felt myself falling down the stairs. And when I first grabbed it, it just, like, moved six inches and, like, sort of slid off the hinges that it was, like, pressured into. And then my foot got stuck between one of the bars of the baby gate. And I just
Starting point is 01:37:41 bit it down the stairs. And I literally did the, like, da-da-da-da-da-da-da-da, like, all the way down. Like, I got to the bottom of the stairs and just, like, was like, oh. And baby was upstairs, and she came running down, like, oh my god, are you okay? And I was like, I, like, did what you just described, just like, yeah, I think I'm fine. I'm, like, checking, like, I didn't hit my head, I'm, like, breathing, like, nothing's broken. Yeah, just, like, a mental inventory, in, like, silence, of just, like, oh my god, that hurts so badly. Um, so yeah, unlike you, though, I have huge, like, huge dark purple bruises, like, down my hip and my leg and my back from just, like, falling down the stairs.
Starting point is 01:38:29 So I've basically been just like every time I get up, I forget about, I stand up. I'm like, oh, like everything on the left side of my body hurts because I did something I've never done before, which is I fell down an entire flight of stairs. So I'm happy I'm not worse off and wasn't holding anything valuable, but definitely my grievance for the week and then was the thing I was I was coming in with. So we both had some like old man injuries kind of. I'm a little mystified by these grievances though, guys. Like what are, what are we upset about the fact that something happened? I think we're aggrieved at ourselves. I know. I wish I wasn't done. Your impatience is kind of the problem in both instances.
Starting point is 01:39:11 Like you should clear out the shop. I try to take everything down. I think I'd say. Yeah. Yeah, that's fair, Camille. I mean, you're a prick, but that's fair. I was going to say, I've hardly wanted to lecture. I still have, I have permanent nerve damage in my finger from nearly cutting it off a couple of months ago. I do remember that. I used as a grievance, by the way.
Starting point is 01:39:30 Yeah. Well, yeah, what was I aggrieved by? I guess it's just my idiocy. Yeah. My grievance is like, I'm in pain. Yes. I'm in pain. That's my grief. The whole... God-invented pain. Yeah, the whole left side of my body hurts, and that's my grievance.
Starting point is 01:39:46 Okay. All right. Well, I mean, my grievances that there aren't more flights and there aren't more convenient flight times. And I'm usually complaining about flying because I'm flying all the damn time and it's not fair. But I did go house hunting the other day. And I went out with the realtor. And it was actually the first time. And I know a lot of realtors. And I love the realtors who I know personally who are like family. But she was the best realtor. And that isn't much of a grievance. So I don't know. Maybe I don't have one this week. I'm sorry. To the extent things are wrong, it's the same things that are wrong. Wait, wait. You have a grievance. You have a grievance. I know what your grievance is because you told me about it on the phone. It was your experience going to touring some of these schools and getting like the Black Lives Matter.
Starting point is 01:40:36 I did get that once. But the other school that I told it, because I'm both looking at homes and schools. But one of the schools was very much still steeped in the same kind of cultural. craziness. One of the other schools that I spoke to, I shared some of my perspectives, complicating narratives around a lot of this stuff. And I was surprised to see, not only did she respond positively to my expression of concern that my children be seen as individuals and not avatars of blackness for the campus. And she said, yeah, we took the D out of our DEI program here because we realized that there is diversity and it is inherent in our.
Starting point is 01:41:16 population because we're all individuals. And that's, we, of course, we're aligned with that value. So I was thrilled to hear that in response, and it certainly beat the day before where I, where someone preemptively said to me when I asked a similar sort of question, well, it's not like we're indoctrinating them or anything, to which I could only think to myself, the lady doth protest too much. I got to say indoctrination. I asked you if you had racial affinity groups for kids in preschool and elementary school because you suggest that it might be possible that you would have kind of assertive programming
Starting point is 01:41:55 about identity, where you're kind of promoting certain kinds of values and ideas. And that was not a reassuring response. So, yeah, two very different responses at two different schools, which is to say that you should ask questions and sometimes you get great answers to your questions. So again, not quite a grievance.
Starting point is 01:42:14 Things are good. I think the first one, I think the first one was agree. I'm going to start taking notes about the things you complain to me during the week, and then I'll give you to assist when we get to the grief. I'll remind you the things you're aggrieved. No, not nearly enough. It's part of your problem. It's a huge issue with this segment.
Starting point is 01:42:32 You have to really, you need to figure that out. I don't know. By grievances with the grievances. A lot of the time, the segment is carried by Isaac's personal woes. And I love the fact that he uses that. a way of telling us that we're insufficient. It's kind of nice of it. You guys need to be more upset by minor stuff.
Starting point is 01:42:51 Yeah, there's, if you're, if you're, if you're, if you can't find one thing that really piss you off in the last week of your life, I don't think you're paying close enough attention. That's my first one. Untangling Christmas lights? Like, that's a constant thing we all have to do. That would take that. That is a great one. Good, that's a great one.
Starting point is 01:43:07 That's so much better than, oh, I went house hunting and had the best meal time ever had. That's not even close to, it's not. But I did, I did struggle with Christmas lights just a couple of days ago. But I worked through it with my children. Man has not created a shape. It turned into a beautiful experience. So I can't help it. Of course.
Starting point is 01:43:25 Of course it turned, it turned into a beautiful, all right. All right. This empty glass is half full. Yeah. I'm going to fire Camille before this podcast is ever. He's already on the board. Yeah. All right.
Starting point is 01:43:38 Gentlemen, appreciate you. Appreciate Andy Mills for stopping by the show. And Camille, we'll let you go catch one of those flights that there enough of and see you guys soon. Bye. Our executive editor and founder is me, Isaac Saul, and our executive producer is John Lowell. Today's episode was edited and engineered by Dewey Thomas. Our editorial staff is led by managing editor Ari Weitzman with senior editor Will Kayback and associate editors
Starting point is 01:44:04 Hunter Asperson, Audrey Moorhead, Bailey Saul, Lindsay Canuth, and Kendall White. Music for the podcast was produced by Diet 75. To learn more about Tangle and to sign him for membership, please visit our website at retangle.com.
