Some More News - Even More News: AI: “People Hate The Slop”

Episode Date: April 17, 2026

Hi. On today's episode, Katy, Cody, and Jonathan talk with Ed Zitron about Allbirds, Anthropic's Mythos, and other A.I. stories. After that they dig into JD Vance's fight with the Pope and Pete Hegseth sharing his favorite film quotes.

As always, we recorded right before that big thing that happened.

PATREON: https://patreon.com/somemorenews
MERCH: https://shop.somemorenews.com
YOUTUBE MEMBERSHIP: https://www.youtube.com/channel/UCvlj0IzjSnNoduQF0l3VGng/join

For a limited time get 40% off your first box PLUS get a free item in every box for life at https://Hungryroot.com/smn with code smn.

For a limited time, Wildgrain is offering our audience $30 off your first box – PLUS free croissants for life – when you go to https://Wildgrain.com/MORENEWS to start your subscription today.

#EvenMoreNews #JDVance #AI

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 Hi, hello. Today we are talking with Ed Zitron about Allbirds, Anthropic's Mythos, and other AI stories. And then after that, we'll dig into J.D. Vance's fight with the Pope and Pete Hegseth sharing his favorite film quotes. Well, welcome back to Even More News, the first and only news podcast. My name's Katie Stoll. That's right. That's right twice. That's the name of the show. That's what it is. That's how many other news podcasts there are. This is your name.
Starting point is 00:00:44 And I'm Cody. Hi. Everything was accurate so far. Let's see if I get this next part right. We are very thrilled to welcome back a return guest, the host of Better Offline and writer of Where's Your Ed At. It's Ed Zitron. What's up? Hey.
Starting point is 00:01:02 What is up? What is up? So many things. We'll talk about them. The world's great. We've got all the things. It's going well. It's normal.
Starting point is 00:01:10 Jonathan is also here. Hi. I was waiting for that. Hey, Ed. How's it going? It's going gray. I'm really enjoying just setting up probably the largest I-told-you-so in history.
Starting point is 00:01:24 That's what I've just been doing. It's on the way. It's coming. Every day. I got, like, so many of them. I got, like, a giant Apple Note full of people's names and links to things they wrote in 2023 that I intend to publish again and again and again when all this stuff explodes. You've got the receipt.
Starting point is 00:01:40 Oh, I got detailed. detailed notes. Take some time. It takes some time. Yeah. So that's what I've been focusing on. It's a strange job, but someone's going to do it. I mean, things do seem to be oddly dire.
Starting point is 00:01:50 It seems like things may be happening sooner rather than later. But, you know, we've thought that before. They can let this ride for a long time. I mean, yeah, they only officially shut down the Metaverse, like, a week ago. And then they kind of restarted it. Wait, what? Tell me more. Well, they just restarted it.
Starting point is 00:02:08 They just had Andrew Bosworth, their CTO, be like, yeah, you know, we're going to leave it up for a bit. I mean, they renamed their company for it. Like, they got to hold on to that for a while. Also, the servers can't be that expensive for something used by eight people. Exactly. Yeah, what's the overhead on that? One continual meeting where they all have their legs dangling.
Starting point is 00:02:28 The legs exist now, and they're just sitting around the table meeting. What a strange fucking thing that was. They didn't have to say legs were coming. No, they didn't have to say that. No one was clamoring for the legs. The old man from the Simpsons. I wanted legs. I wanted the legs.
Starting point is 00:02:42 Well, you got them. It's going to be one of those funny things that every so often I'll trot out throughout my life. The legs are coming. What I like, though, is that the media has just moved on. Just like one of the strangest things to ever happen, $70 billion, where did it go? Same place the $200 billion is going for what they're doing now. We built a magic virtual world, and then how are we going to reveal it to everyone? You can have a meeting in it.
Starting point is 00:03:10 The most exciting thing you can imagine. Also, it's broken. It does not work. It sucks us. You need to use a weird helmet that costs a lot of money, which also sucks and is uncomfortable and makes some people sick. We are renaming our company. I would say Instagram. That would be a good name for the company and what they do now.
Starting point is 00:03:28 Instagram. Or just annoying. Allbirds made an announcement. Allbirds was a shoe company that sold shoes and advertised on podcasts. Not this one. Maybe. We did. Oh, yeah.
Starting point is 00:03:42 We did a long time ago. I got a pair of them. I wore them earlier today. They're fine shoes. They're good. Well, it must not have worked very well because that company lost a lot of value over the last few years. Went from $4 billion. It's because they stopped advertising with us.
Starting point is 00:03:57 $4 billion valuation down to, I think they were bought recently for, $39 million. And then that company said that Allbirds is pivoting to AI specifically. They are expanding,
Starting point is 00:04:17 and AI Native Cloud Solutions provider. And they're going to be called New Bird AI. And so you, Ed can correct me on this if I'm not saying exactly what they are doing,
Starting point is 00:04:30 but I think they're just leasing existing data centers to provide compute capacity for AI companies. Is that it? From who? that's a good question so are they building them
Starting point is 00:04:42 well here's the thing the company said to quote the new york times will be used the money will be used the 50 million dollars will be used to spare to buy GPUs now the statement does and they raised a 50 million dollar convertible convertible financing facility to do this
Starting point is 00:04:59 now it did not say what the GPUs are but i will say this now doing the maths if they're buying mvL 72 racks gpues We're looking about 1,028 GPUs total. And that, but those are big computer, like, you might have to, you have to breakfast They are the, they are the GPUs behind AI.
Starting point is 00:05:19 I must also be clear that a thousand of them are fucking useless. Do you mean like, because AI is useless? No, I mean, literally that number is like, that's not, that's not anything. Because here's the thing. I'm just talking, 50 million dollars you're looking at like 14 racks of 72 GPUs. You still need somewhere to put them. Right. And you need the networking gear.
Starting point is 00:05:38 and power and all this other stuff I don't think I think this is just fraud I think that they will just Financial fraud you mean basically saying Where is this money going
Starting point is 00:05:49 Who are they gonna buy the GPUs Where are they placing the GPUs Right well then where do they go And then for what Like for who Are they And then other companies will use them for their Yeah
Starting point is 00:05:58 But this is the biggest Shitheads in this are actually the media Every single company That put a story out about this It's falling for a trap I'm not talking about podcasts, we can all yuck it up, this is fine. I'm talking about the New York Times, CNBC, everyone being like, oh, how odd. Congratulations, you felt for the trap.
Starting point is 00:06:16 Not a single story brought up the fact that, what, they're going to buy the GPUs, they're going to put them in a data center? What data center are they doing co-location? And they're going to build their own? This many GPUs would be about two megawatts of capacity, just sweet fuck all. Because they need gigawatts. Like OpenAI has multiple gigawatts. You need at least like 50 megawatts.
Starting point is 00:06:37 watts for it to be anything useful. And every single person writing about this is just falling it. It's fell for it at again award. Every time they're doing stock, they are literally doing stock manipulation. Yeah, it's like when you were, when you mentioned Zuckerberg and the Metaverse and stuff, and like, yeah, they fell for it and they did this. And like, they're still kind of going, like, there's no incredulity with any of these companies about anything they say.
Starting point is 00:07:02 Part of it feels like just you right now, Ed, explaining like the questions they should be asking. Nobody knows the questions they need to ask. And that's why none of us fully, I mean, you understand it. You spend your life diving in and fact checking this and cutting through the bullshit. But like most people don't fully get it. It's just able to go. And then nobody's asking the right questions. The reporting's not good. Well, that would suggest that was reporting. Because when you read the announcement, which I've now pulled up, they've reached a definitive agreement with an institutional investor left unnamed, very strange. That's extremely weird. Like that doesn't, that's strange, strange. Also, it has not closed. The convertible debt facility closes in the second quarter of
Starting point is 00:07:49 2026. It's never going to happen. I guarantee you it's just never going to happen. They're also planning to do asset sales, it looks like. It's very strange. And they're still going to make shoes, right? No, they said, this is apparently they sold. They sold the shoes to another company. They said they would still be selling some kind of apparel with Allbirds on it. Maybe on their gift shop. They'd get a merch or whatever. Weird.
Starting point is 00:08:19 But this is like a meme stock, right? It is. It's literally it should. Like it's already dropped a bunch. Did they sell it? Did they make the money? Let's see. Employee share sales, all birds.
Starting point is 00:08:31 I wouldn't be surprised if they. they probably set up some share sales, I'd have to look into it. I think everyone involved fucking sucks. I think everyone who wrote this up as a mark. I think that they're not going to buy a single GPU because if you think about it like this, if they close that $50 million facility, second quarter, they won't receive those GPUs, which are never arriving, until the end of the year. At this point, Vera Rubin, which the next kind of GPU will be out, and that'll be even more expensive, so they're not going to get as many, then where are they gonna put them up their ass i don't
Starting point is 00:09:03 maybe i hope so i fucking hope so i think there's enough room i mean i hope they fall it's gonna be tough with their heads up there but sure they can manage that's what i mean by enough room you know yeah but this is also like assign the i think even relative boosters right now are like yeah this is a bubble we're in a bubble it does seem like it's slowly becoming accepted that
Starting point is 00:09:29 a lot of this is spoken mirrors and just sort of like, oh, we're on the hype train. This is what we do. We just got to put out a release and then the stock goes up and we can, yeah. Almost wish I was evil. This feels so easy. So easy. We were joking about it before we started.
Starting point is 00:09:47 We're like, should we do this? Right wing pivot AI. I was like, yeah, we'll buy. Oh, no. To be clear, this is a joke. Like, I won't do this. But to be clear, us as well. I thought for a moment,
Starting point is 00:10:01 like how funny it would be if I did like a how I left the left style thing. Yeah, yeah. Like, you know, the AI, anti-AI critics have gone too far. I've now found how great Claude is. They would accept me in two seconds. They give me $100 million immediately in cash. I could have a whole lot post written in seconds. It's not just, uh, it's not just that I'm leaving the left.
Starting point is 00:10:24 I'm embracing the right. It's a real. It's not just shit. It's crap. Yeah. When it comes to, AI, we had a couple years where it's just like LLMs, LLMs, or everything, the language models, chat GPT, Claude. And now they're trying to push AI agents, which are like you upload.
Starting point is 00:10:44 It's just LLM. Right. It's in, it's a repackaged LLM, right? No, it's just LLMs. It's an agent is just an LLM. That's all of this. It might be an LLM that talks to another LLM, by which I mean they're prompting each other. And yes, they hallucinate. And the more they hallucinate. The more it happens. Exactly. Yeah. It's literally just LLMs. They want to use the word agents so you think autonomous. They are not autonomous. In fact, they fuck up all the time. And the bigger the job, the more they fuck up. And that's a mathematical certainty. Open AI themselves put out a study that said, hallucinations are a mathematical inevitability.
Starting point is 00:11:18 Anyway, this is the most biggest, hugest thing ever. No one can really tell you why, though, or what it does or why you should care. Just this vapid thing of like, oh yeah, it writes a lot of code. Might be useful for software engineers. but no one can seem to tell me why to what scale, what that actually means. And no one seems to want to. They just want to keep like eating up the slop and going, oh yeah, yeah, it's the biggest thing ever. And look at all the monies.
Starting point is 00:11:44 Look at all the monies. These companies lose billions of dollars, but money go up, number go up. Yum, yum. That's all that matters. Yeah, so many of the use cases just seem to be like, yeah, like this agent will talk to this agent for you and then correct the problems that this agent made. And it's just sort of like eating each other a little bit. it to create something that somebody might think like, oh, that's good. Like, they're already, they're already clearly like, well, the video generation, people
Starting point is 00:12:07 like getting a disorder, like they're realizing, oh, people hate the slop. Yeah. And it's cost a lot of money and power. So we're not really going to lean into that. So just trying to find, like, well, what can we, what can we sell beyond just the idea of the hype around AI or AGI or whoever they want to sell it? I do think it's like, it's because it costs so much to run that they need. to promise more than it can do.
Starting point is 00:12:32 The analogy that I've been running through my head and you can tell me if this is wrong is that it's like if they invented the dishwasher and they're like, this is pretty cool. It'll wash your dishes. You used to have to do it by hand, but it cost $100,000 just to make. So they would have to pretend that it does a lot more than just wash the dishes in order to promise it. It's a microwave. It's your security system. It does this, this, this, this. It won't wash and you won't know which ones.
Starting point is 00:12:59 You won't be able to see that are dirty. Sometimes the dishes come out on their cups now. Exactly. Yeah. Or have extra fingers. Sometimes they add more dirt. Yeah, my silverware is a dog dish. How'd that happen?
Starting point is 00:13:09 That's fun. That's just like, that's the future, baby. I don't understand it though. Like if you, if you presented a calculator to me and said, this will do any calculation you want, but it'll get a bunch of stuff wrong and you don't know how often and you don't know which ones will be wrong. so you have to manually check them all anyway. Isn't that useless?
Starting point is 00:13:29 It's an introspection checker. It's like if you can look at something and say, I understand this and lie to yourself and be like, I understand this fully, then you are exactly the mark for this. If you truly, if you know things by which I mean you know sentences about things that you can repeat without understanding them,
Starting point is 00:13:47 yeah, this seems like magic to you. If you're an executive that just goes to lunch and answers or ignores emails, this is magical. I read a story, I think it was in the Bloomberg, where it was like, oh yeah, young people, enterprises are hiring young people because they know AI well. And there was a whole thing about how there were people that can't make phone calls and can't write their own emails. And they insist on using AI. And it's like, if I receive an AI generated email, I'm immediately offended. I can tell.
Starting point is 00:14:17 And I'm just like, wow, you don't give a fuck. You just, you don't think for yourself. And also, I'm terribly afraid of being misled or being wrong. I don't want to be. I care a lot about my knowledge. I don't think many people do. I think people are really happy to just slop shit out and be like, there you go. I also think that there's a cult of personality around this that is genuinely dangerous.
Starting point is 00:14:39 And just like the feeding of like the ego and sort of like, it's going to compliment you, it's going to tell you every idea you have is amazing. There aren't human beings to be like, I don't know about that, actually, which is how things are made and how ideas are teased. out and how we know things is by not just saying yes over and over and over again. It's so dystopian in every capacity. When I think about the implications of it, what it means. Why are people drawn to it? There's probably a myriad of reasons.
Starting point is 00:15:09 I think you guys have hit the nail on the head on some of it. But it also just speaks to how many people hate their jobs, hate what they do, or just going through the motions. And so they want to automate it, whatever. Okay. So there's that to unpassed. but then just the vision that keeps trying to be packaged and sold to me. It feels so hollow and strange, yes, because I don't know exactly what it is that we're building to
Starting point is 00:15:32 or what it is that it does or what we can expect from it. Sure. But even if you were to take all the warnings and all the things that they're saying, like, what it can promise, why do we want that? I'm genuinely having a hard time wrapping my mind around why that is actually good for society. productivity you'll make that argument okay except not because you have to correct the mistakes but then who are we but even what you're saying you still because they don't provide one are not saying what it does no i don't know what it does ed i don't know and that's like
Starting point is 00:16:08 what but i'm like maybe i need to right it's it's all these like ideas like oh it's going to change the world and it's going to get rid of a bunch of jobs and it's also going to make everybody happy and rich somehow and you have you have to agree to this and there's no explanation even I saw there's this one interview that fucking Elon did
Starting point is 00:16:31 and he's talking about how kind of cater to your points like yeah in the future like we're not even going to have like phones or phone companies or apps you're just going to have like you still have sort of a slab that you hold onto and you're connected to your AI it's going to say you're extremely based basically no he said like yeah you're going to pick
Starting point is 00:16:47 you're going to pick it up and it'll just sort of show you what it thinks you want to see and like that's the future and he says it as if it's an inevitability and it's implicitly good right he never he never pauses and be like and i think it's good that we're going to have that that's something that sort of feeds you what it assumes you want to see there's no explanation for why that's a positive thing because they can't they can't say that and that's good we don't want to think anymore it's absurd this whole thing feels a little bit like buy into it, buy into it, buy into this. If you don't use it, you're fucked.
Starting point is 00:17:21 If you don't use it, you're going to be left behind. It's like, but you're leaving yourself behind by not using your fucking brain anymore. Well, I think that's kind of like the big question is because there's been this like, there's been this like broader discussion online of like, is the, the left doing itself a disservice by being broadly against AI? And I kind of want to like step back and be like, because I'm not against advanced software doing stuff. Like if there's a thing and you're like, oh, this will help organize your spreadsheets and it'll tell you what guest would be good to come on in November because of a thing that's happening.
Starting point is 00:17:57 I don't know. There could feasibly be a thing that's there. But what we're being presented with is no, no, no, this is your job now. This is your life now. And it doesn't feel to be able to meet any kind of that expectation. But like, is it is the, the, the, the, the, the, left turning people off by just being super kind of impulsively anti-AI. This is also the thing. It's a very deceptive way of suggesting the left are anti-progress as the left I would are, and the left, whatever the fuck that means, it means anything they wanted to. Quite frankly, they just go, hey, this doesn't do enough and it loses all this money, it takes
Starting point is 00:18:38 up massive amounts of space, and it steals from everyone, it's environmentally dangerous. It's, look, why are we doing this? And the answer is, we're doing it, fucker. We're doing it. Do it with us or else. And whenever I read someone being like, let's be reasonable here, there are, we're getting LLM centrist now, by the way, and they are actually as annoying as the full AI boosters. Because they're like, look, it isn't this big thing.
Starting point is 00:19:03 But you can't keep saying the stochastic parrots thing. You have to admit it's doing some stuff. It's like, I don't have to admit fucking anything. And on top of that, even if this thing is writing a bunch of, of code. Why, like, is the code better? Because most software products feel worse, including Anthropics. Anthropics keep breaking. So if it's so good at code, why, also, I don't want this to happen, but they're not laying off its software engineers. It's not like it's replacing them. And everyone I talk to, I talk to a lot of people in big tech, they're
Starting point is 00:19:34 fucking miserable. The deadlines have all been moved up. They're not getting as much support. And they're being told, just put it in fucking Claude Cod Cote, just take a screenshot of it and say, code, this is broken, fix it, Claude Code. Maybe it will work, maybe it won't. Yeah, you know, it's just, yeah, the left doesn't like this. The left doesn't like this because it fucking sucks. It sucks. If this thing existed without the environmental problems or the stealing problems or the big data centers, if this was just regular software, which is how we should treat it, we wouldn't have this debate. But because we have a tech and business media that is so heavily captured by corporations and hype cycles, it's impossible to stop them.
Starting point is 00:20:16 They have to frame it like they're inventing God and get on board or not. You can't raise billions of dollars by saying, yes, this is chatbot. Yeah. Meanwhile, speech to text is so much fucking worse. Like, everything is worse. Search is worse. Like all of these things that allegedly, purportedly, are going to get better, have gotten worse. So I don't know. When are the, I keep asking people, when is that going to happen? When are they You keep telling me the future's coming. When is it? Where is it?
Starting point is 00:20:42 I swear that auto correct on my phone has gotten worse. It's so much worse. Everything is worse. 100%. But like how? If you go in... Like, how do you do that? Yeah.
Starting point is 00:20:50 My favorite feature is when you're on Google Docs and you make a typo and it puts the squiggly red line and then it goes, yeah, you got a typo here. Yeah. And it doesn't give you any suggestions. Yeah. And it's like, why is this happening? And you click a link and it doesn't explain why. And it's like, it used to work. It used to function exactly as it was supposed to.
Starting point is 00:21:09 Technology is meant to be something that gets better over time, not worse. Howdy y'all? As a Montana ranch hand who spends most of her day wrangling horses, I can't so much as muster a can of beans come noon time. Who can plan on a week's worth of nourishment whilst riding, roping, and rustling? Well, thankfully, all that done changed. Once I discovered hungry route, they're my one-stop. feed shop for planning a week's worth of fixings, delivering everything to my homestead and not breaking the bank while doing it.
Starting point is 00:21:47 They've got over 50,000 recipes to spare you from posting up hungry at the chuck wagon, and most of them can be compiled in one 48th of sun time a day. That's 15 minutes for y'all city folk. Plus, they're filtering out more than 200 added. including high-fructose corn syrup, artificial sweeteners, and preservatives. No more of that counterfeit, molasses chunking up your sasperilla, if you know what I mean. Matter of fact, I bet you're going to love Hungry Root as much as I do. For a limited time, get 40% off your first order, plus a free item in every box for life,
Starting point is 00:22:34 long as your subscription still subscribing. Go to Hungryroot.com slash SMN and use code SMN. That's Hungryroot.com slash SMN to get 40% off your first box and a free item of your choice for life. That is, as long as you're keeping those pesos about. You folks take it easy now.
Starting point is 00:23:00 Catch you later on down the dusty old trail. Another big story in this, world that we haven't talked about yet on the show is Anthropics or Claude Mythos preview, which is a new model from Anthropic. A secret model. Right. They're giving to companies and, like, defense contractors to, like, analyze, but they're not releasing it publicly because I guess it can...
Starting point is 00:23:28 Well, not defense contractors. Not defense contractors? Just cybersecurity companies and some banks. Cybersecurity. Because it can, like, get through. Linux security protocols? Is that what it can do? Or that's what they say? That's what they're saying and they're also not saying that. It's said that they found a 27-year-old bug in free BSD. What they failed to note was that that bug is not exploitable.
Starting point is 00:23:51 And indeed, they had to do like 130-something manual reviews. They did. Claude Mythos didn't fix the bugs. It found a bunch of them. They didn't include how many false positives there were, which is a very important detail for something that's meant to be autonomous. And also, They fucking lied in it. There was a whole story that went out saying, oh yeah, Claude Mythos escaped containment. It's not the first time they've done it. A escape containment and the person in question who was running it was informed that Claude Mythos escaped when it sent them a message. What actually happened was Claude Mythos was in one server.
Starting point is 00:24:29 The model was here in one server. And then the environment it was using was a totally separate one, which it then cracked. and then sent a message and it did all of this after being explicitly instructed to. They also do not say how many times it took to do this. They do that a lot,
Starting point is 00:24:48 which is claim, like, yeah, like, oh my God, it became sentient or like it started asking me questions or it did this and you look into it and it is exactly what you said where it's like, no, you told it to do that. Right. You literally type the prompt, do this.
Starting point is 00:25:02 And you can be pleased that it accomplished the task but to frame it like it just had a mind of its own and decided to do that is so absurd and it always like turns me off of anything because like you can you can be excited about oh it's capable of doing this it capable of doing this but as soon as you're lying about that I'm like oh then I don't believe you you're full of shit why would it's not even the first time they've done it they did it with an older model where they claimed it blackmailed someone and then I remember that and in very small
Starting point is 00:25:36 words, it said, very small letters, it said, and it was prompted to do so. Yeah, it told it to try to do that. And it was explicitly trained. It was explicitly trained to do this. Like, the training was specific to get this outcome, which was not autonomous. This shouldn't be legal. Like, that's, and it's also, it's useful to know, like, if somebody got a hold of this and told this model to do this. It, can do it, but to frame it like, wow, we gave it access to these emails and it fucking blackmailed the CEO. Yeah, because you told it to. Yeah, and it's good to know that it can do that if you tell it to do that, but like, that's not the same thing. Couldn't we train these publicly
Starting point is 00:26:21 released products not to blackmail? Like, wouldn't that be a thing? It feels like, but they can't, though, they can't. The only, it's the, it's not even the first, the, second time someone's done it. Because when GPT4 came out, good work Casey Newton and Kevin Roos of the New York Times on this one. They claimed that it blackmailed a task rabbit when actually what was happening was that they were literally prompting the model being like, what would you say here to the task rabbit to make them do this? What would you say? And then they're like, yeah, it was theoretical. Anyway, this got reported by the Times and Casey Newton and I quote said, shut it off. And that's exactly how he said it. I'm so, and the thing is, I'm not trying to bag on one person other than the fact that he's been
Starting point is 00:27:03 ourself to me. It's the way to stop things like this is to have responsible members of the media to stop them. And we just don't have that. We have people that are actively captured. You have multiple outlets that are pro-AI.
Starting point is 00:27:19 And I'm fine with not being pro or anti. If you want to actually be objective, fine. But when the objectivity is, I will repeat whatever they say and not involve any kind of criticism. It is objective. It's just like just whenever like, like yeah cop union like releases a statement like look what happened they're just doing that for the
Starting point is 00:27:38 entire industry and the time so that guy who threw a molotov cocktail sam ointman's house he was inspired by eleazar eudkowski from um oh god what's the book called it's like if they build it everyone if if they build it i'll kill myself i don't remember if if anyone builds it everyone died so not ronan farrell's not ronan farro's piece like they're claiming no that was and the whole suggest that's just an attempt to chill criticism. No, the guy was inspired by AI Duma porn. This book about AI Duma. Sam Orkman in 23 said that Eliyzer Yodkowski might get the Nobel Peace Prize one day. So yeah, the dangerous rhetoric here comes from Sam Orkman. Comes from Dariana.
Starting point is 00:28:20 That is what's so wild also about the framing of this as like, oh, AI, anti-AI people are now escalating this into attacks on our lives and, et cetera. First of, I am not, don't bomb people. You know, I don't, I'm not supportive of that. But you, let's be clear about who is creating the Duma narrative. Let's be clear about why this is in our faces and it's being used as the excuse for letting go of people, not the engineers as you've just articulated, but other people,
Starting point is 00:29:01 and there's a lot of tension and a lot of fear and you are stoking it. So what do you expect in this climate to happen? They say it all the time. They say it's the end of the world. Again, don't throw a Molotov cocktail people. Yeah, but it's like if you go on TV and constantly say, I'm scared of the things I'm creating,
Starting point is 00:29:16 if you mislead people into believing that these models are somehow sentient and have their own minds, if you are saying, I need data centers for these things, and, by the way, my biggest thing is, 50% of white-collar labor jobs are going away. Yeah. Unstable people are scared.
Starting point is 00:29:36 Harder time getting credit or a mortgage or even a job these days. Everyone's fucking tortured. But then you turn on the TV and you look, and everyone's saying AI. Anything AI people ask for, they're humored like a gifted child. If they need more space, they can get more space. They need more money? They can get more money. And yeah, these people are saying, yeah, I'm going to take your job.
Starting point is 00:29:55 I'm looking forward to taking your job. I will own a sentient creature, which I will then trap and make work for me, which is also known as slavery. Yeah, that's my plan. I'm doing that. I'm taking your job. I'm very scared of the thing I'm creating, but also that I own, and I have more money than you.
Starting point is 00:30:11 Yeah, an unstable person could easily take that as a threat. Yeah, and then they'll always add, like, every once in a while, but eventually, universal high income for everybody, and no one will have to work. How? How? How? How? How does that work?
Starting point is 00:30:25 Explain that process. And if their argument is going to be, oh, we'll pay it. No, I don't think you will. Prove it now. Try it now. Your behavior has not indicated that you will allow that to happen. So why would we believe you now? The hype train can just keep going on forever.
Starting point is 00:30:45 Eventually, they'll run out of money and people won't be paying them enough money. And then they'll get new investments, and it keeps going forever. And it keeps going forever. The good news is that it will never stop and it goes on forever. Excellent. So I was being facetious about this being able to go on forever. Eventually, unless OpenAI raises its prices for ChatGPT to 200 bucks a month or something and people pay it, they're going to need more investment at some point, more billions at a certain point. The investors are going to say, okay, I think
Starting point is 00:31:21 Reuters reported that $1.6 trillion has been put into this industry since 2013. Yep. So here's the funny thing. How are they going to make that back? Right now, Anthropic, like, they're having serious stability issues, the models are dumber.
Starting point is 00:31:37 What's great is they've also, because, so to explain, it used to be, on an Anthropic $200-a-month Claude subscription, you could, so when you use an AI subscription, you're connecting to a model, or a series of models, and it burns something called tokens.
Starting point is 00:31:50 There's input tokens, so there's stuff you feed in, output tokens, the stuff it spits out. Now, to make things slightly more complex, but it's not too bad, it does this thing called chain of thought reasoning on a lot of the models, which just means if I say, write me a workout plan, it will take that prompt and then say, okay, what are the things I would do in a workout plan? How about nutrition? How about this? Every time it generates that plan, it uses output tokens. Output tokens are also more expensive. On a subscription to like Claude, you used to be able to spend $2,500 to $5,000 worth of tokens on 200 a month.
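The input/output token arithmetic Ed walks through here can be sketched in a few lines. To be clear, the per-token prices and token counts below are invented purely for illustration; they are not any provider's actual rates:

```python
# Back-of-the-envelope sketch of subscription token economics.
# All prices and token counts are hypothetical placeholders.

INPUT_PRICE_PER_M = 3.00    # assumed $ per 1M input tokens
OUTPUT_PRICE_PER_M = 15.00  # assumed $ per 1M output tokens (output priced well above input)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar value of the tokens burned by one request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# A short prompt that triggers chain-of-thought reasoning emits far more
# output tokens than it consumes in input, so output pricing dominates.
prompt_tokens = 50                   # "write me a workout plan"
reasoning_and_answer_tokens = 4_000  # the model plans out loud, then answers

per_request = request_cost(prompt_tokens, reasoning_and_answer_tokens)

# A heavy user making thousands of such requests a month can burn far more
# in raw token value than a flat monthly subscription's sticker price.
monthly_token_value = per_request * 10_000
print(f"per request: ${per_request:.4f}, monthly: ${monthly_token_value:.2f}")
```

Under these made-up numbers, the sketch shows the shape of the claim in the conversation: chain-of-thought output dominates the bill, which is why a flat subscription can be worth multiples of its price in token spend, and why rate limits are the lever providers reach for.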
Starting point is 00:32:25 They have since cut the rate limits down. They're still letting you spend way more, but not as much. But they've cut down those rate limits a ton. They're going to keep doing that. And they have to keep doing it, because it's the only way to make the company anything approaching profitable, which I don't even think is possible. People are acting like they're being stung to death by bees. They are freaking out, like, what the fuck? What are you doing? I'm going to use OpenAI.
Starting point is 00:32:47 Now OpenAI is changing their rate limits. And it's funny, I predicted this over a year ago, and people were like, you fucking idiot, they're never going to do that. Why wouldn't they? Why wouldn't they do that? That's how this works. That's how everything works.
Starting point is 00:33:03 They're going to keep trading $15 for $1 forever. And what's funny is, they've also said things will come down. Prices are coming down, it's coming down, it'll be fine. They're not coming down. They're getting more expensive. The models use more tokens now. The models are getting dumber
Starting point is 00:33:20 somehow, with new releases, and Anthropic's website is just constantly going down, and people are like, well, they'll fix it. When? Yeah, more expensive and dumber and worse. It continues to do that. Is the idea that, like, oh, you know, TVs got a lot cheaper over time, all these miles of data centers are going to get cheaper over time?
Starting point is 00:33:40 Is that what they think? They think, but not really. Well, that's what they sell, that's the excuse or the line of thinking they can give, I guess. But that's not how this works. That's how TVs worked eventually, I guess. So I still, I know TVs are cheaper. I still think they're expensive. But...
Starting point is 00:33:56 The way that TVs get cheaper is the components inside the television get cheaper. The cost of running a TV show on your TV is negligible. The cost of running any AI service seems to only increase. I had a person at Microsoft tell me that one of the smaller reasoning models, o4-mini, I believe it's called, if you had a particularly bad coding problem, it could take up 12 GPUs.
Starting point is 00:34:22 Just 12 of the fuckers burning... Burning at full speed. And it's really strange, because when you talk to AI boosters, they mostly use the classic philosophical argument of nuh-uh. Hard to argue with that, huh? That works for a while. It works for a while. It terrifies me, the implications of, yes, everybody going all in.
Starting point is 00:34:40 And it also terrifies me what happens when this busts. What happens to all the infrastructure, all the lives, all the disruption that's happened, what happens when there's no returns? I don't know. Yeah, all these, I mean, yeah, even though all the data centers are going up, like, well, a lot of those aren't going to be used anymore. Laser tag arenas. Exactly.
Starting point is 00:35:03 I think we're going to see the end of venture capital as we know it, which is probably for the best. We're going to see a destruction of high-end venture capital, because it's not just OpenAI and Anthropic dying. It's when they die, because every AI startup, every single startup you've ever used that connects to AI is unprofitable. Every single one, without exception. Perplexity, Cursor, all of them, all unprofitable,
Starting point is 00:35:28 and they're unprofitable because it costs so much to run the models. Those companies are also dependent on the same venture capital that OpenAI and Anthropic are, which is a limited pool of money. When they start running out of money, they die. And then they can't feed that money to OpenAI and Anthropic, so OpenAI and Anthropic's revenue drops. This in turn creates a vicious cycle,
Starting point is 00:35:50 where everything kind of falls apart. The thing is, at some point venture capitalists are going to side with me. They're going to just be like, fuck. And when they pull away, that's when stuff collapses.
Starting point is 00:36:00 Those data centers, my biggest worry, the reason I'm writing the Hater's Guide to Private Equity, private credit even, which I think will be out when this runs, is because private credit
Starting point is 00:36:10 is sinking, I think, over 100, maybe 200 billion dollars into AI infrastructure, and private credit is funded principally by insurance
Starting point is 00:36:19 and retirement funds, state pension funds, CalPERS, Arizona's, Ohio's, some health insurance companies as well. Private credit is funded by insurance, and the thing is, it's not just that money going away. If those debts are not paid, that means there is no yield to give to insurers so that they can pay out. The problem is that private credit is pretty much the main source of yield for a lot of retirement and insurance funds. So this isn't just, wow, we'll lose the money now. This is where the growth is coming from. So it's extremely bad, and there is no bailout that can fix it. Unless the government
Starting point is 00:36:58 becomes more socialist than you could possibly imagine, because a bailout wouldn't fix it; they would just be feeding the money to the funds that feed into retirement funds. I don't think that's going to happen either, because we actually don't know how big the private credit problem is. We have no idea. It may be $2 trillion, it may be $4 trillion, it may be $5 trillion, maybe $6 trillion. Of the six trillion dollars in insurance and annuity company funds, a trillion dollars of it is private credit. There is a systemic weakness here. And all of that AI money being sunk into data centers is being sold to investors as stable yield from infrastructure projects. It's a question: how the fuck do those data centers pay their loans if their clients are unprofitable
Starting point is 00:37:45 AI startups? And they keep doing that. And the two largest sources of compute demand are OpenAI and Anthropic, who burn billions of dollars and will run out of money. Now, when I mention this to AI boosters, I feel like I'm going crazy, because they're just like, it'll work out. There's a super cycle. There's huge demand for AI.
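The yield dependence described a moment earlier, that private credit is the main source of yield for retirement and insurance funds, can be made concrete with a toy calculation. The $6 trillion total and $1 trillion private credit figures are the ones cited in the conversation; the yield percentages are invented for illustration only:

```python
# Toy portfolio: the $6T and $1T figures are cited in the conversation;
# all yield percentages are assumptions for illustration.
total_assets = 6.0      # $T in insurance and annuity company funds (cited)
private_credit = 1.0    # $T of that allocated to private credit (cited)
pc_yield = 0.09         # assumed yield on private credit
other_yield = 0.03      # assumed yield on everything else

# Blended portfolio return with private credit paying out as promised
blended = (private_credit * pc_yield
           + (total_assets - private_credit) * other_yield) / total_assets

# Blended return if private credit defaults and its yield goes to zero
blended_without_pc = ((total_assets - private_credit) * other_yield) / total_assets

print(f"with private credit:    {blended:.1%}")
print(f"without private credit: {blended_without_pc:.1%}")
```

Under these assumed yields, a sixth of the assets supplies nearly half the income, so wiping out the private credit yield drops the blended return from 4% to 2.5%. That outsized dependence on one asset class is the kind of systemic weakness being described.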
Starting point is 00:38:06 The other note as well is Anthropic running out of compute capacity, and everyone's like, wow, it just proves they have tons of demand. No, it doesn't,
Starting point is 00:38:22 They're just making the service worse. So what's actually happening is they don't have the capacity to serve their current customers and they're saying, fuck them. And so eventually this all falls apart. It's just a big fucking con. I'm asking Gemini that question
Starting point is 00:38:36 that you posed a second ago. I'll figure this out. Is it helpful? Is it helpful? Grab the officer's service revolver. I mean, this is just, this is all, terrifying and you've got so much information about all the things. Yes, I'm sorry.
Starting point is 00:38:50 I'm just rattling it all off at you. No, it's okay. I'm just going to have nightmares for the next week. Vitally important. But I mean, all of my stuff, like all the little things that I'd gathered in my head to talk to you about, seem so inconsequential when you lay out the big-picture things. You know, like in Tahoe, people are about to lose energy, because, like, within a year, the electricity
Starting point is 00:39:14 provider is cutting off Tahoe because they're building a data center nearby, and no one's saying, how is Tahoe going to get energy? Like, I'm sure they'll work it out. And then, why would people be angry about the AI companies? I just don't know. I don't understand why they would be. But they're like, well, this is more important. These robots, not robots, these machines, this AI, but yeah, robot software is more important and more valuable than the actual human lives that live in this area, when it comes down to it. And then you lay out all of this shit. I mean, that's what all the venture capitalists think generally, too. Like, that's what they value more. They don't really value humans
Starting point is 00:40:03 beyond, like, abstractly, in their, like, simulation that they think they're a part of. And so they're just going to keep pumping it until they become the gods that they wish they were. Well, the good news is the people that lose the money are going to be huge fucking assholes. That is good news. Except that a lot of people that are not huge fucking assholes... But I agree. But there are just so many people whose, you know, retirement is in some broad portfolio, and whatever, and all of this stuff. The downstream effects on just people trying to live their lives and play the game that
Starting point is 00:40:42 everybody has said, play this game, and that's how you'll have financial freedom in your old age, whatever. What about, like, David Sacks? You care about his feelings? Marc Andreessen has feelings too. These guys have feelings. And who's going to tell Sam Altman why his kid stopped throwing pizza on the... He'll never know. He never answered that question. What did ChatGPT... Why did the kid stop throwing it? We just seem to have brushed right past, collectively as a society, that two weeks ago or something,
Starting point is 00:41:16 I mean, this has been going on for a while, but apparently Sam Altman's sister is suing him for sexual abuse throughout her childhood. And we just move past that. And I'm bringing that up. Well, first off, what a horrifying story. What a monstrous person. But these are the people that are shaping our future. These are the people that say, trust me, trust me. Why should I trust you at all? I mean, in general, why would we trust any of these powerful people? But you specifically,
Starting point is 00:41:51 it's just, it's bananas to me that we are in this reality. And, but yes, okay. Take away my energy, please. Just take away my power so that you can. So that you can generate sexier Garfields. Yeah, exactly. all the things. That's the only AI image I've ever generated. Sexy Garfield. Garfield with huge cans.
Starting point is 00:42:15 Why don't you think... Anything after that? Can I see it? Alexander wept, for there were no more Garfields to generate. I don't think we've yet reached peak sexy Garfield, though. I feel like we can keep fine-tuning
Starting point is 00:42:28 the algorithm. That's what the G in AGI means. The last thing I'll say is, most people I know that are starting to... Some people have experimented with AI, have learned about it. I know a lot of people being shipped to conferences about the future of AI in HR or whatever their field is, and trying to adopt it. But in general, what I've seen, not me, I don't have a Claude or whatever agent.
Starting point is 00:42:52 I'll see two people use it essentially like Google and ask the same question and get wildly different answers. So at a basic level, I don't understand. Yeah, if they want to replace search with this much worse version that does random things and hallucinates or whatever, and all the bullshit, at least make the normal search work. Like, can you not have the basic version be fine, and preserve that, and do all your horseshit elsewhere? But no, they can't. Thank you so much for taking time to chat with us. I was hoping to feel better at the end of this conversation than I do right now.
Starting point is 00:43:37 Sorry. But the truth will set you free. I feel more clear in my panic. I'm going to go tell everybody I know. If you can't have comfort, you can have clarity. That's okay. I live in cash for a reason. Ed, where can people find you and support your work?
Starting point is 00:43:53 Wheresyoured.at and betteroffline.com. Please subscribe to my premium newsletter; it's my principal form of income now. So please do that. That would be great. But yeah, subscribe to the podcast,
Starting point is 00:44:09 this runs about the Sammel and Molotov situation. Got the haters' guides private credit tomorrow. Actually, when the day that this runs, seven bucks a month, seven to be a fucking banger. I write like 15,000 words a week, if not more, so hell of value. Thank you so much, having me. Thank you, Ed.
Starting point is 00:44:28 Would love to be back sometime. Have you ever wanted to make your own bread at home, but you can't find a spot in your kitchen to add a state-of-the-art proving drawer like they've got on The Great British Baking Show? As an American, I don't even know what that is. But I know I need it if I want to have good bread. At least, that is, until I found out about Wildgrain, the first bake-from-frozen subscription box for sourdough breads, artisanal pastries, and fresh pastas. They use a slow fermentation process that's rich in nutrients and antioxidants, plus everything
Starting point is 00:45:08 bakes in 25 minutes or less with no thawing required. Is that showstopper enough for you, Paul Hollywood? That's not even your real name. Their boxes are also fully customizable: a protein box, a vegan box, and a gluten-free box, which personally is my bread and butter. Or rather, I guess it's my bread. You have to buy the butter separately, but they have that too. Seriously, folks, going to go off script for a second here.
Starting point is 00:45:39 I've been gluten-free for a while, and it is such a struggle to find good-quality baked goods. Not anymore. It's so exciting for me. This is a total, total game-changer. And their pasta is also excellent, and it will have you saying buonissimo with each bite, or whatever it is that they
Starting point is 00:46:01 say in England. Proper scrummy. Okay. With Wildgrain, imagine having fresh, bakery-quality bread, pastries, and pasta at home without any trips to the store. And don't just take my word for it. They have over 40,000 five-star reviews and have been voted the best food subscription box by USA Today for three years in a row.
Starting point is 00:46:27 For a limited time, Wildgrain is offering our listeners $30 off your first box plus free croissants for the life of your subscription when you go to wildgrain.com slash more news to start your subscription today. That's $30 off your first box and free croissants for the life of your subscription when you visit wildgrain.com slash more news, or you can use promo code more news at checkout. Now watch as I enjoy this freshly baked loaf of sourdough that has made my house smell fantastic. Let's see, folks. It's so good.
Starting point is 00:47:16 It's proper scrummy, eh? That's got a good, a good crumb, Paul Hollywood would say. Oh yeah, you knocked it out of the park. Wildgrain. This is so good. I'm happy. The first news we're gonna talk about
Starting point is 00:47:38 has to do with our guy. Which one? Our least likable. The current one, JD. We're going to talk about JD Vance. We're in like day five or six of Pope versus U.S. government gate. I should have predicted this when we made our predictions. Oh, I know, yeah, just like an extended feud where every ally of the president has to come out and be like, well, I feel like the Pope needs to understand God. Even obvious fake fraud Catholic J.D. Vance, who's pretending to be Catholic,
Starting point is 00:48:09 when talking about theology. You don't have to be careful when you talk about childless cat ladies, but you do have to be careful when you're the Pope and you spent your entire life working to be the Pope. You don't have to be careful with your words when you're alleging that certain residents of a town are eating pets. You don't have to be careful with your words then. Yeah, of course not.
Starting point is 00:48:29 Anyway, so the Pope, as he tends to do, tweets like, don't do war. Everyone needs to not do war. He doesn't, he's not, like, lambasting the Trump administration. He's not naming the United States of America. Yeah. It's hard to avoid, because of the war that they just started. But anyway, he tweeted, the Pope did.
Starting point is 00:48:53 God does not bless any conflict. Anyone who is a disciple of Christ, the Prince of Peace is never on the side of those who once wielded the sword and today drop bombs. Yeah. So J.D. Vance thinks the Pope is wrong about that. And at a turning point USA event, which was attended by people. Cancelled. Erica Kirk had to cancel.
Starting point is 00:49:19 Eric Kirk had to cancel. She felt unsafe. So Erica Kirk had to cancel due to security concerns. So instead they got the vice president. No, the vice president. He was already going to do it. He was already going to be there. He was going to be in conversation with her.
Starting point is 00:49:36 Her team, they say, got word of some security. thing and they said well she can't go now and so the vice president's like I'll just talk to some other guy there. According I'm sorry gentlemen according to Candace Owens according to Candice Owens and I didn't listen this is just I've osmosis to this information according to Candace Owens
Starting point is 00:49:56 that's bullshit, she didn't go because ticket sales were so low. I mean, A, that I would believe, but Candace Owens, I'm pretty sure, doesn't believe in, like, dinosaurs or, like, the moon or something. So I...
Starting point is 00:50:11 are here now. None of these are sentences I want to be talking about. I'm just saying to your point, Erica Kirk didn't feel safe but okay,
Starting point is 00:50:19 the vice president United States. It's so silly. It's ridiculous. So that's funny. Yeah, it's an 8,500 seat arena, maybe 2,000 people
Starting point is 00:50:29 so it was less than a quarter full. It didn't look great. It didn't look great. And so the vice president is talking with, I'm sorry, some guy, I don't know who. He's asked about
Starting point is 00:50:39 the Pope, and he can't just say, like, oh, the Pope's the Pope. Of course the Pope wants peace. Who wouldn't want peace? And then move on. He has to defend his boss. Do we want to watch this? Sure. I think it's very, very important for the Pope to be careful when he talks about matters of theology. And I think that one of these issues here is that there has been... Sorry, stop, stop, stop. Sorry, pause, pause. We're pausing. So... There's a heckler as well. That's also going to
Starting point is 00:51:25 complicate things. J.D., he is, he is careful about it. He's the Pope. It's this bizarre, this bizarre world where, first of all, J.D. has apparently appointed himself, like, arbiter of everybody's behavior. It's not his role or job. And he, like, again, is, like, a one-year senator. And now he's like, I actually should be in charge of telling everybody how to act. I'm going to tell people to say thank you, and how they should dress up, and how the Pope should talk. It's absurd. But it's like he thinks that the Pope was chosen via, like, lottery. Like, no, he's actually quite studied in theology and knows what he's talking about, because he became the fucking Pope. It's not like a random guy.
Starting point is 00:52:00 It's somebody who wanted to be the Pope, probably, and has, like, spent his life studying this specific thing. We're going to continue, but I just, like... Well, I just want to say, I want to piggyback off of that, because it blows my mind. Look, I'm not religious. I'm not Catholic. But I actually do feel offended to a degree about this. The audacity of this rube who just found God, so to speak. This fake fucking Catholic piece of shit.
Starting point is 00:52:33 Fucking asshole. Is out here saying, I would tell the Pope to be careful when talking about theology? The fuck? It's dumber than dog shit, asshole. Just, like... Beyond. It's just like...
Starting point is 00:52:47 is the Pope you can ignore him. If you are the thing J.D. Vance is, if you are like, you're not supposed to ignore the Pope. A Catholic. Right. That's the Pope you're talking about. It's just so condescending. All right.
Starting point is 00:53:01 He's going to, he's going to yell at this heckler and then keep going. Again, hey, random dude screaming. I told you I'd respond to your point. I just want to respond to this question first. But I think one of the issues here is that if you're going to opine on matters of theology, you've got to be careful. You've got to make sure it's anchored in the truth.
Starting point is 00:53:20 This weird, like, stay out of politics. He's just saying war is bad. First of all, he's saying peace is good, and blessed be the peacemakers, and all that jazz. You talk about religion all the time. You try to bring religion into the political conversation constantly. Your boss is likening himself to Jesus Christ on the regular. So the idea that, like, if you're going to be the Pope, don't get into politics? Fucking shut up, J.D.
Starting point is 00:53:50 You do this all the time. You're so happy you get to feud with the Pope right now. But he brought up, and Mike Johnson separately brought up, this thing, just war theory, or just war doctrine, which is basically, like, well, you know, it's well established in Christian theology that there are certain reasons where you might have to go to war
Starting point is 00:54:12 if you're defending against aggression, and you're preserving a longer-lasting peace, and it's a last resort, and you don't hurt civilians, and all the things you would expect people to say about when a war is okay, like, even colloquially, a just war. This is not, like, in the Bible. It's, like, St. Thomas Aquinas wrote about it, kind of thing.
Starting point is 00:54:28 it's like St. Thomas Aquinas wrote about it kind of thing it's St. Augustine I believe it goes back the idea, yeah. Yeah, so Mike Johnson has bring this up and JD Vance is bringing this up because they know it's a term that they can use.
Starting point is 00:54:44 They know, like, well, there's just wars. But as we've talked about many times on this show, you didn't even bother justifying this war right now. This isn't even supposed to be a war? It's not supposed to be a war. It's combat operations and then it is a war, but it's not. And you're doing
Starting point is 00:55:00 regime change, but you're not, and they need to disarm, but you're not. It's the Just Excursion Doctrine. Yeah, you bombed a school of 160, 170 young girls the first day. So I'm just, like, begging anybody who actually gets to talk to these people in person, when they're saying, like, have you heard of the just war doctrine? Have them explain it. Use it to justify this war that you fucking started.
Starting point is 00:55:27 If you go down the sort of list of ideas, like, well, it would have to be this, it'd have to be this, none of them apply. Competence is actually a part of it too, and as we've seen, they are not displaying that. But specifically, I wanted to point out, sorry, this doctrine was invented by St. Augustine. Pope Leo was an Augustinian friar for 12 years. He, like, of all the things to be like, um, Pope, have you ever heard of the just war doctrine?
Starting point is 00:55:57 Yeah, he studied it for fucking decades. Like, this specific thing that you're trying to fight the Pope about is something that he has studied for decades in order to be the Pope, J.D. You learned about Catholicism a year and a half ago or something. I don't even know what I'm fucking talking about, and I know more. Like, it's... But obviously, obviously, this is all bullshit, and him just saying things to have a response.
Starting point is 00:56:27 Because, yeah, the whole point of people being... well, there's so many reasons why people are upset. But there is literally no justification being given. They've trotted out lots of different theories and ideas and tested the waters with different excuses and rationales. None of it passes muster. None of it makes the cut. And we still don't really fully understand it.
Starting point is 00:56:52 I mean, we know. We know. It's the oil. All the things, whatever. But like they haven't justified it in any way. Everything they're doing is horrible.
Starting point is 00:57:03 And they can't justify it. And so to just cite this thing, it's like, explain yourself. Give an example of how this is a just war based off of this doctrine that you're citing. Do you guys want to have me read his new book that's coming out in June?
Starting point is 00:57:21 Communion: Finding My Way Back to Faith. I'd love to... We should. Maybe we should all read it and do a series of episodes, just a book club for that. About the... About the...
Starting point is 00:57:34 Tied into all this is Pete Hegseth, also a religious guy, who was giving a prayer sermon at the Pentagon, and he was like, I'm going to read a prayer that was given to me by the head of the search and rescue operation that brought back those pilots. And he suggested that it was going to be a reworking of an existing Bible verse.
Starting point is 00:57:59 He says this is meant to reflect Ezekiel 25:17, and if you are a 43-year-old white man, that Bible verse triggers something in your head: the film Pulp Fiction, because Ezekiel 25:17 is the verse that Samuel L. Jackson's character says before he kills a guy in Pulp Fiction. Spoiler alert for the beginning of Pulp Fiction.
Starting point is 00:58:25 But before we play it, Ezekiel 25:17 just says, and I will execute great vengeance upon them with furious rebukes, and they shall know that I am the Lord when I shall lay my vengeance upon them. The verse that Hegseth reads is the considerably longer Tarantino-created verse. Yeah, and to be clear, I do think it's important to mention that this is very stupid and not good. But some people are framing it like, oh, wow, Pete Hegseth thinks this is from the Bible. He doesn't think it's literally from the Bible.
Starting point is 00:59:05 He prefaces it saying it's, like, an interpretation of, or, like, a version of this. He says it's... But I think he thinks that this is a version of a prayer that is from the Bible. I think he thinks the full quote is from the Bible, and the words righteous man have been changed to downed aviator to fit a situation that happened. He thinks it's slightly different. He thinks it's slightly different, instead of from the film Pulp Fiction. From the film Pulp Fiction, right. But maybe not, I don't know.
Starting point is 00:59:40 It's unclear a little bit if he thinks it. But I just wanted to say it's stupid and bad either way. So the prayer is Ezekiel 25:17, and it reads, and pray with me, please: The path of the downed aviator is beset on all sides by the iniquities of the selfish and the tyranny of evil men. Blessed is he who, in the name of camaraderie and duty, shepherds the lost through the valley of darkness, for he is truly his brother's keeper and the finder of lost children. And I will strike down upon thee with great vengeance and furious anger those who attempt to capture and destroy my brother. And you will know
Starting point is 01:00:22 my call sign is Sandy One when I lay my vengeance upon thee. Samuel did it so much better. Much better. Professional actor. Academy Award nominated. Well, Hegseth is a TV personality. That's true. It's really, really silly.
Starting point is 01:00:41 It's silly, but also, we joke about it, we talked about it, and I was going to make this point before this even happened. But their view of so many things is so skewed and bizarre and based on, like, films that they misunderstand. Like, it's very clear, okay, so you get a lot of your, like, religious ideas from, like, Nazi rhetoric and, like, movies and stuff. And actually, that's a fucked-up prayer to have everybody join in in saying. It's like, Hegseth, you get all your religious opinions from, like, fucking movies, and the righteousness of this. And you get your, like, viewpoint of the military from, like, Full Metal Jacket, but in
Starting point is 01:01:30 like a positive light. Like, oh, that's awesome. I fucking love how cool it is. It's just such a... It's dangerous to have these guys be in charge and trying to reshape how these things are. J.D. Vance is actively trying to reshape what, like, religion is and what it can be used as and for, justification for things. And it's just really, really gross. As somebody who's not religious at all, this is still really, really gross and dangerous, and I don't care for these people. It would be funny if the entire Trump administration just kind of went into Pulp Fiction dialogue, like, without really knowing it. Trump's, you know...
Starting point is 01:02:11 I'd like that. You know what they call a Quarter Pounder with Cheese in France? Royale with Cheese, they call it. It's a Royale. I'm the Royale with Cheese. I like McDonald's. And then I looked and I said, I just shot him in the fucking head.
Starting point is 01:02:27 Not all the dialogue. Yeah, maybe not all. He would actually love to quote that. What else have we got before we sign off today? Jonathan, you have some good news for us. Well, we'll do some quick hits with good news. I've got a few news stories.
Starting point is 01:02:42 First of all, who knows because it's just Trump saying it, but apparently Israel and Lebanon will start a 10-day ceasefire on Thursday night. And then Netanyahu and the president of Lebanon. Joseph Aoun, we'll meet at the White House for talks. I mean, have they even really ever honored the ceasefire in Gaza? I mean, right. Like, I don't even want to say like, oh, good, we're done here, but because it's like still dire. But that could potentially be good news and give Trump the cover he needs to stop all of this.
Starting point is 01:03:18 I also have that in the House, 10 Republicans joined Democrats in voting to pass a three-year extension of temporary protected status for the 350,000 Haitians living in the U.S. So that's some genuinely good news. That is wonderful. It seems like if they were doing the thing that Vance and Trump said they were, this is not what they would do. Maybe this is a little piece of evidence that maybe they were lying.
Starting point is 01:03:45 Maybe. Maybe. Maybe it's not true. Maybe it's a Nazi lie that they pushed. Maybe we already know that. But that's genuinely good. That is genuinely good. And it appears.
Starting point is 01:03:55 Trump has nominated Dr. Erica Schwartz, a former deputy surgeon general, or to be the CDC director, she was part of Trump's first term. She was a deputy U.S. Surgeon General, but she appears to be like a normal health, public health official. Like, I think, like, seems like vaccines. More normal than one would expect in the term two. The other guy is, he's got his new podcast and he's, maybe the guy said dark me, but it seems like the guest on his podcast was encouraging people to eat dog meat or whatever he's got going on he's cutting off raccoon penises somebody asked him about it in the hallway he walked away to an answer he was asked sir what did you sir what did you do with the raccoon penis was the question oh also this i just saw this uh the secret service has determined no credible threats to the uGA rally
Starting point is 01:04:50 after Erika Kirk's cancellation. So the U.S. Secret Service is releasing statements saying that Erika Kirk is full of shit. By the way, circling back around to [unclear]. Whatever. You know what? We did it, folks.
Starting point is 01:05:05 Yeah. We talked a lot about AI, and we talked a little bit about [inaudible crosstalk].
Starting point is 01:05:18 Like and subscribe. Yeah, like and subscribe. Thanks for watching and listening. Thanks for being here. Thanks for being here. We've got to go. Very much.
Starting point is 01:05:31 We love you very much. We got it very much. We got it very much love.
