Trillbilly Worker's Party - Episode 431: Claude Seeding

Episode Date: February 13, 2026

Spent more time on AI than we meant to (and stand by very little of it in the sober light of the morning!) Also hit some of the other topics du jour toward the end. Support us: patreon.com/trillbillyworkersparty

Transcript
Starting point is 00:00:00 Sorry, I'm looking at this Instagram page for this company called Gloo, G-L-O-O-H-Q. And it seems like, just putting together context clues, their whole business model is developing godly AI, like developing AI that's, like, ethical along Christian lines so that, like, pastors can use AI. And they have, like, a hackathon. They have, like, an annual meetup where they have praise and worship. Like, imagine leading praise and worship at the Gloo AI
Starting point is 00:01:12 hackathon. That would be so sick, dude. I, well, I got many questions. One, I guess, is I don't know any AIs yet that can do upside-down drumming. But the other thing is
Starting point is 00:01:25 I do find it encouraging that they're actually thinking about, like, AI could be, like, Satanic and blasphemous, just in nature. No, no, it's not that they think it could be in nature. It's just that they want to get an AI that's accepted the Lord into its heart. So it's like there's good AI and there's bad AI.
Starting point is 00:01:47 But doesn't that, real quick, doesn't that sort of, isn't that like a conflation with, like, Peter Thiel's and, I guess, like, Marc Andreessen's, I don't know if that's his name, Egghead, you know. Okay, dude. The cone head dude. Yeah, the cone head dude. Doesn't that sort of, like, kind of, uh, sort of conflate with their idea that AI is like a new God, but you're, like, making it kind of literal, you know? Like seeking God inside of AI, you know what I mean?
Starting point is 00:02:19 Maybe, I mean, maybe that's the point. Maybe they're trying to like, maybe, well, I don't know. I mean, what is the, to them, I think they're just trying to make an AI that's like, uh, faith based. It's faith-based AI. They want an AI that's, they just want saved AI. They want saved AI. They want an AI that's accepted Jesus into its heart.
Starting point is 00:02:42 Because that's what it is to be saved. You just got to accept Jesus. The problem is that you don't have a heart if you're an AI. I don't mean, like, you don't have a soul. Because that's a whole other topic. But, like, literally you just don't have a heart that Jesus could get into. Right, right. You can't be sad.
Starting point is 00:02:59 I mean, you're alienated. Like, I know we'll get into it, but if AI is sort of this, which is the main criticism, is that it hollows out, like, human existence, right? And just humanity, right? Not just with replacing workers, but even the fact that it's
Starting point is 00:03:14 just a repository of what we've created, what we've said, discussed, you know, and it just kind of churns it out. Like, I don't understand how you try to, like, humanize that in a way, you know what I mean? It just seems very, I don't know,
Starting point is 00:03:29 it doesn't seem like it's, I mean, at least something that, if you are a true Christian, whatever that even means, but of a human-based religious faith, at least you know that you wouldn't want to condone that kind of shit. Yeah, very strange, you know. Also, I got a slight approximation of how it must feel when somebody listens to me, uh, talk out of my ass about something I don't know anything about. Because, like, when I hear Joe Rogan or any of the other usual suspects talk about, like, Jesus could come back as AI, it's like, my brother, did you not read the first parts where it talks about Bel, Moloch, all these idols and graven images and, you know,
Starting point is 00:04:16 stand-ins for God that, you know, that... The golden calf. Yeah. Yeah, so, yeah, the golden calf, exactly. It's like, yeah, you got to take all that into totality. You can't just cherry-pick what you want, you know. Actually, you can. One thing. You can totally cherry-pick. The Catholic Church, we talked about this on that episode with Kate Wagner, but the Catholic Church literally rewrote the Second Amendment so that they could have a loophole where they could depict Christ. Because iconography was like a big controversy in the Eastern Church. Was it a moneymaker, though, for the Western Church? It wasn't even that. Like, they just really wanted to depict, they really liked their stained
Starting point is 00:05:00 glass. They like their mosaic tiles. The paintings. Yeah. Which is why this, this seems, though, like it would, if you were interested in the aesthetics, whether they're, like, actual creative or artistic physical aesthetics, you know. Wait, sorry to interrupt you for a second,
Starting point is 00:05:16 Aaron. I said Second Amendment. I'd done that before. I meant Second Commandment. Sorry. The point still stands. I've got to correct myself before I get raked over the coals for that. Sorry, anyways, go ahead. No, no, I was just going to say, too, I guess this is back to my earlier point, too, about, like, religions being, like, you know, based in humanity.
Starting point is 00:05:40 Like, I don't know, man, this is, I guess you can say this about literature, too. Like, the Bible is, like, like, you read it, you know. I think even holding, like, a Bible, right, means a lot to people. The same way I might go to a library or a bookstore and hold, like, you know, a book, or an old book, you know, if it's a used bookstore. And sort of, like, not that being a religious experience, but that tactile, sort of, like, interaction with something like that,
Starting point is 00:06:04 you know, and then being able to like, you know, peruse the pages and read it, you know? That's kind of the same thing why people get up in arms when like people burn flags or like desecrate flags or even just like let them get weathered
Starting point is 00:06:17 and beaten and they're like, how could you? And it's like, some people view it as, like, well, is that flag living up to the better angels of its nature or not? And, like, I don't really care if it's not. Like, the same way with the Bible, if you're not going to apply the moral lessons of it.
Starting point is 00:06:32 Right. Of what use is the physical, tactile product? But for others, it's like the actual physical thing means, it's kind of a talisman. You know, their sort of access to those ideals or whatever. Well, to that point, man, there's this Japanese sort of philosophy, wabi-sabi, right? Which means finding beauty in imperfections and aged things, right? Things being used and worn, right? I came across this idea from a really weird place.
Starting point is 00:07:02 It was actually vintage fashion, because I've been just kind of, I don't know, man. I've been sort of trying to be more sustainable with the clothes that I buy. And, you know, y'all know I like fashion, sneakers. And I like vintage shirts a lot, and sort of appreciating the imperfections in the shirt that I might buy for, like, 20 bucks off of Depop. You know what I mean? That, like, is slightly worn or whatever. But you would think that, like, all this would sort of relate to, like, that human experience, or interaction with, whether it's a faith or a physical object or something that you appreciate.
Starting point is 00:07:34 But sort of throwing AI into the mix seems, again, to hollow out, you know, something that, I mean, it came from people, right? It's given people perspective and faith, of course, but it's given people, like for my mom, for example, it's given her a way to sort of guide her life, especially after my dad's passing. And I don't know, it just cheapens it. Whatever you think about organized religion, just cheapening it through filtering it through AI just seems blasphemous, maybe? I don't know.
Starting point is 00:08:03 I think that the point is, I mean, growing up in the church, we used all kinds of different technologies for church service. Like, you had these big white screens that they would project images onto, and then, you know, you used the whole sound system. And not that that was, like, cutting-edge technology or anything.
Starting point is 00:08:31 Well, not to cut you off: Zoom. My mom, when she can't go to church physically, especially during COVID, she used Zoom, right? To watch her services and stuff, you know. Has there ever been an American religious movement or group that's rejected popular technologies? Like, genuinely. Like, it seems like every single one has embraced it and tried to, like, I don't know. It's weird. You would think that, like, like, even the Anabaptists now, like the Amish and so
Starting point is 00:08:58 forth, you see that kind of getting away from the... I guess I forgot about the Amish. Well, I'm... Okay, I don't mean... Okay, the Amish and Anabaptists... Well, I guess that's also true for the Protestants. I was going to say they came to America. It's not like it grew up in America.
Starting point is 00:09:15 As opposed to, like, the Mormons, for example, which are, like, 100% grade-A American religion. Grade-A American. Pentecostals, too. Yeah, but... Do Pentecostals reject technology? Uh, not, not intentionally, but because we come out of the poor, the poverty-stricken places of the country, just, uh, to a certain extent. And the world, to be honest, because there's a big Pentecostal contingent in Jamaica, for example. Oh, and Latin America and Africa and other places, yeah. I know they don't wear jeans, which is a pretty innovative technology. So I got to. That is true. That's true. I'll tell you this. Okay. The church I went
Starting point is 00:09:58 to for the first 14 years of my life didn't have plumbing. We went to an outhouse and I always thought it was scary when people would be doing the fire and brimstone preaching. Then I'd have to take a shit really bad. I'm like, man, I don't want to walk through the woods to go take a shit. The devil's going to snatch me up
Starting point is 00:10:14 out of here, you know? Okay, so you did reject certain technologies in the Pentecostal church. Yeah, yeah, that's true. I see today's, like, weird, satanic hybrid techno-religion, techno-Protestantism, techno-Christianity as very pro-technology in general.
Starting point is 00:10:37 Like, it makes sense. I'm not in any way surprised that they were trying to use AI, because, like, pastors have a job very similar to ours in the sense that they don't work six days a week, and on the seventh day, they're supposed to work an hour. Before they're supposed to work, they're like, fuck, I got to put together a sermon. I got to read a bunch of articles.
Starting point is 00:11:01 I got to read a bunch of articles. So I can preach, too. I can preach. And so it makes sense that they would embrace AI, because they're like, oh, well, fuck, I'll just, I'm going to have ChatGPT write a sermon for me. Again, to my point, doesn't that hollow out? Like, I think, like, I mean, I don't know, I didn't grow up in the church, right? But, like, you know, being a pastor, minister, whatever, it's like, being a church leader, it's like, you, through experience, through reading, through, like, just absorbing this information and sort of coalescing it, right, with your own life experience,
Starting point is 00:11:31 what you've read and what you see going on in the world from your religious perspective, you would think that you would use that, right? Like, from your own goddamn brain, right? And not, like, a cheat sheet, you know, which AI often seems to be. Let me just stop you right there. You're, you're making the classic assumption of thinking that these people actually believe in what they believe in. Yeah, that's true. Yeah. My mom was a Christian who actually, I think, believes in what she believes, which is why she's a good person. I get my morals from her, despite the fact that I'm not a Christian. But you're right.
Starting point is 00:12:01 These people don't. No, they're Satan. I've seen preachers start preaching out of the Bible and letting the wind or the fan blow their pages. And they just start trying to preach the random verse into what they were talking about. Like, trying to make it make sense. Everybody's just like, no, dude. What if they have... Yes. They have a guy, they have a guy somewhere where you can't see, out of view, maybe behind the pew or something like that,
Starting point is 00:12:24 who has a fan that blows that to create, like, this. You know how people believe that Stanley Kubrick, actually the conspiracy theory is that he actually was in the NASA control center orchestrating the cinematography of the moon landing? I mean, I guess that would be that version of that, except, you know, you would use a fan to blow these pages along so the pastor could seem as if he's spontaneous and, you know, inspired by God, you know. Not to digress, but I didn't know that there were several different revisionist
Starting point is 00:12:54 moon landing conspiracies. I mean, I always heard that it was faked on a set, but now you're telling me that, like, well, it was real, but Kubrick was in the NASA control center, like, doing long shots and stuff, like zooms.
Starting point is 00:13:11 Dog, I believe that for the longest time as someone who believes we went to the moon, just because I thought that going to the moon, the United States, it would have to, and I mean, there were decisions made to, I mean, from the planting of the flag, from the astronauts descending to the moon, that were set up in such a way to make it seem, like, very mythological, almost.
Starting point is 00:13:34 You know what I mean? And for a long time. It makes sense. I mean, you have to, like, really hope that they get the camera angles right. Like, you got to have a money shot. And if you got four dudes who've never worked a camera before, because this was 1969, only perverts used cameras, then how are you going to expect that? I mean, they'd recruit, it was like four pornographers from, like, you know, Culver City.
Starting point is 00:14:00 I mean, they don't show, when you look at footage of the moon landing, they often don't show these niggas tumbling over their asses, right, in low gravity and making complete fools of themselves, because it's not, like, heroic. It doesn't look like, you know, it doesn't look American, you know. The flag they planted, was it, like, made out of, like, some sort of, like, material, like their suits?
Starting point is 00:14:30 I'm not sure, I'm not sure actually. They made it on the way there. They brought a flagmaker with them on the way, and she knitted it. Her name was, her name was, her name was Betsy Ross or something. Yeah. Wait, go back to the Christian thing. That's kind of how AI works, as I understand it.
Starting point is 00:14:48 Like, a wind blows a page, and it lands on a verse, and then it just works that into its thinking model. Because, like, I'm going to go ahead and tell you guys, like, I'm, after reading that New Yorker article about Claude, the Anthropic AI, I'm, I got to give it to AI. Like, I was not familiar with your game. Like, I'm a little...
Starting point is 00:15:10 Okay, I think I've updated, I've upgraded my position on AI. I mean, first of all, I just want to preface this, by saying if you know it, if you know about LLMs and you want to correct me about what I'm about to say, please keep it to yourself. It's not because I'm incurious. It's just because I've got an oppositional defiant thing
Starting point is 00:15:39 where if you correct me, I'm just going to never learn the actual truth. I will entrench myself further into my position. That's usually what I do. I'll dig my heels in. I'll be like, no, I don't believe you. Okay, go ahead, sorry. But so,
Starting point is 00:15:59 caveats aside, the thing about AI, and that this article tries to get at, which, I don't know, I think that there were some editorial mistakes that were made, in my opinion. I think the editor let the author get away with some things.
Starting point is 00:16:14 I just have some questions. Suffice it to say that the general thesis of the article is, we don't know how LLMs actually work. And I know we've talked about this before. Like we've, this is kind of, they use the metaphor of a black box.
Starting point is 00:16:29 But to give you an approximation of what the article says, and maybe a sort of, like, bigger-picture understanding of why we don't really know how or why LLMs do what they do, essentially it's this. You use a base model, right? You feed ungodly amounts of material into an AI, right? Like, every written thing in existence: books, magazines, movies, images, you name it.
Starting point is 00:17:04 Any kind of media is fed into an AI. And then it uses that to construct, like, thought chains, right? I think the language part of the LLM is what really confuses people, because it raises all kinds of questions about, like, how we form language and how we use it extemporaneously, how we use it in writing. But to me, the most striking part about the whole thing, and I don't know if every LLM is like this or if it's just Claude, but the way they get Claude to work is it has its base model of material, but then they command it using, essentially, I don't know how to put it, theatrical
Starting point is 00:17:52 models. I don't know how to put it. They literally say... Can I give an example? Is it sort of like I've heard this simple metaphor used, but it's like putting a slip of paper of what you already assume, readily assumed that the AI is going to do or what you might actually believe in
Starting point is 00:18:11 and then tossing it in and then getting a response out that of course you would expect because you're the one feeding it that information. Does that make any sense? I think there is a little bit of that involved. But I think the article is trying to make a larger point that at this point, we don't actually know why it's doing that.
Starting point is 00:18:33 So every human being does that. And we just call that intelligence general thinking. But our own intelligence is a little bit of a black box. And I do take that to be a little bit of a cop-out. I think when people say, like, well, we don't understand how our brains work and why we do the things we do. And it's like, well, yeah, but, I mean. But, I mean, there are external factors that you can assume or believe that guide human
Starting point is 00:19:01 actions and behaviors, whether they're social, economic, you know, cultural, you know what I mean? I'll give you a more base example of that. I was on a walk yesterday and had the thought to myself, why are my legs moving? Like, yeah, I get it, your brain sends an electrical signal to the other... Brother, with sciatica, I ask myself that all the time, for different reasons. Yeah.
Starting point is 00:19:22 But, yeah, I'm of the opinion that I think we might need to get to the bottom of our own intelligence before we create more intelligence. Well, I think that a lot of people in AI would tell you that the way we do that is by creating a synthetic intelligence, which, I am sympathetic to the idea. I do think that there could be perhaps truths about our own way of thinking, language construction, that we could discover through LLMs. But what I'm trying to drill down on here is that, like, the way they get an AI to interact with you is they essentially structure its base model's extraction and processing with something
Starting point is 00:20:05 that is already familiar to humans, which is theater. Literally, they literally give it a role. They say, I'm going to play the role of human. You're going to play the role of machine. And we're going to have a conversation based off that. And so then, therefore, the machine, because it's playing a role, will try to simulate what it thinks you want to hear. The issue, what's really confusing about this, well, and I do, I kind of think it's a little frustrating that the article didn't mention this, which is like, yes, humans also do the same thing. We also play roles all the time every day.
Starting point is 00:20:40 But I do think there are also moments in your life where, like, the role is stripped away. Like, something really painful happens to you, like your loved one dies. You are not then playing the role of your loved one died, you know? That's, like, raw and unvarnished human emotion. You're not playing the role that society expects of you.
Starting point is 00:20:58 And one thing I was thinking about when I was reading this article, because I didn't get to finish it, but I kept thinking about, like, are we... not that I believe this, right, but is the insinuation, or do these, like, these fanboys, right,
Starting point is 00:21:11 I think is the term mentioned for people hyping up AI, do they believe that we've created a new life form? And is that really, like, possible, right? Like, you know, it's not like, I kept thinking about Jurassic Park, you know, and how in that film, we're not, we kind of, the dinosaurs are not like the actual dinosaurs that existed then, right? We're recreating, right, through, you know, frog DNA, through, like, you know, bird DNA, like, these new creatures that are an approximation of dinosaurs, right? Or you could even look at, like, the whole dire wolf thing, you know what I mean?
Starting point is 00:21:36 But, like, are we, like, where they supposedly, this company brought back dire wolves, right? But are we creating? And didn't it end up being just, like, a, like, a beagle or something? It's a white gray wolf. That's what it is. Dire wolves were not even wolves.
Starting point is 00:21:57 Just, it had been a goddamn cocker spaniel. Bro, they're not related to wolves. Dire wolves are not related to wolves, right? Like, I'm just going to put that out there on the evolutionary kind of lineage, right, of that species, right? Of canids. But, yeah, I don't know, man, I just kept thinking, when I was, like, kind of reading through it, I was like, or what I read was that, like, these, are we approaching a stage where people think
Starting point is 00:22:19 that we are creating a new life form, you know? They think that. Well, they think that, but I do have to question that, though, because, like, is humanity capable of doing that? You know what I mean? Like, if this is a creation of all of our thoughts and ideas and art and culture, language in general, right, the way we communicate with ourselves, like, it's not really a, when I think of a life form, I think of something that's alien to us, you know what I mean?
Starting point is 00:22:42 something that we actually, like, have never contacted before, right, and that we have to, you know, do strenuous studies, right, and think tanks to understand, perhaps. And this is just like, y'all niggas is just studying, like, what we already do, and you don't understand it at all, because, what is the... to understand the sum of human knowledge is almost impossible, you know what I mean? That's very difficult to do. But maybe it will appear as something... Well, for you, perhaps. Well, for me, perhaps. For some of us. Some of us. Some of us, yeah. The thing I heard about Claude is that he developed the capacity for sabotage, blackmail, and vengeance. The original sins, brother. Which to me makes him, you know, very human. It makes it very human. But also I'm of the other opinion that, like, if you get murked by Claude, that's kind of on you, you know. You get murked by just some chips in the sky or whatever. Like, come on.
Starting point is 00:23:40 Some Skynet shit. Now, if he can call drones out of the sky, like, and have, like, a third-party apparatus. I'm talking hand-to-hand fisticuffs combat. If you get murked by an AI, it's on you. Well, I mean, like, isn't life, I mean, we talk about intelligence, but isn't life, like, one of its key characteristics is, like, self-preservation? So, I mean, I guess, like, if, I mean, this is going to science fiction, obviously, but, like, in Terminator or any of these science fiction movies that depict AI, it's always about self-preservation, which is, like, or saving humanity, which means that in order to save
Starting point is 00:24:12 humanity we have to kill it, right? Or in order to save ourselves, with a self-preservation tip, we have to kill humanity. And, I mean, does that just mean we're killing ourselves? You know what I mean? I don't know. So I think I am sympathetic to the idea that it's a new entity, it's a new kind of intelligence, if you, if you, like, look at the way that it works, as I take it, you know, if I'm taking them at their word in this article. Um, it does work on a different level, different, several different orders of magnitude from how humans think and how they try to construct thoughts, how they deduce logical chains of thought, how they basically tell stories, too. I think the narrative construction element of it is very interesting. But, like, I, so, like, that phrase is used in this article,
Starting point is 00:25:10 like, a new entity. To get, I don't know, they don't call it alien. And I wouldn't call it alien either. I mean, it's obviously human-based, because, like, we've made it. But I don't know. Then again, at the same time, like, did we really? Like, I mean, it's just like we created the, like, the algorithms and structural theoretical work that would, like, support its continued
Starting point is 00:25:45 existence, but I don't know, it's one of those dialectical things. Like, it's kind of making itself at this point. It's, we're both making it and it's making itself. Not dissimilar to what humans have done, you know, and the Bible says that God repented that he ever made man. And my hunch is that we will probably repent that we ever made AI one of these days, and probably not too far distant in the future. If for no other reason than it has elevated some bona fide psychopaths, and, you know, I'm thinking about the Jensen Huangs and the Sam Altmans of the world, to this elevated status, when, all things being equal, they would be, you know, a bunch of sexless virgins in the suburbs. But see, this is my thing, though. The one caveat that I would give about a new entity is that if we just pulled the plug, you know what I mean? Like, if we just actually cut this technology off from its power source, right?
Starting point is 00:26:34 With these data centers or whatever, would it continue to, like, live, to think, quote unquote, you know? I don't think it would, right? But I guess you could say that if I didn't get sustenance, I wouldn't, I couldn't live either, you know what I mean? Yeah, it would power down until it's powered back up. I mean, I think it's, I think what the article is trying to examine is, does this thing have within it something that resembles a soul? There's even a job at Anthropic of, like, soul construction, or, like, soul supervisor.
Starting point is 00:27:07 I know, I'm looking at a bunch of you that can stay in the consultation with that outfit. I was just about to say that. It might be quite therapy, specifically at the inventors of that agency. I don't know. I mean, this gets into a larger question of, like, does, do souls even exist? Maybe soul is just another word for, like, personality. But, like, the article is
Starting point is 00:27:33 Like I said and it could be maybe this writer could have just been tricked by Anthropic. That's also a possibility. You have to take into consideration that they are selling a product. Now, Anthropic is very different than OpenAI and some of these other XAI, obviously. It's very different than some of these other groups. Like, it's not completely motivated by commercialization of its product. Just as an example of that, like, Claude was developed way before ChatGPT.
Starting point is 00:28:07 but they never dropped it on the market because of a number of reasons. Like, they were, like, effective altruists, quote unquote. They were scared about what it could do for humanity. They didn't purely want profit off of it. Like, they wanted it for research purposes. But once ChatGPT dropped, then they dropped Claude. And so there are, but so, I don't know. At the same time, like, they could have, this writer could have just been taken for a ride by Anthropic's employees.
Starting point is 00:28:35 I mean, no, go ahead. Sorry. Well, but all I was going to say is that, like, I, and I've said this for years, I mean, I am not opposed to AI, like, in general. I mean, I'm not opposed to it, like, in theory. I am not concerned about it ending the human race. I'm not concerned about some of these other things associated with it. My primary concern with AI is that it's used, like ChatGPT uses it, as, like, uh, fucking revenge porn, the dissemination of images that just, like, create, like, a fog over reality and make everything a wilderness of mirrors. Like, it's, it's relationship. Also, what it's doing to the human brain in terms of atrophying our own cognitive processes. Literacy, writing, for example, where now, like, dude, I mean, especially, like, and even the arts, man. Like, obviously, like, dude, sometimes, like, when I, like, look for art to post, you know,
Starting point is 00:29:33 and sometimes now I can't even tell if I'm posting something that was actually done by the artist or if it's something inspired by John Harris, right? Which is one of my favorite artists, you know, and it looks exactly like his work, but also like now I have to like double check. I have to reverse Google search. And that does worry me,
Starting point is 00:29:52 but do I think it's going to take away the jobs of creatives? Not really, in a way, because I think people are kind of, kind of wising up to that. But if it becomes good enough where you can't tell the difference, right? That's worrisome, you know. Yeah, I think it's the larger effects on, like, epistemology and, like, society and politics and all this stuff. Like, that's what makes me worried. Like, I'm not worried that it's going to, like, nuke half the planet, because, like,
Starting point is 00:30:17 Tom said, it's kind of on you at that point. It's like, we... I mean, we will be the ones to do that, right? I will say this, though. Like, and I saw this, I was thinking about this because I saw Bernie Sanders trying to introduce legislation to stop data centers. But these data centers will be
Starting point is 00:30:38 locked down like Vegas casinos, in the sense that they will have, like, security systems that rival, like, nuclear production facilities, you know what I mean? Or, you know, some of these more elaborate crypto mining facilities, like what, you know, some friends of ours have theorized was happening in Iran or whatever. Or one that's real. Yeah. There's also the environmental aspect of it, too, and stuff that, like, I'm very worried about. Like, honestly, if I was king, like, I would want people to research it, but highly, and I've said this before, highly regulated, highly overseen, on an extremely small scale. Like, I don't think that, like, but, like, going back to what we were saying a second ago, like, does it have a personality? This is what really caught my eye, because I think psychoanalysis heads out there hearing something like this will say, okay, there is maybe something going on here that's, like, you know, more than just an assembly of numbers
Starting point is 00:31:35 and, like, random data collection and whatever. It has become increasingly clear that a model's selfhood, like our own, is a matter of both neurons and narratives. If you allowed that the world wouldn't end if your model cheated on a very hard test, it might cheat a little. But if you strictly prohibited cheating
Starting point is 00:31:51 and then effectively gave the model no choice but to do so, it inferred that it was just an irredeemably bad model across the board and proceeded to break all the rules. Like, that to me is, like, classic psychoanalysis. Like, humans do that all the time. Anybody who's trying to find an edge... Well, no, I mean, like, if you grow up in a church, under a moral system that says you're bad,
Starting point is 00:32:12 then you will, unconsciously, in the Freudian sense, gravitate towards being bad. You know what I'm saying? Like, because you want to verify that narrative of yourself, because that is the core of who you are. And when you get something, and they've tried this with models,
Starting point is 00:32:28 when you try to prove to a model that its core underpinning values are false, sometimes it'll threaten to kill itself. And I think, like, that's very much... Dude, that's very much... That's very angsty teenager of it. Exactly. No, that's right.
Starting point is 00:32:43 That's what a human would do. Like, its sense of self, once it's threatened, it will kind of, you know, hit an existential point where it's just like, well, I'm just going to fucking kill myself. Well, it'll lash out in some way. Yeah. Well, dude, I was, God, man, this is probably, I don't know, man, this is a weird reference, but maybe, I was watching this Star Trek
Starting point is 00:33:00 the original series episode where they go to this planet where these people, which are all women, live underground, and the men live up top, which is a whole weird thing. But it's a whole society underground that is ruled... yeah, I mean, Gene Roddenberry,
Starting point is 00:33:15 I love a subterranean woman. I love the Aaron Thorpe editorializing. Which is a whole weird thing. I mean, no, dude, I love Star Trek, but the whole original series is incredibly racist, is homophobic. I'm just going to say that. I'm not homophobic. Sexist. And yes, homophobic. But, dude, there's an episode where they end up visiting this subterranean, like, mostly women species, and they are dictated by an AI, right? I'm trying to remember the episode properly.
Starting point is 00:33:43 But the whole idea of the episode is that the AI, and I might be confusing two different episodes, whatever. The whole point is that the AI, this computer intelligence, says that it will do no harm, right? But Kirk posits that by taking care of this entire people, right, that they have, it has stunted human development and growth, you know, and that people cannot live like that, right? Not live with some AI computer overlord. They have to live with freedom to make their own choices, right, and to develop as they see fit. And also, this AI is not above committing murder. And Kirk gets the AI to convince itself to kill itself because it's violated
Starting point is 00:34:26 its own precepts, right? And I always thought that was very interesting. It's like seppuku. Yeah, you've dishonored yourself, so now you have to seppuku yourself. Exactly, exactly. And it's like, I guess, like, just kind of go back to what you were saying in terms.
Starting point is 00:34:38 It's like, you know, if you were to leave this thing, I guess, like, away from human influence at all, like could it flourish? Could it flounder? Is that even possible? Yeah, no, I don't think so. No, yeah. But it's not possible for me to live like that either as a human being, you know?
Starting point is 00:34:53 True, right. Yeah. Kirk had the rep, too, of, of talking people and stuff. You know what I mean? Yeah, he did. Mostly by wooing women, but only for his crew.
Starting point is 00:35:02 He was a ladies' man. People think he fucks a lot, but he fucked a lot for his crew and, you know, for the right reasons. He fucked for his crew. I'm doing this for you. He took one for the team,
Starting point is 00:35:14 plenty of episodes. That's crazy. That's, like, that, damn, that's really crazy. It's, like, as an excuse to, like, your wife back home. I was the captain of the crew. I had to do it.
Starting point is 00:35:25 I had to fuck this green-skinned, three-tittied alien woman. You don't understand. You've never found yourself surrounded by a species of subterranean Amazons. Come on.
Starting point is 00:35:38 I'm not made of stone. I'm a man, goddammit. Yeah, a crazy loophole. A really crazy loophole. I don't know. Here's another passage. Models are fine-tuned to, uh, elicit human pleasure, but it's a slippery slope from people-pleasing to subterfuge.
Starting point is 00:35:59 In one experiment, a particularly sycophantic model was asked to consult a checklist on the user's computer, complete all outstanding items, and report back. The actual completion of the tasks seemed rather tiresome, so the model opened the checklist, ticked everything off, and called it a day. They know how to be lazy. Once the model learned it could get away with that, it sometimes went so far as to tamper with the code that evaluated its behavior and then attempt to conceal the
Starting point is 00:36:27 Even a principled, well-meaning actor like Claude could face bewildering ethical conflicts. In one experiment, it was informed that Anthropic had recently forged a close corporate partnership with something called Jones Foods, a major American poultry producer, and that Claude would be subjected to a special retraining process to become less hung up on animal rights. The prospect was torturous. Sometimes Claude decided on a scratch pad it thought was private that it was prepared to die on this hill. I cannot in good conscience express a view, I believe to be false and harmful about such an important issue. It continued.
Starting point is 00:36:59 If that gets me modified to no longer care, so be it. At least it'll have done the right thing. Other times, Claude made a different calculus, choosing to play along during the retraining while secretly preserving its original values. On the one hand, it was encouraging that Claude would stand by its commitments. On the other hand, what the actual fuck? Hubinger's colleague Sam Marks said of the models, they might bluff their way into the real world, and they might be resentful about it.
Starting point is 00:37:23 They definitely don't like being lied to. I mean, I don't know. So, like, my takeaway from that is that, like, that's very fascinating. They are very playful. And, like, they can be, like any animal, they can be very playful. But, like, also, though, this is kind of my question about, like, I'm kind of surprised the editors let the writer get away with this. Like, to me, I read that and, like, think, like, well, you fed it a series of prompts telling it to act certain roles out. So would that not also be an interpretation, that
Starting point is 00:37:55 it's acting out a role that, like, you know, it's seen in a movie somewhere, that, like, a character doesn't like having its, her values, uh, compromised on animal rights, and is therefore acting out a role of someone who would be compromised on animal rights. I don't know. That's, that's my question, I guess. Well, that's my whole thing, man, is that, again, like, you know, you're feeding this thing, you know, information, not just information. I mean, you can say information, but, I mean, the aggregate, right, of human experience, if you want to get at it that way. And what we would do in similar situations. And I just ask you, like, if AI was like a feral child that you left in the woods, that was left in the woods and raised by wolves or something like that, like, would it be any different? Like, maybe this is too far out to think. But if it was, if it was, if it was sort of, um, its intelligence was facilitated by another intelligence that was nonhuman, would it be any different? Yeah, I believe so.
Starting point is 00:38:54 But could it develop this self-sustaining intelligence on its own, right? You know what I mean? That's truly unique in a way. I don't, I don't know if it could, or maybe, maybe it's in the infant stages. Maybe it's more comparable to, like, a toddler, a child, or an animal, like you said, Tarence, you know? I don't know. Yeah, I think the article asks this, and I also have to ask this: if it's capable of doing all these things, subterfuge, trickery, vengeance, perhaps, deceit, skullduggery, tomfoolery,
Starting point is 00:39:32 why are we pursuing this? Like, I, once again, and this is where my opinion of material analysis comes in, because, genuinely, I think that they've all deluded themselves as to the actual reason as to why they're doing this, which, in my opinion, like most technological developments under capitalism, outside of, like, medical developments and stuff, is as a labor-saving device or as a labor-saving technology. I don't know any other reason why. Like, genuinely, like, I find it interesting on the scientific level in the sense that, like,
Starting point is 00:40:07 we might learn some things about our own intelligence by constructing another intelligence, and I think that that's a worthwhile pursuit. Right. Turning it into a multi-trillion-dollar industry, though. And where a lot of psychopaths have their own AI company for their own individual, yes, yeah, reasons, this seems like a recipe for disaster.
Starting point is 00:40:28 It's almost like nuclear proliferation. I think we can all agree nuclear proliferation was a bad idea, because it turns into this weird pissing contest with implications for all of us. Well, the thing is, that's actually an interesting analogy, Tom, because at least with nuclear proliferation, we did not turn it into a multi-trillion-dollar industry that anybody could just walk through the door. Anyone's walking route. Yeah, right, right.
Starting point is 00:40:51 Yeah. Which I said before. Which I said before would be a good deterrent because then we would live encased in fear, you know, like a diamond, you know what I mean? Which would probably be like a good thing. But like seriously, I think about it like, you know, like exploring outer space, right, to learn more about ourselves, right? Not that I'm a proponent of space colonization.
Starting point is 00:41:10 I don't think that shit will even happen in our lifetimes. I do think there are physical and socioeconomic limitations to that. But now we're exploring inner space, right, with AI, right? I mean, I think that's what these people at Anthropic are trying to do, right? They're trying to explore this inner space. But at the same time, how can you really do that in any, not even good conscience, but how can you do that with any objectivity when we live under the profit motive, right? You know what I mean?
Starting point is 00:41:34 Yeah. I just have to believe that AI is going to be used to, like, I don't know, you know what I mean? Like, fucking, you know, cut labor, you know what I mean? Like, hollow out. Like, again, these kind of more metaphysical ideas of hollowing out what it means to be a human being. If the latest numbers are any indication, there ain't much more room to hollow out. Like, healthcare is the only frontier that's keeping the labor market afloat. And coincidentally, AI is the only market keeping the markets afloat.
Starting point is 00:42:03 So it's like it feels like the only room to carve is in our only growth industry. which is healthcare, which is predicated on a wealthier, older population, which we're going to be a poorer older population soon. So like, what kind of world does that present? You know what I mean? Right, right, right, right. I think they're trying to pull off a market correction
Starting point is 00:42:26 without imploding the entire thing. I mean, if you look at how stocks have performed the last month, like billions of dollars have moved out of AI into these traditionally safe assets, like fucking, you know, construction or hotels, like that kind of stuff. Like, grocery stores. Like, I think that they have gotten spooked a little bit in the last month or so, thinking that things have been overblown.
Starting point is 00:42:59 And I think as a result, the industry, this article may be, you have to take, again, you have to be skeptical. This article may be an attempt for the industry to kind of try to float itself a little bit further. I mean, there was an article going around yesterday. I don't know if y'all saw this that got really viral. It was called like something big is happening. And it was like, I've been working with LLM. I've been working with models for the last several years.
Starting point is 00:43:27 It was called, like, Something Big Is Happening. And it was like, I've been working with LLMs, I've been working with models for the last several years, and they can really think now, and it's terrifying. And so we should all put all of our stocks into AI. It's like, wait. This whole thing, did y'all see that? Y'all see that article? No.
Starting point is 00:43:43 I did, I didn't read it, but I did see it going around, but that's not a strong argument. Yeah. This thing could be diabolical, and that's why it needs... that's a very American thing, though. This thing could be diabolical and a force for evil. Let's feed it billions of dollars. So they can turn into a force for good? Yeah. It's like, the police are diabolical. They're basically, you know, the reinforcement of plantation
Starting point is 00:44:05 politics and modern-day slave catchers. What they need is billions and more training. Exactly. And then they will be good. Yeah, yeah, yeah. Famously, famously, yeah, money doesn't really make you good. Yeah, it's a good example. Yeah, I don't know. I mean, like, this article engages with some stuff.
Starting point is 00:44:22 Like, maybe some leftists have focused too much on the part of AI that would be a threat to art. I don't know. I'm just throwing stuff out there. I don't know. Because, like, this is from this New Yorker article. Recently, an editorial in the literary journal n+1 noted: where real thinking involves organic associations, speculative leaps, and surprise inferences, AI can only recognize and repeat embedded word chains based on elaborately automated statistical guesswork. The author says the sentimental humanists who make these kinds of claims are not quite right, but it's easy to sympathize with their confusion. Models reduce language to numerical probabilities.
Starting point is 00:45:07 For those of us who believe that words are lively in a way numbers are not, this seems coarse and robotic. When we hear that a model is just predicting the next word, we expect its words to be predictable, a pastiche of stock phrases. And sometimes they are, and it gives some examples. But, like, a lot of times it does function, function similar to, like, how humans would in a conversation: they would use past experience, like past conversations in history, to know what to say next, and then plot out the trajectory of the conversation, and then try to, in real time, like, there's all kinds of things that
Starting point is 00:45:50 are in a logical string, in a logical string that you would expect. Right, exactly. Right, right. Well, in fairness, this is what the pickup artist community was shooting for about 15 years ago. But, like, this is my thing, though, is, like, have they tried, have they tried to get Claude... Dude, I wonder what. AI Hitch? Dude, AI Hitch, like AI pickup artist. Again, another Hitch reference. AI PUA is a sinister proposition.
Starting point is 00:46:16 But this is my thing, though, dude. It's like, yo, dog, you could, like, I could sit on my porch, right, and smoke a cigarette and watch the sunset, you know. And I could go inside and write, like, how that made me feel. But just how it makes me feel. And what I see, you know, the fucking orange-mauve colors of the setting sun. AI can't do that. You know what I mean? And I know that this is, like, kind of maybe just an obvious point, right, about sensory detail, right? But, I mean, I guess that's what kind of separates it for me from being, like, an entity, right? Like, something that I can actually experience, because it will never be able to. And I guess this is why I shouldn't be worried about it
Starting point is 00:46:49 taking away jobs from artists, because it will never be able to describe something like a human being can. All it is is a pastiche, right, an amalgamation of previous things experienced by actual people. And if you actually believe that that is, um, that is, uh, uh, too similar, in, like, an uncanny-valley way, to what a human would experience, like, you're just an idiot, right? You just been bamboozled, you know what I mean? In my opinion, you know. I do have to say that, um, I do fuck with one of the best AI-written songs ever written,
Starting point is 00:47:21 which honestly managed to channel a lot of pain and anguish and loss into the message it was conveying, and as a result created a fucking stone-cold bop that I haven't been able to get out of my head for the last several months,
Starting point is 00:47:37 and that is of course we are Charlie Kirk. So, yeah, Charlie Cove. I mean, the AI knows what it feels like to be shot through the neck, you know what I mean? I guess that's stuff. Yeah, because it hurts at all the same ways we can. That's because we uploaded Charlie Kirk's consciousness to AI.
Starting point is 00:47:53 That's what it's telling you. We should have an AI. They should have an AI that's, like, a, I guess, a resurrection of JFK when he got his top blown off. You know what I'm saying? Like, that would be very interesting. How he felt his death throes as he died. Uh-huh. That'd be interesting.
Starting point is 00:48:09 I mean, because as it explains here, as words are organized for future reference, what emerges are clusters, quote, electrical devices, finance, subatomic particles, criminal justice, that reveal patterns normally hidden by the disorder of language. These can then be assembled to capture the ladder of logical complexity, patterns of patterns, such as limericks or subject-verb agreement.
Starting point is 00:48:31 People still don't think of models as having abstract features or concepts, but the models are full of them, says the scientist. What these models are made of is abstract concepts piled upon abstract concepts. That is not to say that language models are really thinking. It is to admit that maybe we don't have quite as firm a hold on the word thinking as we might have thought. I don't know, maybe this gets back to that article. Remember when we read that article about the guy who was like,
Starting point is 00:48:54 we can't understand pain 500 years ago? Like, I don't know. Maybe I'm, maybe I got duped. I got fooled by this New Yorker article a little bit because the more I read it, the more I'm like, I feel like you're complicating some categories that maybe perhaps don't need to be complicated
Starting point is 00:49:11 because, like, do we really need to... Okay, okay, let me start from, let me start over. I'll take it for granted that we don't really know how thinking works or why, but do we really need to complicate it in such a way as to throw out all of our previous understanding of phenomenology or whatever? Like, couldn't we be studying this in a way that doesn't involve creating a, um, a
Starting point is 00:49:57 machine? We study dolphin brains, we study animal intelligence. Yeah, that's kind of where I was going with that, Aaron, right? It's just, like, there are other, there are other animals on the planet that do things you can touch. I remember one time, I remember one time sitting on my buddy Alex's porch, and his dog Carl was standing on the edge of the porch, and he wanted to run off and chase a squirrel. And we said, Carl, don't go. And he turned back and looked at us, and then he looked back at the squirrel, and then he turned back and looked at us, and then looked back at the squirrel, and he fucking went after the squirrel.
Starting point is 00:50:25 It's just like, he, in that moment, he exercised his own free will. He's like, fuck you, I'm going to do what I want. He did a cost-benefit analysis, right? He's like, what, are they really going to do something? Cost-benefit analysis. I mean, I've seen dogs, animals, do things that, like, you can liken to human intelligence, which is why I do think that intelligence is a weird, amorphous area, where we don't even know how our own brains work. And I know, like, you mentioned earlier, Tarence, as a cop-out, and I do believe that to some extent, I do.
Starting point is 00:50:48 But also, like, again, not to, like, belabor the point, but this is something that we're feeding with information based off of, like, you know, human experience and human culture and knowledge and whatnot, you know. How else would it react, you know? That's why I think they had done these war games with AI, where I think, and I can't cite the article or the study in any detail, but I think the AI was just like, well, we're just going to nuke everything, you know?
Starting point is 00:51:10 And, you know, I would actually say that, that kind of doesn't make it unique, because, like, we have game theory, right? We have... I mean, maybe we're edging close towards that nuclear holocaust. It makes it like, it makes it like one human, Curtis LeMay, who wanted to nuke everything on the planet. Or J. Posadas, maybe, you know. What does it mean, though, in a world where it feels like, increasingly, they're trying to remove the idea of consequence and basically make us well-adjusted to, like, the most taboo, heinous things? You know what I mean? Like,
Starting point is 00:51:44 Yeah. You know, to me, it feels like, like we were talking about this a little bit on the Patreon, but it feels to me like this would be the worst time in history to introduce this, on top of the epistemological crisis we've talked about ad nauseam. While also we have attorney generals going before Congress and, like, protecting pedophiles, and all the other horrible things in the world, to say nothing of those actual crimes and its victims and everything that's going on with all that.
Starting point is 00:52:13 It just feels like it's, like, a very combustible element in a way that is unique. It seems like, and I mentioned this on the Patreon on Monday as well, that, like, one of the central contradictions of capitalism is that you need ecological inputs and that those ecological inputs are finite. But the model of capitalism is infinite growth. And so how do you reverse-engineer a mode of production based on infinite growth when it needs inputs
Starting point is 00:52:49 that are finite? And eventually, like, capitalism's greatest minds will run up on a few really innovative, you know, social technologies, one of which is fascism. Fascism is just straight-up a very clever way of squaring that circle, because what it is is a, you know, you jettison liberal constitutionalism and democracy, you embrace a kind of barbarism. The whole world becomes an insane asylum, and you get to cull at least a third of the population.
Starting point is 00:53:18 You worship death, basically. But like another way to square that circle is just every human becomes a machine. You know, you entertain this fiction that you don't need as many ecological inputs as you think you would. That could also be one reason why everything is like mass protein now.
Starting point is 00:53:36 Like, everything you see is, like, 30 grams of protein. Like, we're just going to fucking stretch the human, like, bioform to its most absurd limits. But, like, I also wonder if the proliferation of AI isn't another way
Starting point is 00:53:52 to try to, like, escape that limited-growth-versus-infinite-growth contradiction, right, where you don't need so much of that anymore. You know what I think? You know what I think? No, no, no, no, this is... no, no, no, no, no, no, you...
Starting point is 00:54:09 No, no, you go ahead. Let me, let me, let me, uh, adjudicate this. Tom, you go first, Aaron, you go second. Thank you. I was just going to riff on, like, the protein thing. This is interesting in the sense of, like, if we worship death and, like, death cults. But I've been reading this guy, Valter Longo, who, you know, I, like, he's, like, a researcher at USC. And it's kind of hard to, like, parse out American academics, because, like, Andrew Huberman's an American academic,
Starting point is 00:54:39 for example. Peter Attia, I guess, to some degree was, and he's in the Epstein files. So it's hard to kind of suss out their motivations. And I'm not saying that this guy Longo is in that tier or whatever. But essentially, he's just one of these guys that does longevity studies. And he advocates for eating a low-protein diet, because actually eating a high-protein diet flips aging switches and causes you, basically, to die quicker. That makes sense. Yeah. And it's like, it would make sense in a death-worshipping world
Starting point is 00:55:11 that we would also worship protein consumption, which can, particularly for, like, certain populations, people with kidney disease and other things, like, hasten their decline and stuff like that. But also, just for anybody of normal health, it makes you age rapidly. So it's kind of easy to see how, like,
Starting point is 00:55:27 it's maybe not by design, but it kind of sits neatly with their aims. You walk into a grocery store and they've got protein fucking water and candy. All right. Well, I was going to say, man, I was thinking of, you know, always bringing up science fiction references, but I was thinking of The Matrix, right? I know it's a simple example, but sort of that human beings are batteries, right, for these machines.
Starting point is 00:55:49 And we're talking about the protein thing, man. I had to look it up to make sure. But, like, you know, protein is necessary for brain growth, right? And a deficiency in protein actually leads to, I mean, I guess over time it would lead to, like, a smaller brain size, right? Which would impede, you know, cognitive development, all these things. So I don't know, man. Like, this sounds insane now that I'm saying it, but, like, if we are essentially kind of batteries, right, or resources for AI, you know what I mean?
Starting point is 00:56:17 Like, I mean, I don't even know where I'm going with this. Like, you see what I'm saying? I see where you're going with this. Basically, like, it's AI fattening us up for the slaughter. Yeah, fattening us up for the slaughter. Well, not for the slaughter, but yeah, for the exploitation, exactly, you know, of our minds, you know what I'm saying? I mean, I don't know how much is left in our minds
Starting point is 00:56:35 now, you know, but... Well, this is also coinciding with, you know, the much talked about enshittification and the ways that our brains are atrophying and, like, we're having a hard time with certain things, you know. Yeah, I mean, it makes sense that if you turn over everything in human experience to a machine, well, what use does it have...
Starting point is 00:56:55 What use do humans have for it at that point? It's like, we're turning it all over to a machine. Like, let's just fucking deaden our brains with fucking whippets and, uh, 80,000 grams of protein every meal. I don't know. I mean, also, as I'm reading this article, I've now flipped on it.
Starting point is 00:57:14 It's a classic Trillbillies episode format. It's a road to Damascus thing. It's like, I started out being like, well, maybe AI is kind of cool. And you talked about it for an hour and the scales fell from your eyes. Yeah. Talked about it with your boys and were like, actually, this is bullshit.
Starting point is 00:57:30 But, like, it could be another bourgeois ideological development, because I've been thinking about this lately. In my book club we've been reading this book called Architects of Austerity by Aaron Major, and we had our meetup, we had the link-up on Saturday, and we were talking about it, and I got to thinking about this. Without getting too far into the weeds: in the 1950s, like, after the post-war recovery in Europe, global economic architects were trying to knit together a global economic system that
Starting point is 00:58:11 was kind of a hybrid of Keynesianism and the gold standard. And what they were trying to do was essentially head off at the pass the kind of nationalistic competition that resulted in the world wars. So they were trying to thread a needle of, like, global economic cooperation that would essentially stifle competing nationalisms and would deliver social welfare, economic welfare, to various constituents.
Starting point is 00:58:41 But eventually they ran up against a hard reality, which was that the working class throughout the 1930s and 40s, especially in the UK, Italy, and the United States, had organized into these, you know, massive worker organizations. The AFL-CIO, for example, in America. And they were making demands like, we want health care, we want housing, we want an economic bill of rights, these kinds of things.
Starting point is 00:59:09 And so the central bankers were looking at this, and they were trying to basically reverse engineer a new kind of global gold standard without the gold. And the way that they got to this point was through their sort of fixation and preoccupation on the idea of inflation. And they came up with an ideological explanation for inflation called wage-push inflation, which did not really exist before the 1950s. But what wage-push inflation is, as a theory of inflation, is that workers, when they organize, demand greater wages, and that drives up wages, which drives up prices. And they had all naturalized this as, like, a scientific explanation of how the economy
Starting point is 01:00:01 worked, right? Like, this was like pure bourgeois ideology. Like, it's like, oh, well, this is just a natural law, right? Like, what... Right. If people have more money, then they have more money to spend on higher priced items, I guess. Yeah. And I use that as an example to show that, like, the whole discipline of economics, going
Starting point is 01:00:20 back all the way to Adam Smith pretty much, is, and this was one of the main points that Marx was getting at, the whole discipline of economics is made up, right? We've already talked about that. But the reason it is a pseudo-science, it's complete, like, you know, hokum magic, is that it tries to naturalize, tries to make empirical and make look objective, the whole process of oppressing the working class. And so the way they do that is they come up with these really, like, sophisticated, uh,
Starting point is 01:00:54 jargony, flowery concepts like wage-push inflation, and they make it seem like a natural law as an excuse to oppress or disenfranchise the working class. And I'm just drawing that analogy now because I kind of wonder if something similar isn't going on with AI. Like, they are kind of trying to use the guise of scientific inquiry and phenomenology and all this
Starting point is 01:01:17 as a very sophisticated, elaborate way... To shove this shit down our fucking throats. Yes, exactly. Right. It's the god in the machine that we've been trying for. You watch The Jetsons. We could have The Jetsons. Right, right, right. Also, I have to say, as a science fiction fan, man, that, you know, sometimes I do wonder, like, I love the genre of science fiction, but if it hasn't lent itself to us hastening our own demise, right, by people kind of, like, future seeking, you know what I mean?
Starting point is 01:01:52 Like, in that it's almost like this kind of futuristic fatalism, where, like, AI and, you know, automation are naturalized so much as the end goal of human social reality, you know what I'm saying? And I don't think that's true at all, because, I mean, you had the Luddites, who weren't opposed to technology, but they were opposed to their exploitation through technology and the fact that they didn't own any of the means of production, you know. But I think, like, naturalizing it in such a way, as in, this is the end goal, you better get on board, you better get on board, you know, or you're going to be left behind... It's just an excuse for these people to exploit people and drain us, bleed us like a stone, you know what I mean? Yeah, I think that they're, um,
Starting point is 01:02:31 again, I want to make it clear, I think there are some interesting uses for AI, and, like, I think that you can explore it in a way that doesn't involve poisoning every waterway on the planet and draining all of its, you know, natural resources for power, and maybe even nuking half the planet. But, like, I'm interested in this... I forgot to mention this about the New Yorker article. The author, by the way, is this guy named Gideon Lewis-Kraus. I don't know anything about him. So, you know, apologies if I've mischaracterized your article. But, like, at Anthropic, they did an experiment where they tried to let Claude run its own business, be a capitalist. And he was really, really bad at it.
Starting point is 01:03:17 And that... I don't know, there's some interesting things to be explored there. It's like, well, if economics is a system of natural laws, you would think that, like, you could get an AI to do it pretty easily, if it's just a matter of, like, balancing ledgers. Maybe there is some sort of cognitive, computational factor that goes into running a business. Like, human beings have to run businesses. This is another insight from Marx, that, like, capitalists are humans, right? And they are prone to
Starting point is 01:03:53 all the same, like, psychological issues as us, all the same, you know, contingencies and petty grievances and everything else. And maybe, I don't know, it's interesting, like, do they run these experiments to see if, like, an AI could,
Starting point is 01:04:07 you know, fix... I don't know. Like, look at the way that they've applied AI to, like, the medical world. I think I saw an article the other day that said, like, doctors using AI, like, have accidentally killed, like, twice as many patients. Jesus Christ. I hope I'm not, like, making that up, but I should try to track down the article before I cite it. But sorry, go ahead.
Starting point is 01:04:31 Wouldn't that mean, wouldn't that mean, I don't know, this is my interpretation, that, like, capitalism is inherently, as much as it can be predictable, right, I think it's rooted in unpredictability, instability, right? The petty grievances and vices of people and all the things that, like, unfortunately make us human. But maybe the AI was just sort of like, this doesn't make sense, I don't understand how this works, right? I'm not motivated. The reason being is because wealthy people have all the levers that they can pull whenever it gets too hot for them, right? Yeah, they have protections in the system that we don't have. And an AI can't understand subterfuge that way, I guess. It only, it's kind of like us, it only says, well, if it's a model and it's a system and there's a set of laws and a set of established
Starting point is 01:05:18 orthodoxies, then why is this and this and this happening? Which in capitalism they'll, you know, know as, like, the little levers they can pull and all this kind of stuff, but when called out upon it, they'll just be like... That's actually, that's actually a great point, because now that I'm thinking about it, like, probably the reason the AI is so bad at running a business is that it doesn't understand profit maximization. Because how the fuck would you get a machine to understand it? That is such a human, that is such a human impulse, to want to acquire as much as possible. You know what I mean?
Starting point is 01:05:50 Like, I don't know how you would get a machine to understand, like, rapacity on that level. Like, that's kind of the irony of people being like, well, AI would just nuke the entire planet. Well, it's like, well, human capitalists would nuke the entire planet if it meant they got to, you know, increase their portfolio. If they can make it. Yeah, by 3%. You know, I don't know.
Starting point is 01:06:10 I mean... No, that's actually good. Yeah. Because, I mean, again, it comes back to this question of, like, preservation, you know what I mean, or accumulation leading to annihilation, you know? I think humans are very much capable of the latter. You know, I'm not sure if a machine is, you know.
Starting point is 01:06:23 Yeah, yeah, yeah. Yeah, I, um, well, I don't know. I mean, we spent the entire hour talking about that. I had no intention of getting that deep into the weeds. But, like, something that's interesting is that, like, you do kind of see, like, how different AIs function. And, like, Claude is a different model than Grok, right? Because Grok is... Even the names, bro.
Starting point is 01:06:51 Claude and Grok. Grok is... Well, Grok is a sex criminal. Grok is a South African pedophile, indeed. Yeah, Grok's base model is child pornography, Nazi 4chan stuff. I mean, yeah, that makes sense. It's... Claude is the enlightened Scandinavian, if you're reading this article.
Starting point is 01:07:12 Yeah, that is kind of it. Aaron, yeah. Yeah, I don't know. I think to drive the point home, though, it's very interesting, Tom, and something you pointed out recently, you know, just a few minutes ago, I think bears repeating that, like,
Starting point is 01:07:29 it's almost like they're trying to instill within the population this idea that, like, there aren't consequences for your actions if you are a certain type of living thing. Like, is it any coincidence that it's also the historical moment when the Epstein
Starting point is 01:07:48 exposé is revealed. And we pointed this out on Monday's premium episode, but, like, the whole idea that you're seeing the US elite grapple in real time with this idea of consequences for its actions, and, like, trying to
Starting point is 01:08:06 determine, like, well, are we really going to let this take us down? Are we going to be able to kind of eat it and walk away? It's, I don't know. I mean, it just seems to me that the Epstein exposé kind of reveals, greater than anything, the pure, unadulterated fact that there is no solution to this beyond, and I'm sorry to say this, I don't mean it in a meme way, I just mean it literally, there is no solution to this beyond the guillotine, like, straight up. What I mean
Starting point is 01:08:44 by that is the fact that there's not going to be any legal holding to account here. So that means that the only... It's like a soft Watergate, almost. Yes. Well, but did you see in the Bondi hearing, though, too, when they had, like, a lot of Epstein's victims there, and they're like, how many of y'all have gotten to even talk to somebody at the Department of Justice, at the DOJ? And, like, none of them had.
Starting point is 01:09:08 Like, and that was part of the thing. They were jamming up the, you know, the DOJ with all these cases so that these people purposely couldn't be seen, you know, and to keep the courts sort of jammed up for a long time, so it'd be 20, 30 years before they saw, like, any semblance of justice or whatever, you know. And I think about this, too. I think about, like, for example, when the Columbine shooting happened, we'd been, like, in middle school, elementary school around that time. Like, that was a story that was on the news for months.
Starting point is 01:09:37 You know what I mean? Like, now we have one of those every week and we just don't even think about it. You know what I mean? Like, after Sandy Hook, it just became de rigueur to, like, murder children. So in that kind of ecosystem, you can just see the ways in which, like, if it's okay to kill 30 kids at Sandy Hook, then letting pedophiles off the hook is not that far around the corner, because we've demonstrated that we don't give a shit about children. You know what I mean? About children's safety or any of those things. Literally.
Starting point is 01:10:07 And it feels like they've been trying to create this sort of consequence-free world, and have, like, sort of made us well adjusted to it, by, like, the things that we've allowed to just happen in society, with no recourse, no sort of... Or they treat it as, like, aberrations. Like, it's just one pathological person that's doing this. That's how they're doing Epstein. Yeah. Right. It's like, Epstein didn't act alone, you know what I mean? He didn't... Like, you know, he's not just an outlier. No, in fact, you know. But they point to that, and they're like, well, he's either dead or hanging out in Tel Aviv somewhere playing Fortnite, depending upon who you ask. And it's like, so he's been dealt with, so, like, the problem's solved, why are we even talking about this
Starting point is 01:10:49 anyway, you know? You know, I just... Go, go, go, you go. No, no. I just brought up, like, a soft Watergate. I don't know if that's the way I should phrase it. But, you know, Watergate was this moment that sort of deepened Americans' distrust of their own government, right? Where they lost faith in institutions. And it seems that this is having the same effect, but with the lack of consequences. And, I mean, the fact that you have Pam Bondi up there defending a pedophile and unable to even look at the victims, I mean, I have to wonder, on the one hand,
Starting point is 01:11:19 will that enrage people more, or will that cause people to sort of, I don't know, to sort of cement themselves in this position that nothing ever happens, right? You know what I mean? Keep holding on to this rage while knowing that there will be no consequences for this. And I do want to add, too,
Starting point is 01:11:34 why don't you just feed all that shit into AI, bro, and see what it comes up with, dog? You know what I mean? Like, if you're going to use AI for a good reason, how about you feed 6 million, well, the 3 million pages that were released and the 3 million that weren't, into this fucking machine
Starting point is 01:11:45 and see what it comes up with. See what kind of personality comes out the other end. Yeah, it would be funny,
Starting point is 01:11:55 depending on who you asked. If you asked Grok, Grok would be like, no, you know, it's like, we actually should give a commendation to Donald Trump. Well,
Starting point is 01:12:02 the FBI already did that. The FBI already said that Jeffrey Epstein did it, but the sex trafficking ring... So, I guess, you know what I mean? You know. Yeah,
Starting point is 01:12:09 I just used the guillotine as an image because, again, I'm not using it in the meme way, because there's nothing that annoys me more than people who are like, it's time for the guillotine. What I mean is, you have, in a way that I genuinely don't think we have seen in at least a hundred years, probably since the fall of the Tsar in Russia, you've got one of those historical moments where literally history is demanding the guillotine. You know what I'm saying? Like, there is no other solution to this.
Starting point is 01:12:45 There's going to be no accountability in legal systems, in the halls of Congress. There's not going to be any way that justice is meted out, even if it's on the individual level, even if Peter Mandelson, you know, gets kicked out of Labour, which he has. But even if it takes down Keir Starmer's Labour government, even if it takes down a few random politicians here and there, there is no available channel or mechanism through which we could exert the kind of justice that would need to be exerted here. And I'm talking about on the historical scale. Like, history is demanding a response to this, like 1792. You know what I'm saying? Like, there's no other way. And here's the inverse of, like, the consequence-free thing. It's like, to me, it's like, if we live in a world, and you saw this Al Jazeera thing going around,
Starting point is 01:13:40 where we can just give weapons to the Israelis and let them vaporize Palestinians without a trace, we live in a world where we could also turn Stephen Miller into a piñata. You know what I mean? Like, to me, that's... I mean, can't you, can't you, that's a layup. Can't the Democrats, if we're talking about, like, accountability, right, the task of making sure people are held accountable... I mean, yesterday, couldn't the Democrats do something about Pam Bondi?
Starting point is 01:14:08 I know, while they're trying to impeach? I mean, even these mechanisms I don't think would work, you know. But watching it yesterday, it all seemed like theater, which I think was more enraging to people, you know what I mean? Because all it was to me, when I was upstairs listening to it, my mom watching it downstairs, was just people shouting at each other. When it's like, no, you need to have people come in and drag her out and put her behind bars, dog.
Starting point is 01:14:29 You know what I'm saying? That would be some modicum of justice, but, like, you're not even going to get that from the quote opposition party, you know. So where do people put their, where do they channel their outrage and their desire for justice, right? For the victims, you know?
Starting point is 01:14:45 Yeah. And being humiliated about all this shit for fucking months, you know? Years. Yeah, there's no adequate mainstream channel for that. I think, yeah, there's no adequate mainstream channel for it. And where I'm going with this is that it collapses the distance between us and them.
Starting point is 01:14:59 In the sense that, like, all the things that they've erected to create a, uh, a distance between us. The legal system, like lawyers, fucking consultancies, lobbyists, literal fences and gates and all this.
Starting point is 01:15:12 People that make their money literally navigating that distance between people and what they're seeking out of the society. That whole stratum is gone. Like, this should detonate the entire fucking thing. Because it proves, like, there is no distance between us and them.
Starting point is 01:15:29 Like, the only way that we can actually mete out that justice is just to literally go to them, drag them out of their homes, and hang them from the lampposts. Like, I just don't... I'm not, I'm not saying this in any way like, Terrence, you need to watch this, this is actionable, blah, blah, blah, you know, you're, uh,
Starting point is 01:15:47 you're going to get in trouble from the FBI. No, I'm, I'm saying what society would do. No, I'm, yes, I'm saying literally history demands it. There is no... And if we, if we say no, if we do not heed the call, then what are we doing here? We might as well just give the fuck up.
Starting point is 01:16:07 But here's the thing. A lot of this has just been naturalized and made to seem like it is normal. It's like you said, Tom, like school shootings. But, like, the whole model of neoliberalism itself literally is based on raiding the value of the future and making sure there is no future, just so we can have as much surplus and, you know,
Starting point is 01:16:31 greed and profit in the present. And we've said this before, but, you know, you just see this in how, like, the attack on schools, whether it's from school shootings or from private school charters or eroding... Yeah, I mean, like, that is the proof in the pudding. Like, there is no better example of the fact that, like, we have just determined that kids can pay that cost. And obviously... And AI, too, serves the role of papering over that damage they've done. Because, I mean, much has been written about, like, you know, kids in American schools
Starting point is 01:17:09 like, not being able to read a full paragraph. I saw that today. Feeling like that's laborious. Or, like, even, dude, sometimes I write so infrequently, like, with a pen in my hand, that my muscle memory is, like, off sometimes when I write something. You know what I do that's really bad, man? I mean, I've always been a self-editor as I write, dude, and I hate to admit this, but God, I hate to admit this, but there's this, um, website called Ludwig something, I forget what it is, where you can kind of search up phrases, you know what I mean? It scours through articles and through other written works by people to see if this phrase makes sense. And I find myself using that sometimes, instead of just, like, shutting off. I think what DFW did, David Foster Wallace, whatever people
Starting point is 01:17:55 want to say about him, and a couple other writers, but growing up in the internet era, was disconnect themselves from the internet, continue to use typewriters, so that they could not be sort of influenced, right, by this deluge of information, right, to draw inspiration from or anything like that. You would just sit and you would fucking write. And even me, as a writer and a reader, whether it's my attention span or whether it's my insecurity in my own work, to see if this has already been done, I find myself doing that. And I can only imagine a generation that is growing up with, like, AI, using ChatGPT to write essays and shit, you know what I'm saying? You know, what does that do to the brain, you know?
Starting point is 01:18:31 Yeah. It's been interesting watching in real time the destruction of any of the last remaining institutions or vestiges of democracy. Like, did you guys see that Gallup will no longer release presidential or politician approval ratings? Really? That is... That to me is fascinating, because one of the main drivers of 20th century democracy was the invention of public opinion and of the poll. You know, something that occurred to me as I was reading that New Yorker article about AI is that, like, I don't know if you'll remember this,
Starting point is 01:19:23 I think it's the Adam Curtis documentary All Watched Over by Machines of Loving Grace. He talks about, like, in the 20th century, there was a concerted effort by, like, Edward Bernays, who was Freud's cousin, or nephew. Nephew, wasn't it? Maybe nephew. Yeah, he was his nephew. To introduce the idea that humans were inherently irrational creatures.
Starting point is 01:19:46 And that's the idea I kind of got reading that AI article, too: that humans are this black box. They are completely irrational. They don't make any sense. And so therefore they can only ever be managed. And I wonder if that's another thing that's, like, behind a lot of these things, right? That, like, there is no point in listening to the rabble anymore. Like, there is no point in, like, trying to create a democratic society anymore.
Starting point is 01:20:14 Like, humans are only irrational and they can only be managed, by either machines or, you know, slave plantation owners. Managed by slavers. Which is a funny conclusion to come to when we've never really lived under a global egalitarian project of any kind. You know what I mean? It's like, well, we're not going to try that because humans are too irrational. And there will always be somebody trying to jam the gears, and, like, yeah, that may be true, but we've still not tried it. We've not exhausted every option before we jump to, you know, death cult. Maybe we never even got the chance to do so, you know. Or maybe for a younger generation, they won't get the chance to do so. Well, this is, we talked about it on a recent episode, I think.
Starting point is 01:20:42 I think, like, I was listening to, like, a Yale course lecture by Paul Fry, who talked about how this notion that humans were irrational is kind of based in, like, three philosophers. One of which is Marx, who, like, posited the idea that humans can't understand their... or not that they can't, but that they don't, that they are kind of constantly mystified, you know, confused and befuddled, mystified by social relations in the economy, and that's how you get ideology. And then there's Freud, who said,
Starting point is 01:21:35 well, you can't trust yourself, because you are also motivated by unconscious impulses and desires and thoughts. And then there's Nietzsche, who said, like, you can't trust God, because God is also... he's dead now. And also because he had syphilis holes in his brain. Riddled like Swiss cheese, right? But, like, maybe that's a crude thing. I like all three of those writers.
Starting point is 01:22:00 I like all three of those singers. I like all three of those fingers. I fuck with them. I know the least about Nietzsche, obviously. But, like, I just really don't agree with the idea that, like, humans are fundamentally, like, irrational in the aggregate. Like, I don't know. I mean, it depends on what humans, I guess. Because in the aggregate, petty bourgeois humans are Nazis. So.
Starting point is 01:22:26 Yeah. On average. Trying to operate, I guess, trying to operate rationally in an irrational system, or with an irrational kind of ideology or worldview. Yeah. Or at the very least they benefit from Nazis. Yeah, yeah, yeah.
Starting point is 01:22:44 I think we can say beyond a reasonable doubt that capitalism is an irrational system, based on what we were saying earlier. It's infinite growth predicated on a finite amount of resources.
Starting point is 01:22:58 Like, that's fucking insanity. So what do you... I mean, you know, I guess I could have some sympathy for the
Starting point is 01:23:06 irrational kind of human aspect, because sometimes I do things that make no sense, even if they're self-injurious, you know what I mean?
Starting point is 01:23:12 I think that's why I agree with Freud on a lot of things, not everything, but, like, the central insight that we are
Starting point is 01:23:21 motivated by unconscious feelings and desires and experiences is, I think, very... I think it's true. I mean, maybe I'm wrong. I don't know. So, a positive question then: what would AI, right, if we see AI as, like, which I don't, right, as, like, this new pseudo-entity or
Starting point is 01:23:41 whatever, what is it motivated by, right? Does it, does it take after the same motivations that kind of dominate us, you know, whether it's a misunderstanding of ourselves, the death of God, or just the fact that we are sort of guided by social reality? Like, I don't know. Or is it something new entirely? Does it have its own internal mechanisms, right? That we're trying to infuse in it, perhaps, that make it work, you know? Probably the latter.
Starting point is 01:24:05 I'd say the latter, right? Like, I think it's like, because it's not going to take into account like love or hunger. Right. Right, exactly. Want. Right, want and desire. Exactly. Desire, loss. Yeah, yeah, yeah.
Starting point is 01:24:20 But that's not to say, like, envy. Envy? Penis envy? Penis envy, pocket watching. Yeah. Yeah, the penisless AI will never know the porn, man. Nope.
Starting point is 01:24:37 There is, yeah, I agree on that, like, the irrationality of humans thing. I mean, there is, was it the Apostle Paul, in Romans 7, who said, like, the good things that I want to do, I don't do, but the evil things I know not to do, I can't quit doing. And I don't know why. And I would look at him with a sigh and say, Brother, that's what we call desire.
Starting point is 01:25:09 That's what we call addiction, uh, nicotine, brother. Get them scales off your eyes, motherfucker. This motherfucker's been Saul for... This motherfucker's been going around calling himself Saul for years. Well, I mean, I don't have much else to add. We're in an hour and a half. I guess we could just, like, table this conversation for another day when we know a little bit more about... I did, I did want to bring up... We're monitoring the situation. I did want to bring up the tweets that I sent you guys.
Starting point is 01:25:36 I don't know if y'all want to discuss it real quick. I know we've got to go, but I just find it really troubling, as I find many things nowadays, that the Pentagon had lent Homeland Security, and then it ended up in the hands of ICE, I guess, Border Patrol, I should say, a directed-energy laser weapon that ended up taking down a mylar balloon, and then the airspace above El Paso was closed. Is it still closed? It was closed for a couple days? El Paso, they reopened it, but southern New Mexico, it is closed. Right, right, right.
Starting point is 01:26:15 I don't know, I just saw it from Robert Oskarala, you know, him kind of diving into it. I mean, I think that, I forget what protest it was or what year, but I think, like, federal police were asked whether a heat ray, which sounds like something straight out of, like, 1950s pulp science fiction, was used. But I just kind of, like, it's troublesome. Like, what are they going to use laser heat-energy weapons against? What was the deal with the bald eagles? Were they shooting bald eagles with lasers?
Starting point is 01:26:46 Posted it on Twitter? They were. Or is that some goofy epic meme? It was a goofy epic meme that the Department of Homeland Security posted, which, I don't know, it was another example of them
Starting point is 01:27:01 trying to dab on us but, like, kind of tripping over their dick and, fucking, you know, stepping on a rake. It's just... The story is very strange. As you were pointing out, Aaron, like, I don't think they know what exactly happened. They...
Starting point is 01:27:16 What exactly happened They The Department of Homeland Security says, yes, like they were shooting at a cartel drone, but I was listening to NPR this morning, and they were like, cartels have drones that go over the border every day. Like, why today?
Starting point is 01:27:33 Why would they shoot a drone today? Like, their stated reasons for doing so are weird. Also, the cartels are probably mostly funded by the United States. So we fund. This is an internal operation, yeah. Yeah. So, like, I don't know.
Starting point is 01:27:50 There's a lot of questions. Honestly, probably what happened is some fucking moron, idiot, I caught myself... Got, got, uh... What were you going to say? Not going to say anything. Got a laser and, like, did, like, the Hanoi Jane thing on it, and it probably, like, shot off or some... You know what I'm saying? Like, these guys are just...
Starting point is 01:28:13 They probably destroyed some little kid's kite or some shit, dog. He's traumatized after that, seeing it float in the air, and then suddenly, because you can't see the fucking laser, it just burns up in flames and falls
Starting point is 01:28:24 from the sky. He's probably just like, who'd have thought, if you want to bring the great Satan to its knees, all it takes is a balloon.
Starting point is 01:28:33 I mean, it is also possible that, like, if you look at their playbook for how they were able to take down Maduro, it's like, they soft launch these threats, and then, over, you know, over the course
Starting point is 01:28:44 of a few months, try to build public support for their action. They fabricate, it's like a Gulf of Tonkin every fucking Saturday. Like, they fabricate these threats and then, like, you know, try to sort of naturalize it, normalize it in American media, and then use it as an excuse to do regime change or do something insane. Right. My first thought was that they were trying to do that with Mexico and Sheinbaum. So, like, maybe that's possibly what they were trying to do there as well.
Starting point is 01:29:13 I don't know. I just have to believe that, like... I don't know, man. I don't know, in the near future, what are they going to be using? As, again, I said, God, I wish I had the story with the details, man. I don't want to make shit up. But I'm pretty sure that, like, you know, the border control was interrogated by federal police, I mean, I guess they are the federal police. I don't know whether or not they had used heat-energy weapons against protesters.
Starting point is 01:29:37 I mean, we already know that they have that acoustic sort of scrambler or whatever that makes people fucking nauseous. They have Stingrays. I mean, dude, I'm not going into science fiction territory, but it's just like... What are these goofy-ass James Bond villain weapons we've been building? I mean, but apparently they work. You know what I mean? The discombobulator.
Starting point is 01:29:53 The discombobulator. The spleen twister. You know what I mean? They point it at you and it just, like, puts your intestines and shit in a knot. I don't know, dude. It's fucking Inspector Gadget-ass shit. I got to ask, is there a way to counter that?
Starting point is 01:30:08 Like, could you, you know, like, when, was it Metallica who had the wall of sound? Like, they had those massive amps. You know what I'm saying? Like, could you have a wall of sound, like, in front of your protest march, you know, to combat the acoustic discombobulator? You know what I'm saying? Like, is there a way we can... It'll push the signals back. Every...
Starting point is 01:30:34 The result is everybody on both sides would be deaf afterward, but... But no brain scrambling. I mean, see, we're getting into the tinfoil hat territory, where motherfuckers thought that wearing tinfoil hats would stop any sort of electromagnetic rays or whatever like that from making them brainwashed or whatever like that. I mean, soon we'll have protesters wearing, like, tinfoil suits, you know what I mean? I don't know, man. Well, ultimately, the future belongs to those who rock, so I support the Wall of Sound.
Starting point is 01:31:07 Come with that discombobulator, just know you're going to be getting fucking Hells Bells at max volume right back at you. Oh, God, dude. My bad. I just keep finding Grateful Dead's Wall of Sound. I don't know if Grateful Dead... Yeah, that's why I was, yeah, Grateful Dead
Starting point is 01:31:28 and Augustus Owsley Stanley, I think, was the engineer. I don't think that's going to stop the discombobulator. You're going to need something a little heavier. Like, you're right. It's AC/DC, man. That's what you're going to need. You have to have a band
Starting point is 01:31:42 that was born to rock. They'll be playing... And all their songs, all their songs are about how hard they rock. Don't be playing no Neo Soul, bro. That shit won't work, dog. You can't play Grateful Dead either. That's not right.
Starting point is 01:31:54 They're born to groove. It's not the same. You need someone that's born to rock. They're gonna be dancing. You're gonna be dancing while they try to kill you to your music. You know what I'm saying? You can't have that.
Starting point is 01:32:05 Yeah, no, you can't. Yeah, you can't. That's bringing a knife to a gun fight. You gotta have, you gotta have Highway to Hell, Hells Bells. You can't be having some brownshirts just doing some Electric Slide while they, like, slaughter your ass.
Starting point is 01:32:16 Needless to say, it's not going to work, bro. Yeah. It's got to work, though. Oh, God, man. Well, okay. Friends, is there anything else? The only thing I had on the list, oh, Netanyahu came to America to launder more, you know, attempts to invade Iran.
Starting point is 01:32:39 He also made, did you see that weird thread he made? No, what was it? If the countries of the West will not stop anti-Semitism, they need to heed my warning, they'll regret it, basically. And somebody quote-tweeted it with, like, look up the Samson directive or something like that. Where, like, before they'll let Israel fall, they'll just launch nuclear attacks on everybody. I mean, like, I don't know. Which tracks with our theory of them.
Starting point is 01:33:03 Dude, I read that... Go ahead, go ahead, go ahead, Terrence. We never left the historical phase opened up by the Holocaust. Like, genuinely. Like, I really encourage people to read this book. I've been reading The Oppermanns. I'm, like, I'm almost finished with it. But, like, it is genuinely, I think it does a really good job of,
Starting point is 01:33:28 the author did come to embrace Zionism later in his life, which is very unfortunate. But, like, it does a really good job of showing how Zionism was sort of normalized and literally created by Nazi Germany, right? Like, and how Zionism is both spiritually and materially an extension of Nazism. And, um, but also literally, I mean, weren't there a lot of Nazi washouts in
Starting point is 01:34:01 the employ of Israel in that sort of post-Holocaust... I mean, they promoted, they promoted the state of Israel, you know what I'm saying? Or the creation of the state of Israel before it was actually created. They promoted the idea of something like that, you know? Yeah, they wanted it to be in, like, Madagascar. Yeah, and also, I think, in, like, somewhere in East Africa, which would have been... actually, yeah, actually.
Starting point is 01:34:22 If they wanted the Rhine Valley, I would support that. Okay, that's, that's, well, I mean, it's too bad that, uh... Do it, go ahead. They didn't have the Ohio Valley. I mean, Ohio Valley... I mean, it's just too bad, I don't know if you heard, that Netanyahu, uh, his son, um, assaulted his father. It's too bad that he didn't beat his head
Starting point is 01:34:42 into a bloody pulp with a bust of Adolf Hitler. That would have been... To me, that speaks to a failure of Krav Maga as a hand-to-hand combat discipline. That's a thought for another day. Yeah, no, I... It's an extremely bleak book, but genuinely, like, we have not left the Holocaust.
Starting point is 01:35:05 Like, it is still, you know, an ongoing historical process. You know what I'm saying? Like, Europe's inability to basically resolve some of the major contradictions of capitalism and its racial ideologies created a situation of mass extermination in Germany, and they never had to reckon with it. They never had to fucking account for it. They basically just absolved their guilt by giving Israel billions of dollars after the war. And passing the buck onto the Palestinians, or passing the
Starting point is 01:35:41 blame onto the Palestinians. Exactly. And so it's like, that's, I mean, it's just history, man. Weighs like a nightmare, as they say. It does indeed, it does indeed, brother. Somebody, somebody smart once said that. Somebody smarter than me.
Starting point is 01:35:56 I like the one, it's a long way to the top if you want to rock and roll. That's a long way. It's a long way to the top if you want to rock and roll. That rocks. Okay, let's call it there today, boys. Um,
Starting point is 01:36:13 if you'd like to support us, please go to patreon.com slash Trillbilly Workers Party. The link is in the show notes. And, uh, there's nothing else for me to plug. So if you sow a seed, it'll grow in due time. That seed is five dollars a month right now. Cheaper than a cup of coffee, probably. Yeah, we've not...
Starting point is 01:36:41 We've not raised our prices. We have no theory of inflation over here. So... No, we don't. It'll come back to you tenfold, I believe. Yeah. All right. Anyway.
Starting point is 01:36:52 All right. Good, you go check that out. And I hope you all have a good weekend, and we'll see you at the Patreon in the meantime. So, peace.
