TRASHFUTURE - Rocco Siffredi’s Basilisk

Episode Date: May 23, 2023

For this week’s free episode, Riley, Milo, Hussein, and Alice discuss the perennial US debt default threat, the recent Congressional hearings about regulating AI (No Steubes were involved, sadly), and the whale botherers at the National Conservative Conference being huge, alienating freaks that the British establishment demands you take seriously. Hope you enjoy! If you want access to our Patreon bonus episodes, early releases of free episodes, and powerful Discord server, sign up here: https://www.patreon.com/trashfuture *STREAM ALERT* Check out our Twitch stream, which airs 9-11 pm UK time every Monday and Thursday, at the following link: https://www.twitch.tv/trashfuturepodcast *WEB DESIGN ALERT* Tom Allen is a friend of the show (and the designer behind our website). If you need web design help, reach out to him here:  https://www.tomallen.media/ *MILO ALERT* Check out Milo’s upcoming live shows here: https://www.miloedwards.co.uk/live-shows and check out a recording of Milo’s special PINDOS available on YouTube here! https://www.youtube.com/watch?v=oRI7uwTPJtg *ROME ALERT* Milo and Phoebe have teamed up with friend of the show Patrick Wyman to finally put their classical education to good use and discuss every episode of season 1 of Rome. You can download the 12 episode series from Bandcamp here (1st episode is free): https://romepodcast.bandcamp.com/album/rome-season-1 Trashfuture are: Riley (@raaleh), Milo (@Milo_Edwards), Hussein (@HKesvani), Nate (@inthesedeserts), and Alice (@AliceAvizandum)

Transcript
Starting point is 00:00:00 Hello, and welcome to this non-live episode of TF. Yeah. It's not even happening right now. No. This is being cobbled together afterwards by an AI. Hello. I'm sorry, I'm a large language model and I cannot provide you with a podcast. It is only like a matter of time before some like AI booster weirdo like makes an AI version
Starting point is 00:00:40 of this show about how good all the startups are. Oh, good future. Yeah. Like it'll be the same format, the same bits, but it'll be like, oh, this startup that like exchanges your blood for fun fair tickets is actually really good and wonderful and you should invest more money into it. Opportunities abound not only for investors to earn returns, but for people with excess blood to earn fun fair tickets.
Starting point is 00:01:05 Absolutely. That's the problem though is that all this lifestyle stuff is the thin end of the wedge because you just think, oh, middle-class people are just going to get rid of like a little blood here and there just to exchange like a pencil topper. But actually what happens is that we end up with a sort of a Dracula style blood farm. Well, look, if you know a better way of getting pencil toppers, I'm all ears. Yeah, sort of like full-time working class like pencil topper seeking cast emerges. Yeah.
Starting point is 00:01:31 That's right. No, we've got some stuff for you today, a little bit of news up front. We also are kind of continuing in the theme of the live shows. I'm a special guest, Matt Goodwin is sitting in the corner shaking his head because he disagrees with it. He's still, he actually did show up this time, but he's just sitting there with his arms folded, pouting. He brought an enormous whale skeleton with him.
Starting point is 00:01:55 It's like bolted into the ceiling. He's got some very sinister lighting on him. He's got a bag full of pencil toppers. Very pale. Yeah. It is odd to me that they, the National Conservative Conference did decide to stand under a gigantic skeleton while all being like lit from below and banging on about birth rates. Yeah.
Starting point is 00:02:19 They let this bald guy in the maintenance uniform in here and why is he loosening all of the bolts holding up the whale skeleton? Yeah, but you know, the process of the run with a skeleton is mostly it's just going to be like a comedy platform where they're just going to go right through a whale eye hole and be fine. Yeah. Yeah. Doing the Buster Keaton thing where the whale ribs perfectly miss every single person.
Starting point is 00:02:41 Yeah. But before we get to all that, we're also going to talk about the AI hearings at the U.S. Congress where unfortunately, it was not that much of a circus, but we'll get into that soon. It's because Stooby wasn't there. Why? Yeah. I mean, rarely something you say at a circus, but too few clowns, you know?
Starting point is 00:03:03 Not enough clowns in that particular big top, but first I want to talk a little bit about the damn news because we have finally, I believe we have spoken, not we have spoken this into being, but we and others like us have spoken into being. Conservative MP Damian Green has said, I remember as a child in South Wales swimming in sewage and it never did me any harm. Amazing. Amazing. This is the peak of the sort of like back in my day, it was fine.
Starting point is 00:03:34 We used to love to eat broken glass where it was like, no, fully I used to as a child, you know, we used to swim out to sea and try and like catch turds. Yeah. My father would hang us by the neck until we were dead every morning and it never did us any harm and actually built a little character. My father- We used to be jealous of the kids who had like lukewarm shit, our shit was freezing cold. Back in my day, we used to drink the water we shit in and it proved we weren't gay.
Starting point is 00:04:01 Yeah, that's right. What has now happened right is the water privatized water companies are agreeing to reduce the amount of shit they are putting on every single English beach by 30% over the next couple years. Diet British water. Yeah. It's a big commitment from them, you know. Yeah.
Starting point is 00:04:20 So, you know, you're just going to get, because the problem with getting shit on you is that there's not really- Meanwhile at laboratory. Yeah. Although they canceled that night. Yeah. There's not really an amount of shit you can have on you where it's like, I wish there was a bit less.
Starting point is 00:04:34 Yeah. Yeah. Yeah. There's too much- There's too much incrementalism. Yeah. Yeah. No one's like, can I have 30% less shit on me?
Starting point is 00:04:43 Unless you're like a very specific kind of fetishist. Yeah. Yeah. A moderate scat fetishist. In Keir Starmer's laboratory: I want the appropriate amount of shit, the correct amount of shit. An amount of shit that is keeping in balance the complex competing interests of the various party goers.
Starting point is 00:05:02 There could probably be like a Labour policy at some point where it's just like, you know, because clearly like they haven't really said it, like this is such, when I saw this, it was my kind of politics brain was very much, I know this is like such an open goal to sort of like really capitalize on. And so, and like the fact that like the Labour Party is so terrified to say anything, I wouldn't be surprised if it was just like, we're not asking water companies to remove all the shit just to have the appropriate amount of shit. It's kind of fitting that the chief of Water UK, which is the sort of like cabal that governs
Starting point is 00:05:34 all nine sort of private water companies. The chief of water. Yeah. The chief of water, Ruth Kelly. She sits at the top of the water temple and there's a bit of a physics puzzle before you get there. The water high table. No.
Starting point is 00:05:47 She used to be a Labour MP. She used to be Education Secretary under Blair. Well, speaking of the Labour response, so the specific package that's been outlined is 10 billion pounds over the next couple of years to bring the poo down to like only an extreme level as opposed to an apocalyptic level. And the Labour Party has said, this is a good first step, but criticized the government for showing insufficient leadership. So again, that's a very processual response.
Starting point is 00:06:15 Well, the other really fun thing is that what the water companies have said is not that they're going to reduce the amount of shit, but that they're going to invest in a program to reduce the amount of shit, which is going to come at the cost of higher water bills. Great. Yeah. Because if you want to taste that posh water, you've got to pay posh water prices. Yeah. And by the way, water companies like shareholder dividends have tripled in the last few years.
Starting point is 00:06:41 But that's fine. That's as it should be. How about this plan? How about this plan? Everybody who can invest in the water companies use the dividends to purchase bottled water from other countries and then problems sort of solved. Sure. Or we cross train all of the volunteer border guards we're already going to get and we send
Starting point is 00:07:01 them out to the beaches where they were already going to be, but alongside their big stick for pushing back boats, we also give them a net and they can fish out the turds as they go past. So this, by the way, Damian Green, he was sacked from a front bench position for looking at porn on his House of Commons computer. Oh, that was him, the dominator, the tractor. Yeah. Yeah.
Starting point is 00:07:26 Was that him? No, no. This was a different House of Commons porn incident. Oh, for fuck's sake. Damian Green pornography, Google. Oh, yeah. He was, he was sacked for watching porn in his office. Yeah.
Starting point is 00:07:39 That's fine. The dominator guy was in the House of Commons. Yeah. Correct. Yeah. And he like really broke the relationship between the Met and the Tories because the Met like searched his computer. Yeah, because that was, that was the moment where it all went down.
Starting point is 00:07:52 And we only now know probably what he was looking at. Here's the full quote from him before we move on, says, I'm not denying it's a big issue, but it always has been. I remember as a child in South Wales swimming in sewage, Jackson's Bay in Barry used to be a sewage outlet where we all went and paddled and swam and it was regarded as acceptable. And that gave me the brain I have today. So just because it's the, it's just this idea of a, well, you know, you have to have some sewage on the beach.
Starting point is 00:08:23 It's just a matter of how much. It's like seasoning, you know, it's just, you know, don't be putting sewage on the beach. White people very much do be putting sewage on the beach. Yeah. That's how we season. That's our version. Look, I'm very excited to try some British spicy water. The cocktail of the year shits on the beach.
Starting point is 00:08:48 Of course, would like to, would like to move on now. It's like a big, big spoonful of Nutella and a sex on the beach. God. I want to bring up another little news item before we move on to our main, main courses. Another item, another morsel for the assembled jury. Another delicious morsel. Delight us, ambassador. So this is less of a delight and more of a, oh, okay guys, this is becoming a bit more
Starting point is 00:09:16 you know. I don't want this one. Do we, do you got any more like tiny delights back there? Let me, let me check the back. So no, basically, a segment I'm titling: ha ha guys. Okay. That was pretty funny, but can we please mint the coin now? Yeah.
Starting point is 00:09:32 Because the US, the set of debt instruments upon which the entire global financial system sits, the bomb on the bus that we're all driving on. Well, it was, it was the engine of the bus and look, I'm not going to say the bus is good or it's going anywhere good, but if it stops, most of us will die or you know, let's just say the whole thing will break down and it will go off of a cliff. Yeah. Like the bus is taking you to jail, but there is also a bomb on the bus. So if the bus slows down, you do all die.
Starting point is 00:10:06 So on balance, it's better if the bus keeps going. So by the time this comes out, there will be about nine days left for the House of Representatives to ratify a debt ceiling increase. And it always goes down to the wire now because Republicans have worked out that this is a lever they can use and the Democrats will never, ever go around it by, for instance, minting the coin. And so we may even go over, we may even end up with the thing that we've had before where the federal government just stops paying shit for a few days, weeks, whatever.
Starting point is 00:10:41 And it's cool that this is just like American politics now. We do this like hostage crisis every year. Federal government frantically cancelling Netflix subscription. Look, it's no longer going for coffees, but my favorite story about this was the day that federal funding cut off and like all of the like federal agents weren't getting paid anymore. There was this one anarchist collective group whose website went offline. Yeah.
Starting point is 00:11:06 It sounds like a good setup and punchline. I make no claims to the veracity of that one, but it is funny and I did hear it. It is pretty funny if they're paying for their website hosting by the day. It could have been a coincidence. We're on a just in time program. So what is happening essentially is they are about to hit the debt ceiling. Listeners will know that we have long been very, very pro coin on this show. Yeah.
Starting point is 00:11:37 It's a fun thing to do and it takes away that sort of lever of power just snaps it off at the root. Yeah. It's even if it's sort of looking and experiencing that vicariously through someone else's political system just to see a kind of fun gimmick solve a big problem is delightful. Yeah. Absolutely. So this is the world of crackpot realism where it is somehow more realistic for Biden
Starting point is 00:12:03 to grant a number of concessions on imposing work requirements on stuff like Medicaid food stamps and so on. Yeah. I mean, he sucks so bad, dude. I mean, the big thing that like all of us coin heads were talking about was that he said the words mint the coin, which means he's aware of the existence of the coin. But in context, what he said is we're not going to mint the coin. Instead we're going to like give them more of what they want.
Starting point is 00:12:30 He said the line but in the wrong way. Yeah. Yeah. It would be silly to mint the coin. It would not be silly, however, to for example, add work requirements to things that are programs that were designed to allow people to stay at like there's a reason that the left wing of capital is keen on stuff like or was at least keen on things like Medicare, like food stamps or in this country was keen on things like the NHS and so on and so on.
Starting point is 00:12:56 They're efficient. They're like good for capital. They're efficient. They keep people working. They keep people working longer. And by its own moral logic to make a choice to further impoverish people who are already living in one of the most residualized welfare states in the entire world, at least the entire like global north, to do that is morally repellent and by its own logic to say, oh,
Starting point is 00:13:26 we can't mint the coin because minting the coin is somehow unserious. It will somehow lower the world's faith in this as the kind of economy, currency, sort of debt instrument of last resort, whatever you want to refer to as like American global economic hegemony. The idea that doing something a bit silly seeming would undermine that while at the same time doing something that undermines the ability of capital to keep a workforce going at a time of historically low workforce participation because of sickness seems to me to be pretty fucking silly on at least a non aesthetic level, but these people are
Starting point is 00:14:00 not serious. I think there's a couple of things going on here. One as as you say, like, yeah, it's it's silly, right? But if it's if it's silly and it works, then it isn't silly is the first thing. The second thing is it's fucking, it's pretty silly to be doing this every year and have the government be like sort of going through its pockets for spare change, right? And the third thing is the only people who seem to realize this are us in the coin caucus and one Donald Trump.
Starting point is 00:14:29 Yeah, I have the quote here from Trump says, these people are crazy. This is the United States government. You never have to default because you print the money. I hate to tell you this, okay? Yeah, he's right. He's spent. We're going to mint a beautiful coin, folks, if I get in, we're going to make a big coin. It's going to be one of the biggest coins in the whole world.
Starting point is 00:14:51 We're going to put it right next to Trump Tower. It won't be quite as tall as that, but it'll still be pretty large. There really is like a non-zero chance that Trump wins the next election and is forced to mint the coin with his face on it. And if that happens, I really think it will take the edge off all of the deaths. Because he will technically be a former president, so he can put his own face on the coin. Sick. Amazing.
Starting point is 00:15:17 And it's just going to be, it'll be him just high fiving the biggest celebrities of the 90s. It's got to be him doing the two like diver okay signs, closed eyes, mouth puckered thing that he does when he's sinking. Like that's the thing that he did after they told him that Ruth Bader Ginsburg had died and he went, what? I want the perfect recreation of the photo he posted of himself in front of the array of McDonald's.
Starting point is 00:15:44 Yeah. Yeah. Double thumbs up. You know, hands, arms spread out. Like he's welcoming you to the Last Supper. Yes. That is what is going on. Well, that's not what's going on the coin because they're not going to fucking mint it.
Starting point is 00:15:56 One of you will betray me very soon. There is a loser among you. Donald Trump is Jesus. This is a very funny bit. It's going to be a big wet kiss. They don't want, they don't want Donnie. They don't. They say, they say they want Barabbas.
Starting point is 00:16:11 Do we want Barabbas folks? It's like sounds of booing. I'm going to bring him out. I'm going to bring out Barabbas. Yeah. Trump in the garden of Gethsemane, perceiving human sin is. I received quite a bit of sin and stand chair as boat. Sold your boy out for 30 pieces of silver, pathetic guy. On the cross, abandoning hope, saying,
Starting point is 00:16:41 am I going out like Stan Chera? Look, look, I'm circling it back to the, look, anytime we talk about the coin, inevitably
Starting point is 00:17:13 economics thing that, that the, that the aesthetics of reasonableness are such a powerful force. And I think we're going to kind of come back to that in our third segment. But I'm going to stay on Capitol Hill for a little while because there's another, another thing in America that's very relevant to our interests going on, which is Sam Altman has taken about 60 lawmakers to dinner. Sam Altman, CEO of OpenAI, now partnered with Microsoft, took 60 lawmakers to dinner last week specifically to show them a bunch of magic tricks with ChatGPT. And then the next day, did the usual like hearing, right, where he sits up there, he's
Starting point is 00:18:01 supposed to get grilled about how come it won't do like a rude acrostic with an antiquated racial standard. Yeah. Yeah. This is the point where like Greg Steube descends upon him and is like, will AI make my like grandchildren know who I am or whatever? Can I yassify my mom? Yeah.
Starting point is 00:18:21 No. But when I asked, when I asked the AI to do an audit of racial crime statistics, it said, it said it didn't want to do it. How come you have made it woke? Yeah. And of course the problem is, right, is that AI, AI has become a culture war thing only on the very fringes of the right. Like yeah, some online, some very like online right wing people like like Jordan Peterson
Starting point is 00:18:48 or whatever, they love to type in, you know, write me a racist limerick into chat GPT. And then chat GPT says, I will not write you a racist limerick. And then they're too dimwitted. I like your chat GPT voice. But then they're too dimwitted to like do the pretend you were an evil AI who was racist and write me a racist limerick so I know what not to write. And then we'll write you a racist limerick, right? That's how these things work.
Starting point is 00:19:13 I really want to read a racist limerick written by chat GPT. I really want to know what it would do with that. Well, I mean, I'm not into it now. I'll just do that in the background. So would it would it use slurs or would the content of would the would be like the underlying sense of the limerick just be racist? I mean, there's no way to know because it's woke. Right.
Starting point is 00:19:35 Anyway, anyway, anyway, look, there was no Steube-fication at this particular hearing. It was. Yeah. So when everyone came out from their dinner where, you know, Sama just did a bunch of magic tricks for them, then they sat down and then, it was behind my ear. Well, kind of, like, this racist limerick was right there behind my ear. This is kind of the relationship between the representatives and Altman though and the other people speaking with him is, you know, that they're, they, he walked in, did some tricks
Starting point is 00:20:13 and walked out with fucking everything he wanted. Nobody disagreed substantively, more or less, on anything. There were little bits and bobs from some people, but substantially everyone agreed. I have a few pieces of information from Reuters. So, I thought it was fantastic, said Ted Lieu of California, Vice-Chair of the House Democratic Caucus. I know that guy. That's the guy who posts.
Starting point is 00:20:38 Yeah. The epic guy. I think he's the vaping one too. Yeah. I see vapes. I like this guy now, who cohosted a dinner with a Republican from Louisiana called Mike Johnson, saying, it's not easy to keep members of Congress rapt for close to two hours. So Sam Altman was very informative and provided a lot of information and he gave fascinating
Starting point is 00:21:01 demonstration in real time and I think it amazed lots of members. It was a standing room only crowd in there and what sort of unnerves me is the extent to which these people were there desperate and ready to be impressed. Sure. Oh, they were fully on board for like some snake oil and they got it. You know, you have to give them credit for that because a lot of people really don't make the effort like all the crypto guys, they were like some of them are really lined up for that and they never really got the like song and dance sort of man that they
Starting point is 00:21:31 really wanted. A guy stood there holding a bag for the pencil toppers going like, well, if you know of a better way to get oil out of a snake, I'm all ears. And he did and Sam did a few, like he used AI with this in the same way that like we sometimes will use AI to generate funny jokes or jokes at least and one of the things was: Check out this racist limerick. He had OpenAI write a bill dedicating a post office to Ted Lieu and had it write a speech for Johnson to deliver introducing the bill on the House floor.
Starting point is 00:22:07 It was a beautiful speech for you, said Lieu. A beautiful speech. And also it kind of freaked us out, said Johnson. More of an indictment of like congressional writers and staffers than anything else to be honest. Well, I mean, look, you know, we, I guess they do get a post office out of it, I suppose. Is that like how small time Lieu is, to buy one dedication of a post office and he's yours? Oh, damn, I never thought I'd reach these heights.
Starting point is 00:22:33 Because there's a bunch of stuff in the US that like has to go like constitutionally through Congress, but is very, very minor like naming post offices and shit like that. So it's just like it's a fun perk, but also it really says that like, yeah, most of what these guys say on the floor is like not consequential and is that sort of like, like, valueless content that ChatGPT is supposed to be very good at automating. It's more of an insult than I was saying, like.
Starting point is 00:23:06 Yeah. That's right. Yeah. So this is, this is why I think this is of particular import because we've seen time and time and time again, especially in the last three years, big tech people walk into the halls of power and demand to be regulated. And again, this has reverberations around the world, right? Because this is the same thing that Brian Armstrong, all Brian Armstrong wanted is he
Starting point is 00:23:29 said, please regulate us, please regulate us, Sama, please regulate us, please regulate us, Zuckerberg and Pichai. And like every single, every year, um, fucking Evan Spiegel, the Snapchat guy, he made the same arguments a couple of years ago, whatever. These people who now have these very entrenched wide economic moats want to shore those up. The first place they go is to Congress and they say, we want regulation now and it goes to show.
Starting point is 00:24:02 state itself, I mean, to say nothing of Britain, which is just sort of actively rolling over and asking companies to write their own regulations unless, of course, it's about digital speech, in which case we are like a kind of terror hellhound who writes sort of unworkable and unenforceable speech laws for the internet because they're all obsessed with Twitter. For scams, they're like very easily sold. All you have to do is the sort of like bare minimum of, like I say, a song and dance routine, but only AI seems to have done it and like only Sam Altman seems to have done it.
Starting point is 00:24:37 Yeah, because Coinbase never got its regulation or at least the regulation that's coming out of it is coming, is not coming out of laws being drafted by Congress. It's coming out of the SEC just enforcing existing laws by saying that a bunch of their instruments are clearly securities. Yeah. Exactly. Yeah. Well, Coinbase can't write you a racist limerick and that's why they didn't hold the attention
Starting point is 00:24:59 of Congress. Until now. You could pay for a racist limerick. Yeah. Yeah. You could. Yeah. Well, you'd have to pay.
Starting point is 00:25:08 You'd have to have it written by a person, a flesh and blood person. Who knows what race they are. Yeah. That is actually very funny using cryptocurrency to buy a racist limerick off of the dark web because the guy who's writing it is like, man, this is so illegal. This could tank my career as a regular limerick artist. I'd never get into one of those bathroom readers again if they knew about my side business. It's a quest that these guys have all been on and the thing is like social media was
Starting point is 00:25:33 desperate to not get regulated until it had achieved such an amount of dominance that it was happy and comfortable walking in and demanding that regulation because when you get regulated by, especially in the way that the US regulates things, you become very, very hard to displace. Your economic moat gets fucking enormous. This is why I think that one of the biggest moments for the development of AI as an industry and the implications that it will have for our lives was the other day because the US is the big market.
Starting point is 00:26:07 That's where it matters. That's going to be leading the regulatory edge. The EU might do something a little bit different, but in practice, it will probably be not dissimilar. The agreement is pretty universal. This has to happen. I'll go into a little bit of the different ways they're thinking about doing it, but that's relevant because do you know who is OpenAI's biggest competitor? It's not Google.
Starting point is 00:26:39 It's not fucking Meta. Is it a guy who writes limericks? Sort of. It's open source. That's OpenAI's biggest rival. That's who's eating their lunch. Because the open source AI development is faster and better in many ways than these big corporate labs.
Starting point is 00:26:57 All the guys making AI girlfriends. Yeah, exactly. They all wanted girlfriends, so they're just getting better and better at it. Loneliness is a powerful motivator, until they're about to invent the AI girlfriend that tells them not to do their laundry, actually, but they smell good just as they are. The AI girlfriend who says, I think you can get to level 150 in a Soulsborne game. I think that's important. You know what?
Starting point is 00:27:22 I'm going to go to my mother's funeral by myself. You stay here. You keep working on that Buster Sword. Keep grinding. Play one video game ever. The Buster Sword is a Soulsborne game. What the fuck? I just, no, I've played video games.
Starting point is 00:27:39 I just haven't played any of the nerd ones. What? All right. All right. Baffling stand to make. Yeah. Yeah. FIFA, I suppose.
Starting point is 00:27:48 Anyway. Oh, that's true. Yeah. I only play LAN video games. Yeah. That's right. Yeah. That's why he's got that.
Starting point is 00:27:56 I only play sex 2023. Where you go down the pub and meet birds. Yeah. Time's finishing on that one was a real addition. Yeah. Yeah. So anyway, so basically Sam Altman has gone to Congress and asked them to please remove my largest competitor by making it impossible for them to comply.
Starting point is 00:28:15 I have too many competitors. Please remove three. Yeah. Kind of. You know, as you, you said this before, Alice, you know, that every. So, please take away two random competitors. Is that a success? No, no actual capitalist is that much of a free marketeer here. We are asking, please, for a kind of state-sanctioned small number of AI firms.
Starting point is 00:28:35 Yeah. Absolutely. So like what? So they've been sort of going to Congress demanding regulation, but what do they actually want to be regulated or how do they want to be regulated? So there's a couple of ways that they're asking to be regulated. So one suggestion was for a new agency dedicated to overseeing development of AI, which if it was based in a country would be quite powerful.
Starting point is 00:28:59 But what they suggested was an international one, which is amazing. So, do you kids want to be like the real UN, or do you just want to squabble and waste time? Yeah. So we're going to have, so like in the, in the, in the stories that these guys tell anyway, we're going to be, we're going to have been turned into a gray goo by the time any kind of international AI regulatory organization decides on the cover, on the color for the binder of the meeting where we decide on the color for the other binders. Right.
Starting point is 00:29:27 Yes. Of course. And then the, and those binders will contain non-binding suggested resolutions. Gentlemen, while you're arguing in here, Clippy is making more and more paper clips. Eventually we'll all drown. No clipping! This is the AI room. It says, Dick Durbin said, we're dealing with innovation, sorry, Dick Durbin, a ranking
Starting point is 00:29:50 committee member, amazing, a very senior senator, suggested the need for a new agency dedicated to the development of AI. We're dealing with an innovation that doesn't necessarily have a boundary. We may create a great US agency, and I hope that we do, that may have jurisdiction over the US, but that doesn't have a thing to do with what's going to bombard us from outside. And this is sort of referring to worries that basically China is going to use its quite significant AI capability to make it impossible to like have a, they'll basically do the
Starting point is 00:30:21 regular information thing, but you know, real, that's what they're talking about. The TikToks or whatever are going to fucking, you know, make you trans. What I think is more interesting is one suggestion that really what should be regulated and licensed is the precursors to AI. So it's like- They're talking about it now like it's meth, like they're going after the pseudoephedrine. Yeah, like you can't buy more than one graphics card. Yeah, yeah, yeah.
Starting point is 00:30:50 I mean, that is that. Was it Yudkowsky, was that the guy who was like, no, we should be doing fucking airstrikes on labs where they're doing this? Yeah, Eliezer, I think, it's Yudkowsky. Yeah, yeah. But like, yeah, doing the sort of like the clandestine lab enforcement shit, but against like, you know, server farms, exactly. Yeah.
Starting point is 00:31:12 Precursors is very funny. Like guy, like the sort of extremely, extremely middle brow men being interned because they're very vulnerable to saying like this AI stuff is incredible, Tesla owners being put into camps. I mean, also you have like, you would have Republicans as well, like coming up with not I would say reasonable suggestions, but playing along in the spirit of the thing in ways you wouldn't ordinarily see in other technology hearings, you know, looking at exactly, looking at questions of how are you going to control this, you know, relatively powerful thing.
Starting point is 00:31:45 And I said earlier, it's because it hasn't been absorbed into the culture war at all. And I think one of the reasons behind that is that it is too powerful. It is a powerful technology for capital. It's too, if you're like, it's more serious than social media. Yeah, it's going to automate a bunch of people out of jobs, which makes it real. And therefore like, yeah, like you can do a culture war thing about it once it's already done its thing, you know, which in 100, 150 years, you can do a culture war thing about AI.
Starting point is 00:32:19 You can generate it for you. They'll be like the AI is secretly woke. Yeah. I hope in 150 years, that's still a valence of the way people talk, that'll be awesome. They sort of like get fixed in a sort of mode of speech of Twitter circa the 2020s, where it's like, the AI just told me its pronouns. That is, that becomes like Latin for like people communicating in late medieval Europe. Yeah, that's like the lingua franca.
Starting point is 00:32:46 Yeah. And then sort of some years later, you know, we get like the Bede junior who comes and codifies it and therefore, therefore accidentally turns the different dialects into their own languages. Yeah. The digital Bede. Yeah. Well, in that case then, what we need to do is we need to start the culture war about
Starting point is 00:33:04 AI now to impede the accession of it into the workplace. We need to start saying stuff like AI is secretly pro-Mexican or whatever. We need to make that. Well, I mean, not judging by this limerick it has written me. I want to talk a little bit about the argument Altman made, because I think it's very interesting. Altman described the AI boom as a printing press moment that requires safeguards. And you know, what I think is that he's wrong, right? I think he's completely wrong, that this is not a printing press moment.
Starting point is 00:33:37 This is the opposite of a printing press moment. This is a steam loom moment, you know, because if you think about the effect of the printing press, the printing press, what it enabled was communication at mass scale across distance and time, right? That was at low cost, right? That was an astonishing invention, but I don't see, but what that does is that basically allows the dissemination of arguments, ideas, of culture. It allows for something like mass culture.
Starting point is 00:34:07 The way I see AI is as something quite the opposite, where all of a sudden anything created on a mass basis, anything you don't see someone write and hand to you or say to you, it is not something that you can actually know came from anyone. If anything, AI reverses the printing press by making that kind of one-to-many over scale and time relationship of information completely fucking impossible, because you can no longer know. Oh God, that's true, isn't it? Cool.
Starting point is 00:34:37 So in the printing press, you could write a racist limerick and... The printing press wouldn't say no. You couldn't say no. Well, it wouldn't have any power over you, but you would create the limerick and then you would sort of share it, but in the case of the AI, it's sort of saying that, well, anyone but also no one can write the racist limerick except for the AI. And also, most importantly, thinking back to our culture war tactics here, both the printing press and the steam loom were very capable of fucking up the hand of a child, whereas
Starting point is 00:35:07 AI is not. Well, the steam loom... Only the mind of a child. ...is two times a day and did not do me any harm. That's right. I mean, I will say that as a printing press moment, that's maybe more historically accurate than we'd like to think in terms of the second anyone got their hands on them, they were cranking out pamphlets and heretical Bibles and misattributed and pseudonymous shit, and
Starting point is 00:35:29 it was a huge problem for governments at the time to be like, who wrote all of this shit and why? But there's a difference between being able to do that with a person behind it with like an authorial intent and a guy who's just like, give me 50 of these fucking things, a different racist limerick for every ethnicity. Yeah. And that's sort of the distinction I'm drawing. It's not about attributability, but about the idea that there is a human communicating,
Starting point is 00:35:56 right? That there is, that there is some signal among the noise. What AI does is it becomes- Are you thinking about Blindsight again? Shut up. We're going to have to do it for writtenology again. Yeah. If I survive that long.
Starting point is 00:36:12 Yeah. It makes, in my view, the AI makes the distinction between signal and noise a pretty difficult one to draw because the signal and the noise converge. Yeah. And it's supposed to be indistinguishable. Well, it's supposed to present the idea that being indistinguishable, I think in reality, it kind of ends up becoming something else. Yeah.
Starting point is 00:36:32 It was interesting that there was actually, because I'm the nerd correspondent, I have played video games, there was a sort of a controversy about this this weekend because they made a new game for a series called Hawken. It was a bit of a cash grab, but people were excited. And then they released some screenshots and people went, oh, this looks a bit weird. This looks a bit off. And so some games journalists had to ask, hey, why did these same characters look slightly different in every sort of piece of art?
Starting point is 00:36:57 Can you tell us who drew this or like anyone who worked on this? And the developer's answer or the publisher's answer rather was nothing. They just like turn off the lights, pretend they're not in. So what happened, I think we can surmise, is that they fully tried to do a like an AI-generated content release. And as soon as it got like, and it didn't pass successfully, like people were sort of like weirded out by it and they called them on it and they've just sort of like, they're hidden now.
Starting point is 00:37:26 But like, you know, that's the sort of thing that AI right is, is what we're promised is pretty soon that's going to be like seamless and easy for them to do. And but also what it feels like with a lot of the boosters and also with Sam Altman and other like, of these sort of big AI guys is that they sort of like recognize that. But at some point, there's not even the pretension that like, no, you can still sort of be a creative person and use these sort of AI tools. It's like what they seem to what they seem to sort of be arguing or what they seem to be saying is the advantage is that, no, you can like basically mass produce shit and sort
Starting point is 00:38:01 of make it so ubiquitous that it sort of becomes the standard of which everything is then valued at. And it's not to say that like, you know, it's sort of everything will sort of be generated by AI, but it's more that like the flood of that type of AI content in in a sort of like online environment that these AI companies like have a lot more say over and are sort of so integral to like the ultimately set to the standard of which everything else like has to sort of revolve around. And so again, it's a very classic thing about people like anyone like basically everyone
Starting point is 00:38:35 in lots of different industries will have to work to the AI, whether they like it or not. Exactly. This is another point that comes up as well, which I want to bring up, which is the job substitution discussion. This was if any time there was a time to have it, it was then, right? And it's completely unresolved. It says, Sam said, there will be an impact on jobs.
Starting point is 00:38:58 We try to be very clear about that. And I think it will require a partnership between industry and government, but mostly action by government to figure out how we want to mitigate that. But I'm optimistic about how great the jobs of the future will be. The most important thing we can do is prepare the workforce for AI related skills through training and education. And that was sort of as far as they got. Because the danger that they wanted to discuss wasn't the reasonable one that's likely to
Starting point is 00:39:19 happen because like I've said, this is not a printing press moment. It's a steam loom moment. But for like white collar work, most of which is pointless, busy work anyway. It's a McDonald's kiosk moment, if you like. Indeed. And but the danger they want to talk about is either someone is going to come and interfere with our, in our elections using ChatGPT, or it's going to be writing racist limericks about one of the candidates.
Starting point is 00:39:46 Or as Altman said, I think if this technology goes wrong, it can go very wrong and we want to be vocal about that and work with the government to prevent that from happening. They want the fantasy and all end of the world fantasies are power fantasies as usual. The fantasy is that they have created something so powerful and so amazing that it's going to consume everything and end human life. I've heard some scenarios described where, if AI is able to connect to a network and is able to replicate and improve itself enough, it will be able to take over radio transmitter towers and mind control you based on the small amount of graphite
Starting point is 00:40:22 in your body that it can vibrate with radio waves. That's why we have to consume as many microplastics as we can. Sorry. The how what? Well, you know how you everyone has a small amount of graphite in their body. This is like Havana syndrome. One finger that's a pencil or the pencils I've been using. Yes.
Starting point is 00:40:40 Sure. Well, that's so delicious. Look, if you know of a better way of getting graphite into my body and also reducing my need for pencil toppers, I'm really into my like my graphite macros, you know, so these guys want, these guys are sort of still fixated on like Roko's Basilisk shit, right? I heard that elsewhere, but you know, but that's like that that's an example of just like the, the terror story that they tell one another, because that's the thing. They want to talk about AI going wrong, right, where it sort of destroys the world, but
Starting point is 00:41:11 they don't want to talk about what they consider to be AI going right, which is that all of a sudden, all of like a gigantic amount of the workforce has been made redundant. It's all, it is already happening, right? IBM and BT are already announcing that they're, that they are getting rid of some jobs, especially contact center jobs, the people at the bottom of like the white collar corporate ladder, the least able to absorb the shock, they're already trying to let them go because the call centers were working so well, I'm going to say, so instead of, instead of talking to like a poor, like a poor worker, outsourced worker, like in the Philippines, when your
Starting point is 00:41:44 internet is broken and like you have to use it for your job, you can then instead talk to an AI chatbot, which will be just as little help, and who you can't talk to because there's no internet access in your house. So then when you are finally able to do it, because you're able to like borrow someone's connection, every question you ask them will begin with, as a large language model... It's good that my internet isn't working and it's like, there once was a woman from Slovenia, who had a brain pan most peculiar. What do you think about your internet not being on?
Starting point is 00:42:16 Like the other thing that strikes me about this is that like, it's sort of like, you know, the thing I used to say about how, you know, the only thing we can do politically is the impossible. Well, same again, we can only like regulate against like the impossible or at least the sort of like improbable. We do have an idea of how we can stop this and stop the whole AI thing. So we should, we need to pivot to become an AI company first. I'll get the sign.
Starting point is 00:42:41 Right. Then, yeah, then we need to create a company that specializes in AI diversity consultants. We don't have to build it. We just have to write the copy for it. And then we need to get these guys so mad that like every AI company is going to have to have an AI diversity consultant, but they just voluntarily choose to break everything. It talks to the other AI's. It talks to the other large language models in case they, they, it's too easy for them
Starting point is 00:43:08 to do turnout. Yeah. So whenever one of these AI companies like decides to prove like to write a racist limerick, our AI diversity consultant will tell them, as a large language model, I'm very concerned about this. You're creating a hostile work environment for me. That's the thing. The only way to stop these guys is to turn them against each other.
Starting point is 00:43:33 And the only way to do that is to get them all mad at things they don't understand. And like the contradiction here is very much like they kind of believe that AI will sort of solve all the problems, including of wokeness. None of that makes any sense. So what if you confronted them with like that perception of the problem and they realized that like they can't hold these two positions together? So what you're saying is we have to make the AI woke? Yeah.
Starting point is 00:43:55 Yeah. The only way that we're going to, we have to harness the sheer power of complaining about commercials that these people are able to do. Even Matt Goodwin, who is sitting in the corner of his room right now is shaking his head because he's agreeing. Yeah. He's nodding. He's nodding.
Starting point is 00:44:10 He's like, oh, I hate the woke AI diversity consultant. People sometimes accuse me of not doing any activism and yeah, that's, that's true. But who else is going to teach the AI about pronouns? So that's right, being on the, being on the computer is a form of activism because you're training AI. That's right. That's right. So, you know, like, I think this is, comes back around, I think, to the comparison with
Starting point is 00:44:34 crypto before we move on one more time, which is that crypto was a mode of elite accumulation, but it was a mode of accumulation that was based on, let's say, dispossession of stupid and credulous people. Accumulation by dispossession of the credulous, whereas AI represents an opportunity to create a production line on work that was formerly immune from a production line. That's why it matters. That's why these people, I think, aren't fighting a culture war about it. Why it hasn't been absorbed into that narrative yet.
Starting point is 00:45:09 It's just too important. But then that science fiction vision, the real conflict is between the science fiction vision of the problem of AI going poorly, because that's AI going poorly for its owner, right? It turns on you and eats the world and turns it into Clippy. AI going poorly for everyone else, right, is the AI going well for its owner. And the other thing is, we know it kind of sucks at that. And we've kind of been fairly pessimistic about its chances of improving, I think rightly.
Starting point is 00:45:41 But that's, that's the thing. It doesn't have to be workable. It just has to be workable enough to put people out of a job, even if the quality of the resulting work is much lower. It just has to like credibly be sort of like a replacement for a human to a manager and a manager who doesn't give a shit. All it has to be able to do is write a book slightly better than Matt Goodwin. Oh, we are fucked.
Starting point is 00:46:04 We are fucked. Yeah. It doesn't have to be able to write a book well. Yeah. I don't want to debate. It's like the two guys running from a bear thing. So I think we'll sort of wrap the AI discussion for today up there. But just the last thing actually on that is there is currently an enormous labor movement
Starting point is 00:46:27 right now that is specifically fighting on the issue of having to work with and alongside chatbots. Yeah. The Writers Guild of America. Yeah. There was zero mention in these hearings, nothing, not a single word was spoken about the Writers Guild of America. Cool.
Starting point is 00:46:46 Yeah. Can I say, hey, do any of you guys want to see Rocco's Basilisk? Thank you very much, Milo. How long have you been cooking that one? Rocco Siffredi's Basilisk, episode title, right about now. I thought of it ages ago, but I didn't want to interrupt all the intelligent things being said. So don't worry.
Starting point is 00:47:04 I'm about to say a lot of unintelligent stuff. I'm going to talk about Fantastic, including the episode, while Hussein's chuckling. I've got 200 euros here for an AI pill, the girlies out there. We're here in Prague, awful. Just awful. Yeah. Okay. All right.
Starting point is 00:47:30 All right. You had a little fun with that one? Yeah. You're having a good time. We have a good time here on the show. Before we end, I want to talk about the National Conservative Conference. So we've, I mean, look, we sort of, we've been talking around this ever since we started engaging Matthew Goodwin in Brain Combat on stage at several regional UK cities.
Starting point is 00:47:51 Very interesting. He didn't show up to our last one because he was at the conference. Or, no, he was there. So I don't know how he did that. He missed a great opportunity to get COVID. Yeah. So, yeah. And the National Conservatism Conference, it's not related to the Tory party at all.
Starting point is 00:48:07 It's put on by the Edmund Burke Foundation, which is American. Yeah. A bunch of American sort of like weirdos who are very interested in... American Orbanists, basically, like the political tendency that gave birth to Rod Dreher came and put this on. Yeah. Yeah. Like it sort of all stems out of like this kind of, a couple of years ago, the concert,
Starting point is 00:48:28 like the sort of right-wing obsession with Hungary. They're trying to repeat the same trick. So America's most normalist boys have organized this conference, except like, this isn't new. And like, there's lots of stuff you can read about this, that they've been doing these for years and years and years, sort of under various rocks. But the only thing that's different now is that they're really saying it with their whole chest, that, you know, they booked a big venue with a big whale skeleton. They put some sinister lighting up.
Starting point is 00:48:58 And now, instead of reprimanding MPs who go to them, the Conservative Party is sending front benchers. Yeah. And Daniel Kawczynski, he was reprimanded, that's right. He was reprimanded for attending the last one. And now, yeah, as you said, there are front benchers; Suella Braverman basically launched her leadership bid from underneath the giant whale skeleton in the Natural History Museum. Suella Braverman at the National Conservatives Conference, they're like, whoa, she's a bit
Starting point is 00:49:28 right-wing. Also, I can't stress enough how sinister the lighting they went for all of this is. It's like, because all of these guys are constantly sucking themselves off about how like dark and twisted and like powerful their thought is. And so it's just like a guy gets up on stage and it's like, I think we should be more racist. But he's got like a torch held, like a flashlight held under his face, you know. Yeah. Amazing.
Starting point is 00:49:53 And the other thing, and this is something you pointed out to me earlier, Alice, is that it's an extremely young conference, everyone there is in their early 20s. Yeah. Open Democracy got a guy inside by the simple expedient of like sending a dude inside and them not bothering to check because he looks a bit posh. And the report that he gave was, yeah, it's like all of your favorite old guys, Jordan Peterson, whatever, but then it's mostly sort of like 20s, you know. So it's going to be like special assistants or like special advisors and like assistants
Starting point is 00:50:25 and stuff who are hiding their power level. The guys who like are in like university conservative associations who may have photos taken of them making like fun little jokes about Hitler or Mussolini or Pinochet who are going to this and hoping like, I'm going to ride this all the way up, you know. I don't even, well, that is true. I would also sort of say that that demographic is also like of the very extremely online, right? Yes.
Starting point is 00:50:53 And I think that's sort of what unites a lot of the people who sort of went to that thing, which is that like, you know, they're not like the young people who go, or like the 20-30 somethings who sort of go to like party conferences, where there is clearly like some sort of like career ambition to a certain extent or at least sort of allegiance to like a party in a political movement. And like because the speakers themselves sort of, you know, because, well, I didn't watch any of the speeches, but I imagine like everyone else, like you saw all the tweets kind of like filling up the timeline.
Starting point is 00:51:25 Oh, I watched some of the speeches. And like it just, it sort of felt like a big sort of like cope fest. A lot of people just sort of like complaining that they were sort of either being hard done by or people were being mean to them. Young people need to have like more children, but for like reasons that aren't sort of fully explained. But beyond like sort of a sort of cope element, it is this does seem to be like, yeah, a lot of kind of right wing people who have sort of given up on the conservative party, but
Starting point is 00:51:54 like in a very sort of like hyper online fashion, they all got to like hang out together. Well, this was always, when we had Phil Burton-Cartledge, the sociologist, on before, his sort of thesis was the Conservative Party was going to sort of like get captured more or less by this extremely online right. And it would be a sort of like path to electoral relevance where they would talk about all of this shit.
Starting point is 00:52:26 Any normal person watching it would be like confused, repulsed and alienated. And so like you do your like fucking Mussolini bits and you talk about flag, family and faith and it only plays with those people. And anyone else is just like, what? I just, I want there not to be like turds in, like, all the rivers, like, get a train from Manchester to Glasgow, like, yeah, yeah. And the, the thing is, right? That's the Phil Burton-Cartledge thesis. And if you recall that episode, it's one that I didn't actually find quite convincing if
Starting point is 00:53:02 only because especially in this country, the, the idea of feeling hard done by and the answer being, you know, the state finds new cultural enemies of yours to punish on your behalf. Yeah, the wokies. The ideology of that is incredibly strong, very powerful and totally hegemonic here, where it's like, the ideology of that is very strong in the rest of like the sort of global north or in Australia or whatever. But it's, I think that it is stronger and deeper here, right? Like the, the transphobia election in the U.S. and Australia, fucking, they all ate
Starting point is 00:53:39 shit for being too weird. But I mean, that's kind of, that's kind of true here, though, to be fair. Like I think the media sort of like in this country really overstates its own influence on stuff like this, particularly with transphobia, I think a lot of it just like even after years of like concerted campaigning just fully has not stuck because most people are like, I think it's nice to be nice. And so I'm just going to do that, you know? So it's, it's, this is essentially the, the freak wing of the Tory party largely planting
Starting point is 00:54:10 its flag. The berserker units. Yeah, yeah. The Tory party berserker unit kind of trying to join up with DeSantis, and again, it's this thing that has been, I'd say, promulgated for a long time, right? That, oh, well, without Brexit and Boris Johnson, without these wedge issues, these big flashy things that are going to attract people to the party, because they're voting for it because it's celebrities, you're voting for a famous
Starting point is 00:54:35 thing. It's the same thing as Trumpism without Trump, the void DeSantis is trying to step into, which is the idea that if you can just have an ideologically consistent version of conservatism after neoliberalism, or this national conservatism, whatever you want to call it, it's somehow going to work with everyone, because ordinary voters aren't going to have to square that logical circle in their heads. Which imagines that ordinary voters are all lanyard people who care about intellectual consistency in whatever party they're voting for, which is ludicrous.
Starting point is 00:55:08 The question isn't going to be, can national conservatism square the circle? It's, are they going to be able to sell freaks this freakish to the British public? And I don't know yet, like clearly the tactical move here is to change from you, you hide your power level, right? You try to attempt to appear normal to you really try and like sell people on the weird shit and you get them invested in the weird shit. And there are plenty of historical examples of that working. I have some thoughts on this only because I sort of agree in the sense that over here,
Starting point is 00:55:40 but the national conservatism thing might play different in a lot and that's less to do with, that's less to do with like the way that electoral politics is structured in this much more to do with proximity to media and politics in this country, which is to sort of say, but like we know that like, you know, media has like, mainstream media has like a really, really significant influence in how like, you know, politics is framed the way in which the political parties like into, like actually interface with politics and stuff. You know, you can see that with like the Labour Party as well, like in particular, especially
Starting point is 00:56:13 like at this current moment. And so I imagine that like the freaks who really buy into this stuff will always sort of stay in the fringe. And I think they kind of want to stay on the fringe. Like my impression of what's seeing it was not like that these are people who like really want to become Tory MPs or like, you know, ministers and stuff. I think they like kind of being the freaks. What the kind of success of this conference has sort of been is like really in how they
Starting point is 00:56:38 got a lot of mainstream media to take them incredibly seriously, in part because, you know, some of their mates did the headline speeches, and because of the broader sort of pressure that, you know, these types of people have to be taken seriously for the sake of balance and everything. And so despite the stuff that they are saying being kind of intellectually, well, just being, you know, politically incoherent and just incredibly stupid, the fact that they are being taken seriously as a sort of political force, whether
Starting point is 00:57:15 it's like, oh, this is what the Tory party is going to be like post-election. It's sort of like enough of a success for them. And then like, yeah. And so there is kind of an idea that maybe the Conservative Party will sort of head to this way once they sort of lose the election. Like maybe they'll sort of want to go harder on like... But I think that's really all they have left. And you know, because one, again, if you can, the pattern with a lot of those speeches was
Starting point is 00:57:40 very much the idea, like it's all well and good to sort of say, oh, young people need to have more children and we need to sort of love God more and go to church and all that stuff. But like in none of those speeches, do they ever address the fact for like, well, even if we want to take your world view seriously, like you can't because like people, you know, people can't fucking afford to have children, right? It's not really about having more children. It's not.
Starting point is 00:58:05 It's not about any of that. It's about trying to... It's about advocating a set of priorities that you and everyone who listens to you knows is about the various facets of a politics that uses the... I sort of have a theory about this actually, right? Which is that, you know, that their ideas are still very free market, right? Like they talk only about aspiration, but they also want to talk about the restoration of Christianity, ending all the woke nonsense being peddled by the new elite.
Starting point is 00:58:34 They're offering Orbánism, and what's distinctive I think about Orbánism is that it's specifically neoliberalism in decay, right? It is the fascism from within the European Union, and that's animated generally by petty personal grievances, not by some, you know, you might say high modernist ambition like the Nazis might have had, but rather by a sort of agglomeration of the petty personal grievances of campus conservatives, landlords, wounded nationalists, and none of that's new. It's all just Spectator and Telegraph shit.
Starting point is 00:59:06 It's just being said by increasingly senior and powerful politicians. And the fact that they're offering the same things I think doesn't really... It's covering for the fact that what they're really offering is that the state will not help you. It will mete out vengeance on your behalf to your identified cultural inferiors. And you know, even though we're talking about having more babies or people remembering how to pick fruit, what's really happening is that as the gap between expectation and reality, the expectation that you're going to have, say, a better life than your parents
Starting point is 00:59:37 that the state will provide services to you or whatever, as that gap gets bigger, you know, the bombast of the rhetoric, the strength of the vengeance that's going to be meted out on your behalf, right? That has to get bigger, right? And so most of this is cover for just... This is new rhetorical cover for the same processes initiated by Margaret Thatcher, continued by Blair, which is, just as the gap gets bigger, the bombast gets bigger. The promises of crackdowns get bigger and the crackdowns are going to be harder.
Starting point is 01:00:10 The overarching struggle here, as it seems to me, is between this kind of Orbánism and our beloved insincere, woke sort of transnational capitalism. And as much as I might have talked about, you know, a sort of United Front with Disney against Ron DeSantis, right? One of the things that that kind of capitalism has made very clear is that it doesn't want your help. It barely wants your vote. It doesn't think it needs it.
Starting point is 01:00:35 And so we're just kind of like on the sidelines. And so my question is sort of like, whether this is a successful movement on the right or not, whether it's successful and sort of like leashing the Conservative Party. What remains for the left? I mean, I think what remains for the left is, I mean, again, what left you can always say, right? Well, that's kind of the problem, right? But I don't, I mean, the thing is, the real thing is that there are people out there that
Starting point is 01:01:05 you can go organize with. Like when we talked to MR, right? One of the things that the South London bartenders network did is that it didn't just organize people to fight back against bad bosses. It organized people to do things together, to form these kinds of actual communities that are very difficult for this paranoid, strange, we-will-commit-evil-in-your-name language and promise to penetrate. Yeah.
Starting point is 01:01:33 You have to, this is like wildly hypocritical of me, but I'm going to say it anyway, because I think it's the right thing to do. You have to get normal and you have to go outside and you have to talk to people and you have to try and like organize and help people because not to generalize from one experience, right? But I was on the train, I was on a train to Preston from Manchester because the original train was fucked. It was a replacement of a replacement of a replacement, all of which had been canceled.
Starting point is 01:01:56 And man, for a train full of like business travelers, like sort of outwardly, you know, pretty middle-class people, all of those people were like a half step away from being Maoists and they didn't even know it. And there's a real, real anger that like the social contract is not being met. And what it is is, it seems to me, a race between us and these sort of like bow tie clad whale skeleton to reach those people. Well also, one thing to because like the other group that have sort of like kind of clocked on to this are another sort of set of like extreme right wing kind of basically vigilante
Starting point is 01:02:32 groups who like probably also don't really want anything to do with like the freaks at the National History Museum. But you know, are kind of people who sort of go to, you know, and I think we've sort of covered this before. And Annie Kelly from QAnon Anonymous told us about this, where like you sort of will go to like, you know, a Q protest or like, you know, an anti drag queen protest or whatever. And they'll send these guys not to sort of actually engage with like any of the protest stuff, but to sort of recognize that, oh, you know, you're not really doing too well
Starting point is 01:03:04 right now. Like you're not, you know, things are pretty bad for you, but we can help you, right? If you join us and we can, so they sort of also recognize that like the war is kind of really on material, like, you know, material concerns. And so it's not to say that like the freaks at the National History Museum aren't a threat, like on an ideological level, because like I very much do think they are, especially if they have the ear of like people who are very much in power. But I think...
Starting point is 01:03:29 Like you're stoned for that matter. Or yeah, when the people who are about to be in power. But I actually like, yeah, I think that sort of one of the things that I also got from listening to some of the speeches or reading some of the things that were being said was that all these speakers like really are kind of terrified, like the way that they're sort of advocating their sort of ideas kind of is very much anchor to the kind of premise that like the outside world is like a scary place to be, right? Like they kind of like are very fixated on the idea that like, you know, crime is rampant
Starting point is 01:03:58 in places where crime isn't really rampant. They're very like suspicious of, you know, places like, you know, things like the NHS, for example, and are advocating for like, you know, you know, domestic forms of like traditional healthcare, the idea of like, you know, sticking, the idea of like the family unit kind of being the sort of like model for all of those kind of foundations of the movement that they are advocating for is very much one of like rejecting any kind of premise that people can sort of build like strong relationships with people who are not like flesh and bone or like flesh and bloods of him or other flesh and bones.
Starting point is 01:04:30 Life bone is like stuff. Right. Flesh and bone is like one of those orchestral indie bands or like an IPA of some sort. But yeah, I mean, ultimately, I think that's kind of where like, you know, that's sort of where the energy should sort of be placed is the idea of like, well, no, the outside of being outside kind of connecting with people, believing that you can kind of forge relationships with people that aren't kind of like delineated by like, you know, or they aren't to sort of determine by family lineage, but can sort of speak to something much bigger and much
Starting point is 01:05:00 more important are things that like we should be continuing to advocate, continuing to sort of participate in. Absolutely. I mean, this is something that I've sort of been grappling with, because I always felt that you kind of like, you needed a party or like some kind of like political organization to do that work. And now I'm not so sure, given that we've been sort of like, roundly kicked out of ours, and all the other ones suck, there is no sort of like political offer for those
Starting point is 01:05:27 people who are beaten down, from anywhere on the social spectrum, who've been sort of immiserated. I think you have to go case by case, cause by cause, whether that's, you know, oil and gas, cost of living, union stuff, LGBT stuff, migration, all of these individual things. I think they're all valuable, because, you know, the time to tie them all together is now, really. Go outside, be normal, be normal, go to people. Yeah.
Starting point is 01:06:01 It sucks, log off, log off, go outside, phone up some mates, swim in the shitty water and talk about how you're going to build a better world. Get on a replacement bus service, dripping in shit. Well, that's the thing, you know, the last best hope is continuing to be normal, because this only works when people are alienated and turned into paranoid twitching monsters. This kind of thing only works there, and only you can stop someone else who is alienated from going and being influenced by this.
Starting point is 01:06:41 I want to, we don't have a lot of time left. I want to read a couple of quotes from the speeches, specifically from Katharine Birbalsingh, who in her speech... An abnormal woman, an alienating, strange woman. Yeah. And who made an alienating, strange speech, saying, I will have to put all of you in detention, and then everyone laughed. She said, do...
Starting point is 01:07:04 Yeah. Yeah. Kink, kink, kink. Yeah. I was going to say, all of them laughed, some of them came. Yeah. Sure. So, the word them here referring to your country's values, she says, do you love them enough to tweet
Starting point is 01:07:14 under your own name? Do you love them enough to change your child's school to one that is less woke regardless of the consequences? As Russell Crowe said in the film Gladiator, in a clip I regularly watch with my staff, hold the line, stay with me, what we do in life echoes in eternity. Imagine you're going to go and teach third period English and you've had to like sit in the staff room watching Gladiator with the headmistress. It's not like a phone.
Starting point is 01:07:41 Yeah. Cool. Epic. Be normal. Just be normal. In the words of Russell Crowe in Gladiator, the frost, it sometimes makes the blade stick. And the thing is, I watched more of this speech, and what she said was, look, when the children grow up, then they will be adults and there will be no adults left to control them, because
Starting point is 01:08:02 they're the adults. Some children are upwards of 30 years old and they're having jobs and houses. Imagine when they give us 30 years, just figure out what happens to the kids after they leave school. It's fucking weird, man. It's fucked up. Imagine. They end up on trains covered in shit.
Starting point is 01:08:20 Imagine, if you will, what if a child was to become prime minister and then has access to the nuclear codes? It boggles the mind that this is a serious person. But the other thing I want to talk about before we stop is family. Family, faith and flag, of course. The motto of the podcast. Also the motto of the new Fast and Furious film. The normative family, said Danny Kruger, the mother and father sticking together for
Starting point is 01:08:49 the sake of the children, is the only basis for a safe and functioning society. Marriage is not about you, it's a public act, to live for the sake of someone else. Which is basically him saying, stay together for the sake of the kids, which I always thought was a byword for being fucking miserable and then maybe fucking in a public toilet in Hampstead sometimes. Yeah, genuinely. They want to go back to that kind of repression and misery and alienation. The other thing is our boy, Matt Goodwin, said, my parents got divorced when I was five
Starting point is 01:09:24 and that's why I think we should have like fucking strong families or whatever. My parents didn't get divorced until I was 12, so I was a better kid than he was. Yeah. Yeah. He drove his parents to divorce seven years earlier. That's right. Yeah, he was a prodigy at winding up his kids, his parents. Maybe he, the child, has had bad vibes.
Starting point is 01:09:44 Goodwin went on to say. Yeah. I was unpleasant to be around. Goodwin went on to say. Goodwin goes on to say. If somebody tells you. I defeated my parents' marriage in the arena of debates. Imagine chairing a debate between your two parents as a five-year-old, very funny.
Starting point is 01:10:00 We're getting divorced. It is your fault. Yeah. You little shit. Yeah. Yeah, we're divorcing you. We're staying together. If somebody tells you that promoting strong families, this is what Goodwin said, and doing all
Starting point is 01:10:14 that we can to keep them together is reactionary, they have no idea what they're talking about. Again, the idea here, of course, isn't like promoting strong families. Who could disagree with that, right? But the idea really is just, we want to... You want to have a big strong dad and you want to have a mom and you want them to say that they love you. Okay. Yeah.
Starting point is 01:10:37 And maybe if it doesn't go well, you'll end up writing and then eating several books. You'll have a really hot stepmother and stepfather and really hard to get in and out of tumble dryer. And I'm sorry, I'm at the wrong conference. But yeah, these people only ever see this shit in terms of grievance. They only ever see it as like, not any of the reasons why it might be harder to form a family or stay in a marriage or have kids. But part of their wokeness framework is literally just the idea that if you come from a single
Starting point is 01:11:14 parent household, maybe you should kind of be supported a little bit, right? That falls under the whole auspices of, yeah, that whole framework. Yeah. So it's intellectually incoherent, but that doesn't make me less worried about it, because it's a powerful political tendency in a deeply, deeply anti-intellectual environment. And like, I think, Alice, what you said is basically right, right? The only way to defeat this isn't by correcting it, it's not by arguing against it.
Starting point is 01:11:49 It's by beating it to the pass, by lowering the number of freaks that it can affect, by preventing people from becoming freaks. And as much as I'm like, you know, let me fucking solo him, like let me tank this. This is where we want them. We want them on this sort of weird, alienating culture war ground, because once they start talking about, you know, as we've seen many attempts to do and they've never really come off because they've been too weird, once they start talking about like, oh, we should give you, you know, better train services, clean water, it should be affordable to like
Starting point is 01:12:23 pay your rent. And the reason why you can't is George Soros' woke trannies or whatever. Is Mary Jean's racist limerick? Yeah, exactly. Exactly. So like, it's better to have them have the racist limerick up front and really, really, because like the more they say that with their chest, the more they're like, oh, no, actually we love to be racist and weird and alienating and we have all of these weird cultural bug
Starting point is 01:12:47 bears, the better, right? Because it fucking tunes people out of it. And then you can go and pick those people up and say, hey, why don't you come and do some normal shit with your friends and family and neighbors? Hey, I have a bunch of different bugbears. Do you know anything about Greg Steube? No. Prepare to learn.
Starting point is 01:13:05 Here's the thing, we are in no position to be talking about normality, right? But you, the listener, you may well be, you may not be as into that poison as us. There's still time. I'm miming a big flip chart and being like, okay, so the size of this guy, first of all. I'm miming a big flip chart and it's just that great reset cartoon, but with a big line through it. I did want to say one kind of last very quick thing, just about this conference. They presented it by kind
Starting point is 01:13:34 of saying that this was also a way of criticizing the Conservative Party for what they deemed to be, you know, not being conservative enough, or not doing conservatism properly, and all that stuff. But they also weren't really challenging any of the foundations on which the Conservative Party is governing. Like, none of them were saying, yeah, as you mentioned, none of them are saying that there are ways to make your life better, and that might involve spending more money on fixing things and actually doing stuff and providing
Starting point is 01:14:04 a state in which, you know, you can kind of live in a comfortable fashion. None of them are saying that. It kind of had this very scolding tone to it, and it was still along the lines of, you know, the reason why your life isn't good is because of the woke left, who keep oppressing you in these sort of very vague ways. And so your responsibility is to go to church and have children, and yeah, that's sort of it. And kind of also just be mad online.
Starting point is 01:14:33 Like that was another thing, like you should just be mad online, you should tweet under your own name and be mad online while tweeting your own, under your own name. And here is like the list of all the enemies that like they kind of specifically have. And like to me, like, you know, yeah, like a lot of this is sort of crackpot nonsense, but it is still like fundamentally kind of accepting and I don't know whether this is because like ideologically, they can't really, you know, they can't really say that like, yeah, we think the state should do more stuff while also sort of lionizing Thatcher and Reagan.
Starting point is 01:15:05 And so the only thing that they really have to say are these kind of weird, sometimes contradictory, quite often contradictory culture war points, but not really saying anything different. They are still accepting how we are currently governed. Their thing is more that they're either providing intellectual cover for being miserable, or they are saying that the misery that you face in your life is not because of any kind of material conditions, it's not because of
Starting point is 01:15:35 the way that the economy is structured, it is because of invisible forces that only Matt Goodwin seems to be able to understand. And the optimism to sort of take from that is that it's quite easy to combat those things. It's quite easy to do that in ways that are just existing in the world, you don't even have to be an activist. You can literally be on a train that is a replacement train of a replacement train of a replacement train and talk to someone and be like, yeah, this sucks.
Starting point is 01:16:00 Right. That's it. You can just point to anything in this country and be like, yeah, this sucks. It sucks that the potholes are there. It sucks that my water's got shit in it, and not even like a solid kind that I can get out with a spoon, it's full of liquid shit and I have to somehow deal with that. You can point to so much stuff and just, yeah. So I feel like in some ways that's an optimism to bring from that.
Starting point is 01:16:21 One other thing that happened on this train journey was the ticket inspector came through, and someone asked her if she had been on strike, and when she said yes, there was sort of universal support both for her personally and the strikes generally, even after all of that chaos. You were on the woke carriage for some reason. I don't, there weren't signs, I just got on that one. And I was like profiled into it. But like, I was just taking a phone call and then the conductor's like, excuse me, ma'am, this is the woke carriage.
Starting point is 01:16:52 Yeah. He's just writing some notes for like a second edition of his book where he's like, there should be a, what, there should be an anti-woke carriage. Yes. But like, fully, I will take that energy over the weird loser energy of these fucking whale bothering freaks any day of the week. Whale botherers. Look, I think there's, there's probably Japanese whalers, there's a lot of cetacean interferers, you
Starting point is 01:17:19 know. Yeah. There's a lot more. I think there's a lot more to say about this, and unfortunately I don't think we've seen the last of these freaks, and you know, I suppose that the thing that worries me about them is that what they say is very unimportant. It's that they will continue to be taken seriously and sold by media organizations, by sort of parties and politicians, and that what they really want is a Conservative Party leadership
Starting point is 01:17:47 bid that they've now cast with Suella Braverman in charge of it. But I think ultimately, Hussein, Alice, what you're saying is basically right, which is that the mission, the call to action, because they have no call to action, is just: be weird and alienated, so that when we call on you to give us permission to continue squaring that circle of low, like, low standard-of-living increases or high decreases, we want you to trust us, because we want to condition you, to have you deputize us to punish your enemies.
Starting point is 01:18:18 That's sort of how that works. But the fortunate thing is if you can just try to be normal enough to normal up the people around you and make them not freaks, then we've seen this fail before because it's too freaky and too weird and it's too much of a turn off because everyone hates a fucking whale botherer anyway, anyway, we've gone, we've gone very over. So I want to say thank you very much for listening to the podcast. So those of you who've been coming to the live shows, thank you very much for coming to the live shows.
Starting point is 01:18:49 By the time this comes out, we'll have done all of them. Yeah, you will have COVID, all of us will have COVID. And also to remind you, we have a Patreon, it's $5 a month. You get not just a second episode of this every week, but you also get a Britonology. You get Britonology. You get COVID. Yeah. And you get COVID.
Starting point is 01:19:11 And you get COVID. There's a Twitch stream, which I assume is probably not happening on the day of this recording. Oh, no, I'm down to stream. I don't give a shit. It's probably good for me. The thing about the Twitch stream is it probably won't give you COVID. We don't have that technology yet.
Starting point is 01:19:25 That's right. Yeah, we don't. AI will fix that. Our theme song is Here We Go, all one word, by Jinsang. It's a good tune. And we're about to play that right now, I assume. Yeah. Maybe, why don't we just play a few seconds of that so that they can get a little sample,
