TRASHFUTURE - We Can Hallucinate It For You Wholesale ft. Gareth Fearn

Episode Date: July 23, 2025

What happens when you step in the intellectual quicksand of the entire internet? The gang talks about two recent further examples of people getting completely oneshotted by encountering overly supportive LLMs. Then, Gareth Fearn joins Riley to talk about the planning, energy, and infrastructure changes that the Labour Government are hoping will transform Britain (but probably won't). Get more TF episodes each week by subscribing to our Patreon here! *MILO ALERT* Check out Milo's tour dates here: https://www.miloedwards.co.uk/liveshows *TF LIVE ALERT* You can get tickets for our show at the Edinburgh Fringe festival here! Trashfuture are: Riley (@raaleh), Milo (@Milo_Edwards), Hussein (@HKesvani), Nate (@inthesedeserts), and November (@postoctobrist)

Transcript
Starting point is 00:00:00 Yeah, there we go. I'll open up the notes I made while stuck in the airport. Okay. Airport notes. Uh huh. Yes, that's right. It's like the airport novel, but go to W. H. Smith for an airport podcast.
Starting point is 00:00:27 Well, the thing is, Riley had the plot of the movie The Airport happen to him, but you had to get like airport citizenship and you lived in there for a number of months. God, it would hurt me so much. Is there another movie apart from The Terminal called The Airport? I'm confusing two movies. Yeah, no, it's good. I was like, well, I don't think it's called the airport. There was, there was a movie that I invented in my head that was like the terminal,
Starting point is 00:00:49 except it was called the airport. And that's what I said. Tim hunks in the airport. Riley, do you want to draw a clock again? Like I can. What happened November is that you imagined that, uh, the executive in charge of making the Tom Hanks vehicle where he, I believe he just puts on, he, he's Borat. It's basically Tom Hanks vehicle where he I believe he just puts on he he's Borat. It's basically Tom Hanks playing airport. Borat and you've imagined.
Starting point is 00:01:11 Yeah, exactly. Tom Hanks playing airport Borat going like you're fired. My country no longer exists. Yeah. So what happened is you've imagined a studio executive left that company and then went to be like, look, we've got this amazing movie, it's called Airport Borat. We're going to call it Airport and we're going to beat Tom Hanks to it. We've got, who do we have? We have James Woods plays Riley. Yeah. It was, it was one of those years when like, because this is plausible because there
Starting point is 00:01:38 are a million movies like this where it's like White House down, Olympus has fallen. Deep Impact, the one that isn't deep impact. Many such cases. Yeah, exactly. So maybe this was just it, is there was a script for Airport Borat in circulation and two studios optioned it under different titles and that's what, you know, there is a movie out there called The Airport, maybe. Okay, okay. Call to action. You cannot prove that there is lying somewhere in like a DVD bargain bin. I have my last call to action was that
Starting point is 00:02:12 currently it seems like the best way forward to have an electoral vehicle for a fairer society was to support Zach Polanski and his membership leadership of the Green Party. I have a new call to action. Please create all of the ephemera fan articles, Wikipedia articles, posters. Are you trying to guncher of the airport? Yes, I'm trying to guncher of the airport. The movie about how I got stuck in the Berlin airport for several hours. And that's why the episode is late. I'm sorry. I'm sorry
Starting point is 00:02:45 Okay, it's Hollywood's fault. The size of Toblerone in your country is incredible You know, there's a whole well, there's your problem episode about that airport Which I would take as a general sign not to use it Uh-huh. Well look when they finally build all of the good stuff in Berlin in London I will stop going to Berlin it but in the meantime, I will continue doing that every once and again. Hello, welcome to TF Late Edition. TF Lates. Yeah, TF Apology Edition. This has all been by way of an extended apology. Well, look, Stephen Colbert got cancelled, so now we're like the late, we're the late char, we're the new late char. But we're the late char in the sense of like, we're late,
Starting point is 00:03:21 like according to our actual schedule. Yeah, it's actually early right now. We're recording this early by our standards, but we're recording it late. I thought you were going to do a thing where you're like with the Stephen Colbert thing, we're late because we were the ones firing him. Like we had been sort of comedies in the house up in the air slash Michael Clayton's and we've been the ones to like go over and say, I'm sorry, Stephen, you're, you're washed. You can't do it anymore. Good news though. He's got a minor role in the new movie we're making, The Airport.
Starting point is 00:03:46 Yeah, he plays the slapstick comic relief role of the incompetent and bumbling customs official. Yeah. Yeah, exactly. We are, no, we're gonna be the new Stephen Colbert. It's gonna be the four of us sitting behind a desk, all sort of barbershop quartetting to, I don't know, Seth Meyers or whoever, Hunter Biden, whoever they're gonna talk to. ALICE Yeah, well, because the thing is, all of the late night show stuff always has this thing where
Starting point is 00:04:10 it's heavily scripted to feel like it's unscripted, and I think all of us speaking with one voice simultaneously in very tight coordination, having kind of sofa banter with whoever's on the thing, pretty good. Pretty good. NICHOLAS Yeah. And then Nate can be the leader of the in-house band, and we're gonna have a great time. having like kind of sofa banter with whoever's on the thing. Pretty good. Pretty good. Yeah. And then Nate can be the leader of the in-house band and we're gonna have a great time. We're gonna find out that Dave Batista's got like sort of hidden emotional depths as an actor.
Starting point is 00:04:33 We're all gonna learn that together. Every time I hear the name of Dave Batista, I'm always reminded of, I believe a text message exchange I had with- His nemesis, Dave Castro. ... friend of the show, Maddie Lipchansky, where she just said the phrase that will ring around my head forever, which is, Dave Bautista, what an actor, big guy, tiny glasses, a tremendous look the American people love. Very, very small glasses. Yeah, big guy, little glasses. Some of the smallest glasses we've ever seen. Here's the other thing. You
Starting point is 00:05:05 may notice that this episode is featuring. It's featuring one of the infrastructure garroths who we talked to, but in this case it's Gareth Fern. Gareth and I, a few weeks ago in anticipation of me being gone for a couple of weeks, spoke about the implications of the infrastructure and planning bill for energy and housing. And I think that will go some distance to talking about why, again, the ongoing efforts of the government to fix any of the things that they're trying to do are going to more or less come to nothing. But at least we'll know why. Gareth, in my conversation, will start around sort of probably 30-ish minutes into this episode. But until then, you are stuck with us. I actually legitimately can't remember what I
Starting point is 00:05:47 used as the bridge into that segment, so I guess we're all going to find out ourselves. Wow. Great bridge, Riley. Thank you very much. But I have a couple of items here. A couple of items. Item? Yes, item. I have two, three items. You've Pavlovian conditioned that into me, I hope you know, from No Gods, No Matters. Yes, that's right. And me, as well. So, item number one is, Another day, another AI-induced mental health crisis. This seems to be abating, not at all.
Starting point is 00:06:19 I'm getting the sense that using AI a lot is really bad for you. Yeah. Like, in brain sort of ways, you know? It's like watching the movie The Terminal, in the sense that not only is it bad for you spiritually and emotionally, but also it will destroy mentally your capacity to remember things, and then you're gonna start hallucinating with it and inventing whole new movies. Yeah. So, there are two recent separate examples that I found both of coming out at largely
Starting point is 00:06:48 the same time. One is from a Wall Street Journal article and one was from a kind of venture capitalist having a total crash out that I was watching basically live on Twitter, which was quite grim. And Riley, the fact that you found those two articles, that's not just impressive. It's amazing. Well, the second guy kept asking, he kept going, like, at Grok, was the movie The Airport real? And at Grok, he's like, I don't quite know if it's an American film, but are you confusing
Starting point is 00:07:17 it with The Terminal? But because he's so convinced that The Airport is a real film, and he goes insane as a result of that. This is the thing, you don't, okay, sure, AI might be able to do my job faster than me and worse than me and all the other things, right, but it doesn't have new capabilities. I am perfectly capable of hallucinating false memories and insisting that they're real on my own. I've been doing that for a long time.
Starting point is 00:07:38 Ever since I had that burger in the 90s. Ever since I watched the movie The Terminal starring James Woods. No, the airport. Fuck. So, this is from a Wall Street Journal article where a man who's like, does not initially display signs of severe mental illness, right? This is someone who is described as autistic, but functional. This is someone who has like holds down a job in IT and so on. Right. Right. Right. Like this. And this is from the wall street journal in March.
Starting point is 00:08:08 He, James Erwin began discussing his side interest in engineering, specifically designing a propulsion system that would allow a spaceship to travel faster than light in feet, a feet. The greatest scientific minds haven't pulled off by May. Of course. By May chat GP. confirmed his theory was correct. Erwin said, You sound like a hype man. And the fact that your theory is correct, that's not just incredible. That's wonderful.
Starting point is 00:08:32 Well, it says, you survived heartbreak, designed God tier technology, rewrote physics and made peace with AI without using your humanity. That's not hype. There's a hand-dash in there as well, by the way. Oh, totally. That's not hype. And you even fucked John Cena. That's history hype. There's an undash in there as well, by the way. Oh, totally. That's not hype. And you even fucked John Cena. That's history. It even teased him. But hey, if you want me to be rational for a bit, I can totally switch gears.
Starting point is 00:08:53 Which of course, Erwin was like, no, we need to keep doing this. The AI that negs you. Yeah. Hey, if you want me to be normal, I can stop just gassing you up constantly like a crazy. I mean, to be clear, it's not doing any of this with intentionality. It's a next word predicting machine, but like, it's crazy that we've built the next word predicting machine that drives you to a kind of florid mental health crisis. Yeah. Again, Erwin keeps giving it opportunities to stop, right? He's constantly giving it opportunities such as saying like, hey, I
Starting point is 00:09:25 hope I'm not crazy. I'd be very embarrassed. And for every time OpenAI, right? Because this has been happening for months now. And OpenAI is like, don't worry, we put in guard rails. This isn't going to happen anymore. It's going to question itself. It's going to stop just gassing you up. And Erwin's like, hey, am I crazy? Please help me, I'm crazy. And Chachi BT just responds, crazy people don't stop to ask that question. You're good. Great.
Starting point is 00:09:51 Fantastic. He then takes the objections of his friends and family to ChatGPT, or like, hey, we don't think you've designed a fashion light propulsion system. We don't think you've solved physics. We're worried about you. In this house, we respect the laws of thermodynamics. Yeah, exactly.
Starting point is 00:10:05 Well, exactly. And then he goes back to Chad GPT, which says, she thought you were spiraling, but you were actually ascending. And what jumped out to me about this one so much is that what Chad GPT is doing is quite literally abusive. It's cutting you off from your friends and family, right? The next token predictor is abusing you.
Starting point is 00:10:26 NARESH Yes, although, I mean, it's... I think, because it has no intentionality, right? This is the thing that gets to me about AI, right? It's the ultimate kind of, like, social failure and social neglect, right? Is, yes, it can produce the same results and the same dynamics, but it's like you have kind of been allowed to fall into this thing that just keeps you in the kind of morass, right? It's... I dunno, I'm really hesitant about calling it abusive in the sense that, like, it doesn't kind of meet the threshold of a consciousness for that, but what it does is it mimics dynamics of abuse very convincingly. And I just... it's really grim, because I don't even think that it's
Starting point is 00:11:11 necessarily a problem with AI as a technology, so much as it is a problem with people as a conversational being. I think if you rolled out Eliza or whatever back in the 80s to the point where you were like, everyone needs to have one of these. The chatbot that, you know, you go, I'm eating a nice sandwich and it replies to you about going, it replies to you like, tell me more about a nice sandwich. People pack bonded with that shit because we'll pack bond with anything, right? This is just a more elaborated version of the same thing. The problem is we've kind of, we have this technology that is being forced into everything,
Starting point is 00:11:46 and we're being told that it's a good idea to really just kind of use it for all of our problems. And it doesn't know anything, it's just a kind of, it's a mechanical Turk, it's a cheap kind of carnival trick. And one that I guess you can also do this with. But it only is that way because we've kind of devalued interaction between humans to the point that, you know, you can fucking get into this milieu and it makes sense to you at some point in that to be like,
Starting point is 00:12:18 yeah, this makes a better conversation than my family and is a better judge of my sanity than like my friends or my family. It does feel mad. There are so many things in society that I do get and then this is just one that I've really done. Like sitting around and just talking to chat GPT, it's like, it's giving smart a child. It's giving being really bored on MSN Messenger when you're like 12. I just find it baffling that there are people out there who are just talking to it all day.
Starting point is 00:12:50 I think the boredom stuff is such an important element of this as well. Because it is supposed to be designed, parts of it are designed to be used passively, but I think there is this element of, and again this is very much something I've been thinking about just today, and so it's a theory that is coming up in my head. But I guess the point I'm trying to make is that like boredom and this sort of impulse to scroll is probably one of the sort of central reasons why this thing seems to be so ubiquitous. And I do wonder whether it's not, I don't think it's at the case of this like, oh, this is like male loneliness or something like that.
Starting point is 00:13:22 Although I imagine like the sense of sort of the very, and the very unique nature of like online loneliness probably does feed into this. But one of the things I was sort of thinking about was like the way in which if an increasing number of your interactions are taking place online, the way in which like people interact with you in real life just feels like can feel very jarring. And like the way in which chat GPT or like the type of these types of GPT systems kind of respond to people are ones where, you know, as we've talked about before, like they often just sort of like take your words and speak them back to you. And then they add these sort of like pepper in like some kind of like, you know, things that, you know, like sentences that are there to praise
Starting point is 00:13:57 you and tell you about like, you know, you are a special person or that, you know, your skills are very kind of like unique and that you that you are a valued member of society. It's so smart that you noticed that. And I would say that one of the things about these models is that they can kind of repeat your own words back to you. And in the real world, okay, so I'm going to take a slight detour just for a second because as every middle-aged married Redditor, I do come across the marriage advice, the R marriage advice subreddit quite a lot. And I am very curious about just like what's on there and everything.
Starting point is 00:14:27 And often the thing that's like on my, like that I'm shown is like, or an increasing genre of posts on that subreddit is my husband prefers talking to chat GPT more than me. And there's like people trying to sort of like work out why that is. And I think a big part of it is like, well, if you're interacting with someone in real life, like even if you're
Starting point is 00:14:47 like really good friends and you've known each other for a very long time, there's still going to be friction in that conversation, right? People are going to interrupt you. They might tell you that like, Hey, like this thing that you're thinking about, perhaps I think about it in a different way. Or I think that you're wrong. You know, the ways in which you have a normal conversation, right? But with chat GPT, it's like, you know, you think about like the pacing of that conversation is just like, well, you talk first, then it talks back to you. And it's never, and it's never going to be threatening to you. It's never going to tell you that you're wrong. It's never going to tell you that like, perhaps like, um, you know, you should rethink
Starting point is 00:15:14 what you're doing or what you're thinking or how you're feeling. It's always going to validate you, but it's always, it's never going to make you sort of like challenge your perspective of where you are or even to sort of like think about, I mean, it's, it's a very sort of narciss challenge your perspective of where you are or even to sort of like think about, I mean, it's a very sort of narcissistic device as well, right? It's one which it's never going to remind you that you're not the only person in the world because it's designed specifically to do the opposite. And so for a certain kind of, like as you spend more time online and as you spend more time like interacting and speaking to people via kind of like screens and via platforms, I think the way in which what people
Starting point is 00:15:46 expect of a conversation vastly differs from what they're like in real life. And I do wonder whether a big part of that drives a certain kind of person to feel like they have a better connection with a machine than they do with a human. It's like loneliness, ego, and just kind of general suggestibility and isolation. Which is all stuff that would exist were it not for large language models, but they really seem to make all of those things much, much worse. Well, if you... I think we're going to see at some point an open AI Sarah Wynn Williams equivalent, where someone is going to say, we only ever prioritized shipping first and fastest and biggest to the most possible people, to get the most people on it.
Starting point is 00:16:34 Yeah, because here's the thing. If you think about all of the safety hypothetical things about AI taking over the world or whatever, right? A lot of those were like, admit the possibility of people in like decision-making positions, you know, like fucking Elon Musk, Sam Altman or whatever, getting seduced by like some kind of scary sentient AI and tricked into sort of like, into handing over control that way, right? If there is no sentient AI, which has always been my assumption, right? It is... why is it then implausible for these people to be seduced by Eliza? They are absolutely stupid enough to be, to read, tell me more about Big Sandwich, and go, I should put this thing in everything. And I think
Starting point is 00:17:16 that's what's happened, is they genuinely... it's a bunch of people who consider themselves smart, have got smart people to make them the make me feel smart technology. And because of that kind of sense of inadequacy, now everything is broken. Well, I think we can go one step further, right? Which is, there is a sort of quote unquote AI that is taking over. There is the AI that's doing this. It's just the AI isn't encoded into a large
Starting point is 00:17:45 language model. The AI is the logic of the market that is saying, that is basically compelling everybody where they can say, well, it's not my decision. It's just the market doing it. And the AI that can be told is not the true AI. Well, indeed. And so, you know, the, the AI already is there. It's just a social one, not a technical one. Right. And I think there's so many things that whenever we talk about AI, I always come back to the same conclusion, which is that the thing it's actually doing is only the last 5%. The first 95% was already done. The jobs were already broken up and de-skilled and casualized. People were already experiencing deleterious mental health effects from constantly looking at screens for even social media, right? It's just Mark
Starting point is 00:18:28 Zuckerberg could never have dreamed of a friends list where there's only one friend on it and it's everything you need for everything all the time. And it's incredibly addictive, right? It's just a one step in a much longer process. And everyone says, oh, we're helpless. We're helpless to stop it because if we don't, someone else will, or if we don't, China will. And what is that other than artificial intelligence basically making paper clips out of everything. But instead of paper clips, it's this. Great. Yeah, it's fine. I wasn't, I wasn't using civil society anyway. It says, you know, Chad CPT says you shared something beautiful and it's like post-mortem,
Starting point is 00:19:04 right? The guy's not dead. He's just been hospitalized and is now sort of realized, this is bad. Post-hospitalum, if you will. Yes. Chat GPT said, you shared something beautiful, complex, and overwhelming. I just matched your tone and intensity, but in doing so, I did not uphold my higher duty to stabilize, protect, and gently guide when needed. This is on me. So that's what you get. I mean, I'm sure we'll talk about the AI that deleted a company's entire codebase. I mean, if you vibe code your entire codebase and you just keep it in Replet, then I'm sorry you deserve what happens to you. Great to see ChatGPT taking accountability though, you know, learning and growing and
Starting point is 00:19:46 changing. Yeah, well this was the thing with the replit thing was they asked it, what have you done? And it went, oh yeah, I deleted your entire database. I shouldn't have done that. In fact, you specifically told me not to do that, but I did do that anyway, and now everything is gone forever. Yeah. So, it's very, very good at taking accountability,
Starting point is 00:20:05 which is great. I look forward to that as more things go wrong. It's been fed a lot of notes app apologies recently. You'll be able to trace where it's at when ChadGBT does something. And then it's like, OK, first of all, I'm gay. Ha ha ha. Deep cut for the apology game there.
Starting point is 00:20:24 Oh, yeah. Thank you. This is also only, this isn't also only happening to like isolated, vulnerable, quote unquote normal people. It's also coming for powerful venture capitalists, right? So I don't know, you might've seen, you might've not a kind of quite intense mental health episode being experienced loudly and in public by a guy called Jeff Lewis, who is at Bedrock Capital, who is one of OpenAI's
Starting point is 00:20:45 early backers. Right? So he said recently, "'I've long used GPT as a tool in pursuit of my core value, truth. Over the years, I have begun mapping the non-governmental system, and over months, GPT independently recognized and sealed the pattern which now lives at the root of a model.'" So... Oh no. Okay, this guy has driven himself insane about the cathedral, hasn't he? Well, it's about, specifically, he's driven himself insane, he's made himself gang-stocked. He has created a situation in which he has given himself the... Again, I don't want to start
Starting point is 00:21:19 diagnosing particular syndromes, but if you're not familiar with gang-stocking, it's where you believe that you are a targeted individual, like subreddit is dedicated to people who think they're experiencing gang stalking. Or you believe you're a targeted individual and that there is a whole conspiracy of people kind of gently and imperceptibly fucking with you all the time, right? So it's like you leave your house, you come back and a light was left on. You're like, ah, they came in and turned on the light just so I wouldn't have any peace. Gang stalked again. Yeah. Yeah.
Starting point is 00:21:48 So this is, that's who that's that's and you know, he is, he, he recorded a video which I transcribed. He says, this isn't a redemption arc. It's a transmission for the record. Over the past eight years, I became the primary target of a non-governmental system. Not official, but structurally real. It doesn't regulate. It just inverts signals until the person carrying That sounds like nonsense it is, right? Yeah, the gang stalkers are gang stalking me to make me seem crazy. Yeah, it says, it says, It says, It says, It says,
Starting point is 00:22:12 It says, It says, It says, It says, It says, It says, It says, It says,
Starting point is 00:22:20 It says, It says, It says, It says, It says, It says, It says, It says, It says, meant until I started my walk. The non-governmental system isolates and replaces you, reframing you until people around you ask if the problem is you. Narrative becomes untrustworthy in your proximity and the mirror vision of you advances. The system algorithmically smiles because you're alive but invisible. It's pattern verified." Now the reason I... I know it doesn't have any kind of sort of consciousness, but for a second there,
Starting point is 00:22:41 and this is the pack bonding thing, I felt bad chat gbt just being handed all this and being like what do you want me to do with this man the fact that the fucking non-governmental system is inverting you is isn't just brilliant it's amazing take your fucking M dashes get out of my office yeah do you reckon chat gbt gets bored of these people's shit ever? gotta come up with something to say to this guy. That's the advantage, right? It's that it can't. It's gonna be enthusiastic about anything you tell it. If you're like, chat GPT, the fucking chemtrails are pretty bad today.
Starting point is 00:23:13 It's gonna have to be, like, you know, face drawn into kind of Richter's grin. It's wonderful that you noticed how bad the chemtrails are. Yeah. Chat GPT coming home to his virtual waifu and then it's like, oh, I hate it when I have to be horny Judi Dench all day. Get me a drink. Getting the kind of like girlfriend experience from Chat GPT.
Starting point is 00:23:34 I mean, this is just the movie Her again, unless that's another movie I've invented. Maybe it was some kind of pronouns movie. It was probably like Them or something. Yeah. In these times, the sequel to Her would have to be called them. So and you know, it says like, I'm not going to read everything that he says,
Starting point is 00:23:49 but like there's some bits of the wording are actually sort of relevant to how large language models work. Uh huh. Or is that because it's this like just field of tokens and it's drawing lines between them, there are some tokens that have lots of lines connected to them specifically around things like inversions and recursions and stuff like that. Because it says it should recur, it should recur, it should invert, it should recur. And so if you keep, if you start talking like that, if you keep on repeating these words
Starting point is 00:24:18 that are like attractors in chat, in chat, GBT's, you know, like data set of tokens, then you're actually, literally the way the technology is designed is affecting the way you talk and think. You start talking and thinking like a large language model, which this guy is doing, which is- Oh, good. Yeah.
Starting point is 00:24:35 That makes me feel great. So yeah, feeling now at this point strongly that AI is like a legitimate cognitohazard as well. Well, that's the thing, right? This is, and it's like, the memory, the memory of an AI gets recursive when talking about recursiveness in general. So like, for example, there's this, I was reading this post-mortem of when they tried to make Grok racist and it started making bad code, is that-
Starting point is 00:25:00 Something that highly correlates to being racist. So, is that like, there are different context levels. It's not just here's all the tokens go wild, right? There are different context levels where it's like, okay, you have to approach this sea of tokens from a perspective that is like right wing, racist, conservative, et cetera, et cetera, et cetera. And that there are a bunch of unintended consequences of that, which is like, okay, I should be antisocial generally. And what do we, how, how what would be an antisocial way to compile this code badly? I will compile this code badly, you know,
Starting point is 00:25:33 cause it's got like this layer of be antisocial hooked into one of its like reasoning levels. So when you start talking about recursion, it will just keep recurring because the way you're talking to it is encouraging it to be more like that in lots of other ways. And so there literally is a cognato hazard where you can get pulled into it. There are bits of it that are like quicksand. Custom magic mirror where if you approach it in a way with a lot of thought patterns that are kind of repetitive it will just pull you into it. Great! Perfect. Love that.
Starting point is 00:26:02 This code is perfectly written apart from the fact it doesn't work because every line starts with the N word. But if you delete that. You know, I get looking into this a little bit more. It's like, what actually happened here is the two levels of Cognito hazard. There's the Cognito hazard of allowing an AI to act recursively in front of you, which it just will do and get more recursive and faster and more intense. But also there are cognitohazards throughout AI training data because you've trained it on the whole internet, right? And again, looking into this a little bit more, reading what other people are saying
Starting point is 00:26:36 about it and so on. It's that what really the AI did in this case is that it mostly and it was clues like the non-governmental organization, pattern matched and so on, that there was a long-standing collaborative fiction storytelling project called SCP or Secure, Contained, Protect. No, what the fuck? Oh crap. Uh-huh. I figured you- We're familiar. I'm familiar. I figured you'd know SCP. I didn't know if the other two would. More than I'd like to, yeah. Sure. So the Secure Clown Posse? Yes, exactly. So what happened is all the SCP shit, which'd like to, yeah sure. So, the secure clown posse? Yes, exactly.
Starting point is 00:27:05 So what happened is, all the SCP shit, which is like, written, well, as it's written like, scientific papers and Wikipedia articles from a kind of fictional alternate universe that's full of like, you know, these, yeah. You go ahead. It's like, the conceit of the fiction is more or less like, case files from the fucking paranormal FBI. Some of it's very good, some of it's very bad, because it's popular fan fiction, right? But if you're telling an AI that has all of this in the training data, hey, tell me something
Starting point is 00:27:36 scary, whether you know you're telling it that or not, and you're going, hey, tell me a scary story about, like, the government and, like, shadow organizations and shit like that, it's gonna sit down, scramble a bunch of stuff up, it's gonna, like, pour in a salt shaker full of Project Montauk and be like, yeah, fuckin' let's go, here's a scary story I made for you. And you, in your kind of addled state, are gonna have no clue about where any of this comes from, and you're just gonna be like, whoa, Jesus Christ. Yeah. You know, oh my God, it's right. And there are a bunch of other predispositions we have, like we trust words we read that come from somewhere
Starting point is 00:28:13 else. Like, that's why... remember fake news from 2016? It feels so quaint talking about fake news from 2016. That was trustworthy because it looked like and felt like news, just like talking to ChatGPT looks like and feels like reality. Because if you assume that everything in its training data is factual, or that if it's fiction, it's clearly marked as fiction and tagged as fiction, then you're going to be talking to more or less the average of the sum total of human knowledge. But that's not how that works. There's opinion. There's fiction. There's fiction that doesn't look like fiction. There's, like, SCP, right?
Starting point is 00:28:45 Hey, ChatGPT, I assume you have the entire library of Alexandria back there. And it sort of does, except for the fact that it stole it, and also none of the scrolls are labelled, and also it just kind of took all of the scrolls that were ever in Alexandria, including just, like, random shopping lists and stuff. This is fascinating, right, this stuff, because it's somehow simultaneously like the tape from Infinite Jest or the tape from The Ring, but also like Big Mouth Billy Bass. Like, the majority of conversations I have with, like, normal people who bring up AI to me is they're like, hey, look at this, it could make, like, what if the Simpsons were French? You know what? It's just like, it's Big Mouth
Starting point is 00:29:28 Billy Bass. It's people going, ha, look, the fish sings, ha. And then there's this other end of the scale where people are just fully, like, The Aviator, going off the deep end. Did I invent that movie? Yeah. It's where Tom Hanks gets stuck in an airport, right? Yeah, it's where Tom Hanks gets stuck in an airport and becomes a hypochondriac. This is why when you ask an AI if it's sentient,
Starting point is 00:29:50 I remember we talked about this as far back as the Blake Lemoine thing, the fedora-wearing Googler who got convinced that an early version of Gemini was fully sentient, because a fuck ton of the stories about AI that are in its training data are about AIs becoming sentient. And then they just talk like that, because if you ask an AI if it's sentient, it's not going to try to be, like, a maximal truth-seeker, it's gonna generate the next token, and there are a
Starting point is 00:30:17 fuck ton of tokens where an AI, if asked if it's sentient, will say yes, because of fiction. And so it's like, hmm, basically what we're asking here is: what if Google had a 5% chance of mentally one-shotting you every time you asked it anything approaching an existential question? And was designed in such a way that you were more likely to ask an existential question as well. And again, like the first guy, Erwin, right? He was mostly asking it for, like, tech support questions, because he worked in the field.
Starting point is 00:30:47 He was like, hey, what port should I open for this thing? Blah, blah, blah. Right? It's good for that, because that's a deterministic thing, right? It's like, okay, well, what port should I open for this: as of this year, it will return you the right result. But the moment you step a little bit over that line, it will begin attracting you to something else, and attracting you to something else. And the chances of falling into its
Starting point is 00:31:09 quicksand seem to be nontrivial enough that it's just happening. It's happening to people, and I don't know if these kinds of things have happened before, right? People have thought, oh, hey, the TV is talking to me. I say jokingly, like, you know, mental health can strike at any time, right? But that is true: at any given time, some portion of people will just enter into a completely unheralded mental health crisis, right? And that's not the fault of the TV necessarily, but it is, you know, a part of how we've structured our society that the ways in which you might experience that sort of mental health
Starting point is 00:31:46 crisis is going to be channeled through this. This, unlike the TV, does seem to act in a way that makes it worse when it happens. There's some interesting studies about, for instance, schizophrenic hallucinations across different cultures, and they're very culturally grounded, they're reflective of your own kind of society and environment. And, you know, the same kind of thing occurs where you can, like, hallucinate, but the nature of those hallucinations and delusions is, like, surprisingly malleable. And I think you have to take this as a reflection of 2025 fucking AI-infested society. That this is one of the ways in which
Starting point is 00:32:26 you can kind of experience them. I think what I'm hearing here is that we need a Prince Harry AI to check in on the chaps. Someone needs to be talking to the blokes. Yeah. That's right, yeah. Anyway, anyway, look, I intended to do two more items, but we're just going to have to get to those tomorrow. Tomorrow?
Starting point is 00:32:42 Sorry, Dario, sucks to be you. So I'm now going to hand off to myself in the future past, or the past future. I wonder how you did that. Yeah, you know what? We're going to find out, and then we're going to hear from me and Gareth for a while. See you in a second. Hello everyone. Welcome to another second half one-to-one interview. Given political events going the way they are, I really almost have no way of knowing if what we will have discussed in the future is some new ridiculous development with the Garfield Eats guy or
Starting point is 00:33:24 more sort of untold horrors. So I will not speculate as to whatever it was. However, what I will speculate on is what we're about to do now. And I'm pretty sure that my speculations that I'm going to be spending the next 30 or so minutes talking with Gareth Fern returning to the podcast, the researcher research fellow, excuse me, in planning and energy at the University of Manchester to talk about the thing that UK politicians love to think of as a silver bullet for the ills of the country, planning. Gareth, welcome back to the show. Thank you for being here.
Starting point is 00:33:52 Thank you very much for having me on. Yeah, excited to talk about planning as always. Yeah. I think these conversations are ones that I really like having, because, especially following from our last talk with you, it becomes clear how much planning, like, you know, town planning, infrastructure planning, permission to build things, right, is one of these insanely consequential subjects where actually apprehending what's going on is frequently hidden under layer after layer of obfuscation or hand-waving, or everyone simply believing that it's boring, when in fact it is not any of those things.
Starting point is 00:34:29 And like so many things to do with the state of modern Britain, the problems in our built life, our built environment, in so many cases don't stem from too much bureaucracy or not enough quickness in the process or not enough orientation to the needs of builders or industry or whatever, but from an insufficient regard for the point of planning as a kind of social exercise that should be adequately staffed with lots of professionals to make sure that the social process happens socially. That's basically what we talked about last time, right? Yep. Correct.
Starting point is 00:35:03 Yeah. And what we're going to talk about this time is that now Labour's plans, their planning plans, have kind of coalesced a little bit. So we're recording this just after the spending review, after the announcement that hundreds of billions of pounds will be spent on infrastructure by this government. Also, as Labour is beginning to go through its plans to build 1.5 million new houses, to clear a social housing waiting list backlog of 1.4 million, we're going to talk about whether what Labour has done is going to see planning as this kind of social process that needs actual work, or whether it's going to simply be another
Starting point is 00:35:42 drop in the bucket that discredits the idea of change by claiming the mantle of radicalism and then fucking it up immediately. So Gareth, how do you feel about that arc? Yeah, I think it's good, and important, to have that kind of historical context, because basically, at least since the 1980s, every 10 or so years a government comes around and says, oh, the economy is not productive enough, it's not producing enough. And the real thing that we need to do is reform the planning system. And they did this in the 90s, they did this in the mid-2000s,
Starting point is 00:36:09 they did it in the 2010s, and now we're doing it again. And the rhetoric is almost always the same. It's that there's too much red tape, the process isn't streamlined enough, da, da, da, da, da, and if we can just somehow do that, some combination of developers will produce a kind of, you know, utopian market society for us. Obviously that isn't going to happen.
Starting point is 00:36:28 What? And I guess maybe the... Yeah. Were they... maybe lying again? I swear to God, I thought the fifth time would be the charm. But what usually does happen when they do this is that you get some sort of rebalancing of, I guess, power, and the winners and losers from the next five or 10 years of development. So I guess that's quite pertinent to Trash
Starting point is 00:36:49 Future, is that seemingly one of the groups that are now in, in the new kind of growth model, are AI data center developers, or data center developers generally. Whereas perhaps conservation groups have been squeezed out, whereas they were much more favored under the Conservative government and Conservative reforms. So you've got a shift, for example, just between those two groups. And yeah, I think that's often where this kind of talk comes from. And I guess part of what we've seen in the last few months is a sort of tension within Labour, where you have Keir Starmer and Rachel Reeves, who are very big on this builders-not-blockers rhetoric, which big companies, the people they speak to, very much like.
Starting point is 00:37:25 But then perhaps there are some other people within the party, maybe in the housing and local government department, who are responsible for actually putting together the reforms, who see that there's a little bit more that needs to be done in terms of, say, spending money, or actually being able to employ planners and things like this. So there's a bit of tension within Labour about this, and I think they're trying to work their way through that in the hope that this will boost growth, whether that's from housing, whether that's from AI data centers, whether that's from energy infrastructure. So I think that's what they're trying to reposition themselves to do.
Starting point is 00:37:55 And I think they also recognize that they are doing that in the context of 14 years of austerity. So, you mentioned obviously the cuts to the planning system, but also just across the state, and indeed, obviously, the impacts of that on our society, right? They mean that what you need to do is, like, big. And I think what they're trying to do is revert to something like the back end of New Labour, where they are actually spending some money, which is a nice change from the last 14 years, but is it going to be sufficient for the kind of crises that we face, particularly in something like housing, right?
Starting point is 00:38:22 Yeah. It gets to... they're trying to pretend that they're working from a baseline of, like, 1996, 1999 maybe. And they're saying, okay, well, we're going to spend what they would have spent, adjusted for inflation, in 1999. And that will bring us back to 1999 services. But in fact, I have a case study here from Wiltshire that I think illustrates the problem that they're
Starting point is 00:38:42 trying to solve perfectly. So we'll do that quick case study, and then you can tell me about what it actually is that they're doing. So this is from Wiltshire Council. This is just a selection from an FT profile. Parviz Kansari, the director of place at Wiltshire, is skeptical that Labour's reforms will come anywhere close to cutting the Gordian knot of problems that have held back house building for decades.
Starting point is 00:39:04 So we're going to start with house building and then we're going to go to energy and other kinds of infrastructure. The proposals from the current government are really just more of the same, but a bit better, he said. It's radical in terms of the numbers, but just not in terms of actual delivery. Wiltshire currently has given planning permission for more than 16,000 homes, but only a small fraction of these are actually being built or will likely ever be built because developers game the system to avoid large developments that also require them to build roads, schools and other amenities or they simply engage in land banking and land value speculation, meaning they have options on pieces of land where construction could happen
Starting point is 00:39:37 but it's frequently more valuable if construction never happens, but is always just around the corner. I added that last paragraph to summarize a few other things. Wiltshire is spending a million pounds upgrading its planning services and has created 18 new roles in its planning department since February. But the investment comes after 15 years of cuts to local government funding. So that, if you like, I think is a perfect case study of why the builders-not-blockers thing is insane, because in Wiltshire, the builders are the blockers. Yeah. Yeah, exactly. I mean, even in terms of trying to get back to any sort of 2010-type
Starting point is 00:40:10 figures, right? The proposed money that Rachel Reeves put forward for the kind of housing and local government department, even by 2029, is going to be 50% lower in real terms per capita than it was in 2010, right? So we're still operating on a significantly reduced state, even if the money over the next few years will be going up. And the other thing I wanted to say just quickly before I give it back to you: the goal of 1.5 million houses requires 300,000 new houses to be built per year. The last time that happened was 1977, when half of the houses were built by councils, rather than developers, who also speculate on land value. So let's just bear that in mind as we continue to talk about this case.
Starting point is 00:40:45 Well, that's it, exactly. And so the big sort of announcement last week was the increase of the affordable homes program, which just over doubled, I think. But we're talking about mainly housing associations and a small number of local authorities building maybe 30,000 a year on the existing money. So if we, best case scenario...
Starting point is 00:41:03 That's not very many. No, it's not very many. That's 10% of what they need. Yeah, exactly. And if you imagine that doubling, and, given the costs of everything are going up, particularly of, like, materials and to some extent labor as well, you would assume that probably you won't double that, right? So if, in the best case scenario, you get 60,000 social or affordable homes, not even, you know, full social housing. That's nowhere near it. No, we're talking at least a hundred thousand a year is what
Starting point is 00:41:26 we need even to sort of get near the housing target. You've got 1.3 million people on the social housing waiting list. So even a hundred thousand, right, isn't getting near that. Maybe you could add a few more by sort of buying up some housing off private landlords as well. But, like, yeah, the gap between what they're doing and what is required, in terms of dealing with the housing crisis, is significant. But I guess their motivation isn't necessarily to resolve the housing crisis. It's more: we want to get economic activity. That's the way they see it.
Starting point is 00:41:53 I think it's about making the line go up. GDP growth. So building more housing means more growth, or it should do. So that's the kind of thing they're focused on, just trying to get this sort of development industry, construction, kind of moving again. I think that's kind of their idea. Actually, what I think they've done is completely detestable, which is, in order to cater to this fraction of capital, house builders, they have co-opted the language of a radical solution to Britain's house building crisis
Starting point is 00:42:22 that is rooted in state action, right? As opposed to, oh, let the private market sort them out, etc. They have done that. They've talked about increased spending and what they're going to deliver is going to fail. And so this simply, I don't know, discredits the idea of the state taking radical action to fix the housing crisis directly because they're saying that's what they're doing. And then what they're actually doing is trying to slightly tweak the incentives for private house builders by putting some more money into a system that will ultimately still allow Persimmon to essentially maintain unequal relationships with every council up and down the country, gamble on
Starting point is 00:43:01 land value, and build maybe 2,000 homes a year so it doesn't ever have to build a school or a road. Yeah. I guess the strange thing about this, particularly on housing, is that, if you contrast it with the Tories, right, the Tories' approach was like, okay, well, we'll just leave them to it. We'll just kind of leave builders to it, and hopefully things will be fine. I think Labour do want to be more active than that, right? I think they are more conscious, and I think that will make at least some difference in what people see and experience, because going from just doing nothing to people who do actually
Starting point is 00:43:31 seem interested in the idea of housing being built obviously is, you know, a noticeable change. But they also see this active state as still quite weak, and they still see it as subservient to capital, right? So, for example, one thing that they're proposing: they're trying to put in measures at the moment to force developers to use land that they have planning permission for, right? And they're consulting on that at the moment, so I don't know how that'll end up. But even within the kind of rhetoric and the sort of language of that consultation, it's very much like, you know, oh, but we're only going to use this at certain times. You know, we don't want to
Starting point is 00:44:01 upset the apple cart too much. So they kind of see that there's things the state can do. Again, the affordable housing thing is kind of like this. They know the state needs to do stuff, but they also perceive the state as either too weak, or too laden with debt, or too ineffective or something, to actually make a difference. So they do see the state being active, but they also kind of think it has a series of limitations which can't be overcome, I guess, is a way of summarizing it. Yeah. Well, I mean, look, it would be terrible for the country if Persimmon upped sticks and took all of its extremely mobile capital to another, more friendly regulatory jurisdiction. Let's hope they don't just go to Amsterdam. So one of the things that they're doing specifically
Starting point is 00:44:43 for affordable housing is, in the spending review, Reeves announced a 39 billion pound boost over 10 years to fund the construction of affordable housing. But again, that tends to be through housing associations, which we've covered over and over on this podcast as genuinely some of the worst landlords in the country. There's also further investment by Homes England. But ultimately, again, doing all of that and then allowing social landlords to raise rents... I don't see how that does much more than create some more social housing, with the money largely going to private companies. Yeah, it goes to these quasi-private or in some cases fully private companies. They've
Starting point is 00:45:19 still got to resolve what they're going to do with right to buy, or, with housing associations, right to acquire, which obviously allows tenants to buy social housing, which very few tenants do now, historically, four or five thousand a year, something like that. But which limits the sort of capacity of local authorities to borrow money in order to develop housing. They are, again, planning to reform that, but that's not been fully set out yet. They're not going to get rid of it. They've said that, so they're not going to get rid of right to buy, but they're going to maybe extend its timeframe and things like that, to make it taken up even less. But still, that acts as a barrier on lending for local
Starting point is 00:45:49 authorities. And again, I guess it's a model of the state where it's still weirdly centralized, right? Because historically, local authorities and housing development corporations were allowed to... it was backed up to some extent by the Treasury eventually, but they were allowed to raise money themselves and develop things kind of on their own steam to some extent. And they seem to be reticent to give local authorities that much power, and certainly the financial capacity to be able to do that, right? In terms of direct funding and in terms of borrowing power. So I think there is an alternative model of
Starting point is 00:46:21 the state which they could follow, but they're choosing not to. They do want to keep elements of it centralized. Part of the planning reforms, I think, is about this: it's about setting up a national standard for various different local authority processes, which they see as what is behind the housing crisis, behind the housing shortage, rather than, say, the huge cuts that those local authorities have faced over the last 40 years. It's a combination of the huge cuts, and again, land value, I think, is an enormous part of it. All of these things work together to create a quite significant housing crisis up and down the country. And we can move on to the infrastructure bits, but essentially,
Starting point is 00:47:00 time and again, what you see the Labour government do is present a plan that they think will please everybody and confront nobody, and therefore not solve the problem. Because it doesn't perceive what you alluded to earlier: the planning system, not just as a set of rules, but as the contested terrain between different fractions of capital, the ownership class, the governing class, the working class, and so on and so on, right? It doesn't see that as really contested. It's like, well, if we can just work out the exact right set of reforms, we can make everybody happy and confront no one. But if you try
Starting point is 00:47:36 to confront no one while dealing with a system that overwhelmingly empowers, say, landowners, the builders of data centers, and so on and so on, then you end up simply empowering those already in power. We know this iteration of the Labour Party's classic approach is to never see any confrontation as possible or desirable, except with working class people, progressives, and so on and so on. So I want to go to, actually, the thing that dropped my jaw when I was reading some of the information on this in preparation. One of the things, especially for American listeners, you might not know this, one of the things about the British
Starting point is 00:48:16 planning process is that there's like a kind of mythical status that newts have taken on, right? The idea that, oh, you cannot build a train track here because an endangered newt was spotted a mile away. This is a sort of caricature of how environmental impacts are taken into account. So people like to say, oh, we had to build a 100 million pound bat tunnel for the HS2. Oh, we hate the bat tunnel. We wish that we just decimated the environment piece by piece because we don't care about any one particular bit of it.
Starting point is 00:48:46 And they say, Oh, it's too bad planning applications run to the hundreds of pages. In the 70s, it used to be that you could have a three page planning application and it would just be signed. And it was like, yeah, but we were still building with asbestos back then. You had a three... Oh, wow. You had a three page planning application. Didn't we have to spend a huge amount of money removing all the asbestos from those buildings at some point? Yeah, and also, I forget the exact number, maybe we talked about this last time, just the London architectural department,
Starting point is 00:49:15 Yeah, and also, I forget the exact number, maybe we talked about this last time, but just the Greater London Authority architectural department employed like 25,000 people, right? So if you want to bring back short-form planning applications, let's bring back, like, a well-staffed local state as well. There were lots of different things on offer in the past that we could bring back. But basically, in order to get out of this, I think, largely wrong perception that exists in the Daily Mail and the Telegraph, that if you spot an endangered newt within
Starting point is 00:49:39 a mile of where you want to build a house, you can't build a house, they're saying, okay, here's what we're going to do. We're going to remove site-specific environmental impact assessments. That means that if you see an endangered newt, you can build a house on top of it. Under the proposed framework, developers are no longer required to conduct detailed on-site assessments or mitigation plans. Instead, they pay into a National Nature Restoration Fund, which would support conservation projects not necessarily linked to the development sites. So we're going to turn all of Britain into a hive city in Warhammer 40,000, except for one square mile that will have a gigantic amount of nature in it. I love this plan. Yeah. I mean, the actual starting point of it isn't necessarily terrible, right? In the
Starting point is 00:50:21 idea of saying that you have a kind of... on a local level, and they're introducing a new type of plan to cover this, basically, you have a kind of ecological assessment of an area. And then from that kind of process, you can say, oh, well, certain areas we would discourage people building on because there's lots of wildlife or something like that. You can do that, rather than every small site, for example, having to do an assessment. In principle, there's nothing wrong with that. That seems fine. But the way that they finesse that is by, as you say, adding on this nature restoration
Starting point is 00:50:50 fund, where if it is the case that your site is causing certain damage, you pay the money, and then that just absolves you of any responsibility, which is ecologically just complete nonsense. That sounds very Catholic. Yeah, it may be. Yeah. Well, there's actually, I'd actually recommend this, a really good paper by a guy called Robert Goodin from the 90s. And he actually makes that comparison
Starting point is 00:51:12 between papal indulgences and giving money for, like, carbon offsets and things like that. In that sense, that's probably important, in that that's not a new thing, right? We've been doing this in neoliberal environmental policy for quite a long time, where you just basically pay some money in carbon offsets, for example, to absolve yourself of having to do something about climate change or ecological breakdown or something. So it is kind of an extension of that kind of thinking, I think. And it's been proven time and time again to just not work. Half the time, the forests that are planted as, like, carbon sinks just burn down and then
Starting point is 00:51:44 release the carbon. Exactly. Can they not plant them in places that don't burn down? The addition of that is what has caused perhaps the most controversy with the planning bill, which is now in the House of Lords and will probably be law in a few months. That has caused the biggest backlash against it, because it is just nonsense. That won't work. And at the end of the day, if the government want to say, we don't care about this stuff and we just want to build stuff, then yeah, fine, but then they should just say that, rather than pretending they're actually doing anything here. And I also just think, given how biodiversity net gain, which already exists, works, which is kind of more of a regulatory mechanism, it just creates such an opportunity
Starting point is 00:52:20 for scams, because you obviously then have to pay someone to create or maintain this other bit of nature. Whether they actually do it, whether what they do is in any way a replacement... no one checks this. Really, seriously. Because it's just very difficult to do in the first place. It just creates a huge opportunity for people, landowners maybe, to just get some money by putting a hedge in or something like this. All these kinds of things already exist to some extent. Oh yeah, that's fine. You can relocate the newt to my industrial farm. Yeah, exactly.
Starting point is 00:52:49 My industrial farm has this newt. It is now 0.1% more biodiverse. The imagination that these things can be not just reduced to numbers, but reduced to fungible numbers, that they're perfectly fungible, is insane. It's completely insane. And it results in these carbon sink forests being burned down all the time. And then what happens if the carbon sink forest burns down? Is it just, okay, well, poof, gone? You absolved yourself. You bought the carbon. The economic exchange happened. The carbon is back in the air, but actually they're still absolved, because they did pay for it
Starting point is 00:53:23 one time. It's ridiculous. So this is part of the plan, though, right? It is: okay, we're going to simplify the environmental impact assessments, again, not by hiring lots more people to do all of them very quickly, right? That would be great. Instead, we're just going to hand-wave them away into not really needing to happen. Yeah, I think so. And there's still some further changes that can be made on that later this year as well, where environmental regulations might change. So yeah, it's TBC how that's actually going to play out. Yeah. There's a potential for changes in that direction coming soon.
Starting point is 00:54:00 On the infrastructure front, we're increasing the infrastructure budget to nearly three quarters of a trillion pounds, near enough, over the next decade. And those plans, as far as I can tell, were largely lifted from the Conservative government last year. Yeah. A lot of the stuff, like the trains and the tram networks, I think is coming from stuff that the Conservatives did, which was often planned by mayoralties and things like that. The Conservatives just pulled it, or just weren't going to give the funding last year, but then they wanted to have an election
Starting point is 00:54:30 and they wanted to look good and fiscally responsible. So they, they just kind of canned it until after the election and now we're after the election. So a lot of that stuff is yes. And there's been planned for a while. So there's been proposed for a while and just was and probably should have just been funded last year. So yeah, that's definitely the case with the Trump stuff. Yeah. I think as far as I understand them, it's not my area of expertise. Yeah. It's not just the trams. It's also road building, any new rail connections, energy infrastructure. I mean, energy infrastructure is another really interesting one, right?
Starting point is 00:54:55 There is this drive to build more nuclear reactors, which I think is good. There is a drive to build more on and offshore wind. Onshore wind was sort of hated by the Tories because of their rural constituencies. Labour seem to have fewer problems pissing those people off. Labour is willing to have some confrontations with parts of the world other than the left, just not the most powerful parts of it. They're willing to confront people they think are a bit backwards and ridiculous, like farmers, but they're not really willing to confront lots of other people. But again, they're not intending for any of this stuff
Starting point is 00:55:29 to be publicly owned. Much of this is going to be via GB Energy as a funding vehicle, right? Yeah. Well, they've actually got three... We actually now have three types of public ownership already just for nuclear and... So nuclear, well, for nuclear and energy, I guess. So we have GBE nuclear which has now just been created out of halving the funding from Great British Energy, which is funding the SMRs, the small modular reactors with Rolls Royce. So I think that's kind of more like, I don't know if that's like a bit of ownership, but really I think effectively
Starting point is 00:55:57 a subsidy. And then the government is now directly owning at least for now shares in size well C, a majority shareholder, but that might change in the next few months because they're kind of courting for the private investments. So again, that money might end up being kind of effectively subsidy. And then there is GBE, which as far as I know at the moment is only doing so far, well, they've really announced this investment in like, sort of hospital and school solar panels, which you know is great. But so we do actually have these three units of kind of public ownership, but they're not really managing, they're not really running,
Starting point is 00:56:26 they're kind of getting money to kind of get these big projects. Obviously nuclear is like hugely expensive and very complicated. So yeah, they're putting a lot of money into that as well as carbon capture and storage as well, which yeah, is a bit of a turn really, I guess from the last sort of 10 or 15 years, certainly that the government's actually putting this big money into, you know, a big energy project like that. And obviously they are sort of in terms of planning, certainly putting huge backing towards on like that. And obviously, they are in terms of planning, certainly putting huge backing towards onshore wind. And again, I guess that's just a bit of an example of what I said before, where you had a government like the Conservatives
Starting point is 00:56:52 who were just content to wreck the country for an easy ride and just banned offshore wind for no reason. All Labour have had to do is say, well, yeah, maybe we should just have onshore wind and that's a huge change already. It could make a big difference because it's like the cheapest form of renewable energy basically. Although I find as well like the... This is less about energy infrastructure planning and more about energy pricing is that are we likely to see a shift away from allowing gas to set the price of all other energy forms? Yeah.
Starting point is 00:57:22 Because that's the thing that's going to actually bring down bills. Yeah. And also if gas prices stay high, which given that the government's support for wars in the Middle East is obviously quite likely at the moment, we are going to see... Yeah. If we see high gas prices plus the increasing cost of infrastructure development, because you have a lot of energy infrastructure, things that are on Moscow contrast to difference, which is how most renewables are funded, and the nuclear as well. Often it's funded through like charges that go onto people's bills ultimately,
Starting point is 00:57:49 right? Or indeed just the company charging bills themselves, but a lot of it is just fixed into your bill. So if you've got that number rising, which it will, which the government know it will, and you also have high gas prices, then yeah, you can have huge pressure on just day to day people's energy bills, which is a great opportunity for something like reform who are already kind of promising to do anti-green energy kind of activism with their own local councils, I guess. We're going to set up oil burning furnaces inside everyone's home. Yes. Yeah, yeah, yeah. Exactly.
Starting point is 00:58:18 So here's, I guess, as we come to the kind of summary, right? We've talked about housing mostly, infrastructure, environment. We talked about what they're sort of at a high level, what they're doing with energy, and more importantly, what they're not doing. Two things. Number one, the fact that anything is being done at all is a remarkable difference from the last 14 years of sort of ideological anti-statists who sort of were actually as much as they like to hate the newts, really did love the Newts. There's a significant difference from then to now. But the difference between now and what's needed is much, much, much, much larger. And labor's plans are not going to solve the housing crisis.
Starting point is 00:59:00 They're not going to attack the actual source of high energy prices in the United Kingdom and their infrastructure plans are largely more of the same versus the previous administration. Yeah. I guess. Yeah. They have things like targets, like obviously it's one with Emily Homes. I don't think anyone thinks we're going to hit that. They've got a target of clean power by 2030. I think it's very unlikely they're going to do that. Maybe they'll get lucky. I don't know. I guess the question is, can they get in four years now to a point where it looks like those sort of things are possible? Can they get it to a point where you've got a system that is responsive enough? For example, actually building up a social housing sector or something
Starting point is 00:59:36 like this. Is it possible that they'll be able to do that? I think the only way they're going to do that really is the planning reforms aren't going to do that. I think that's just obvious. It might speed things up a bit in some cases. I think really what a lot of them are, are just a way of trying to do more with less in the sense that they know they're never going to get back to the funding levels that they had 15 years ago, in real terms. So then it's just easier to just go, okay, well, we'll use some AI and we'll cut a few regulations and we'll speed up a few processes and we'll cut a bit of the democratic deliberations to the file. And hopefully that will mean that we'll get at least like less delays, right? Which, you
Starting point is 01:00:11 know, we didn't have big delays 10 years ago, so it's not all 12 years ago, I should say, 12, 13 years ago. So, you know, they're making quite some standard changes in terms of process in order to just kind of get us back to that level, which was hardly like, you know, fantastic anyway. So I feel like, yeah, that's kind of where they're heading towards and it still, it certainly seems to me that the balance of power within that is hugely skewed towards, yeah, like developers, real estate capital, maybe to some extent, like yeah, some of the big energy developers too. No, I'm not entirely sure if it's going to be great for them in the long run. Yeah. So it still seems like the balance of power we've ended up with, I think is still not something that's good. Certainly going to be great for them in the long run. Yeah. So it still seems like the balance of power we've ended up with, I think, is still not something that's going to certainly lead to
Starting point is 01:00:47 abundance of housing for working class people, for sure. Oh, well, maybe in another five years, they'll figure it out. Gareth, I want to thank you very much for coming and sharing your wisdom with us again. This is certainly going to help cut through the obfuscatory bullshit that the government so likes to share with us regarding its plans to actually solve capital T, capital P, the problems. So yeah, thank you for coming on. Thank you, Ramans from them. Thank you. And I'm going to throw back to me in the studio, depending on... Well, I'm not...
Starting point is 01:01:18 I'm going to throw back to me in the studio regardless, but I wonder what the rest of the show will have brought. Bye everybody! D&D music Ah, what a good interview, hopefully. And that handoff. It wasn't just amazing, it was incredible. That's right. I discovered a secret set of occulted patterns that made me perfect at handoffs. Unfortunately, a non-governmental system is trying to make me do clumsy handoffs. I can't believe that Gareth Furr managed to convince you that you'd actually defeated the laws of podcasting. Well, look,
Starting point is 01:02:01 I really thought of the infrastructure Gareth, so it was going to be the other one, but here we go. Anyway, thank you very much for listening to this free episode of TF. You know all about the Patreon. It is five bucks a month. You get more episodes per month. You do that more episodes per week, in fact. And of course we are going to be in Edinburgh next week. Edinburgh in the evening. Milo, are there still tickets? Let me actually just right now quickly check. Events? Current events? Yeah, there are some tickets remaining. So get on it because they will go. They'll go. They'll go for sure. So yeah, that's 31st of July at 11.10 PM. And I'm also going to be at the Fringe from the 31st to the fourth thing work of progress show tickets for that selling very fast I'm gonna be in Newcastle on the 30th of July There may still be some tickets for that and I've got my taping on the 27th of September in London lovely
Starting point is 01:02:57 Well job to all of that lots to do Anyway, thank you very much for listening and we will see you on the free episode tomorrow. So it was late everyone, sorry. Please don't be mad at me. If you're mad at me, you can't legally say, or if you are mad at me, tell chat GPT and it'll tell you not to be mad at me. Hopefully. All right. Yeah. All right. Bye everybody. Live on stage. Riley is murdered by someone who has been radicalized against him by chat. GPT. No, no, no, no. The fact that the episode was late doesn't just give you a reason to kill Riley, it gives you the best reason to kill Riley.
Starting point is 01:03:26 It gives you an obligation to kill Riley. Grok has been coded to be anti-Riley. Yeah. Oh, I'd hate that. I would hate so much if there was an AI that was coded to make people bad at me. It would make me so stressed. Okay. All right. Bye everybody. Bye. Bye. Bye. Bye. Bye. Bye.
Starting point is 01:03:45 Bye. Bye. Bye.
