TRASHFUTURE - Let’s Delve Off The Bridge Together feat. Brian Merchant

Episode Date: September 23, 2025

Brian Merchant, of Blood in the Machine, joins us to talk about his new series, AI Killed My Job. It includes stories from not just artists and translators, but tech workers, healthcare workers, and even suicide counsellors. But before that, we decide that in the bleakness of the day we could all use a moment of TF season one whimsy, and talk about a fridge with ads on it. Remember that? Get more TF episodes each week by subscribing to our Patreon here! TF Merch is still available here! *MILO ALERT* Check out Milo's tour dates here: https://www.miloedwards.co.uk/liveshows Trashfuture are: Riley (@raaleh), Milo (@Milo_Edwards), Hussein (@HKesvani), Nate (@inthesedeserts), and November (@postoctobrist)

Transcript
Starting point is 00:00:00 So look, it's what you might call a fast news week. I am being relentlessly overtaken by events. There is, I would say, a terrifying creep of right-wing authoritarianism happening in the UK and the US. Today, Nigel Farage announced a plan to potentially, like, be able to deport indefinite leave to remain holders. And in the US, what can only be described
Starting point is 00:00:39 as, you know, a Horst Wessel sort of state funeral has occurred. But in times like this, sometimes you must stand athwart history and shout, I'm going to talk about a stupid appliance thing and go back to the whimsy of TF season one. That's right, baby. So, joining us today is
Starting point is 00:00:58 a returning champion. It is Brian Merchant from Blood in the Machine. You know him, you love him. Brian, welcome back to the show. Hello. Thank you for having me. And I think we can all agree that the biggest and most important news today is that Samsung is committed to innovation and enhancing everyday value for our home appliance customers. I'm often saying this. It's true. We got a fridge. We got a fridge or something. We got a fridge. This is like, this is old school TF. This is old. It's like this plan they made teleported here from 2018. I just feel like
Starting point is 00:01:32 I have a feeling like I'm about to read some article indicting Jeremy Corbyn for not bowing low enough at the Cenotaph. But they say, as part of our ongoing efforts to strengthen that value, we are conducting a pilot program to offer curated advertisements on certain Samsung Family Hub
Starting point is 00:01:47 refrigerator models in the US market. Yes. Yeah. Yep. Yeah. Perfect. So, okay, picture your normal fridge, right? Just close your eyes, go into your mind palace and picture a normal fridge, right? What's missing?
Starting point is 00:02:03 Adverts. Well, quite. What's missing is deals. What's missing is deals on, for example, hey, maybe Sainsbury's is doing like a nectar offer on like almond milk or whatever, you know, you could get that beamed directly into your kitchen. Maybe it's the Jeep November sales event. And, you know, you don't want to, what if you didn't look at your phone, right?
Starting point is 00:02:25 You would risk missing the Jeep November sales event in many ways. The fridge screen is just big phone, right? Yeah. Big phone. Now, is it still, is it like, is it live streaming? I remember some of these earlier models would show, would have like a live feed of what was in your fridge to save you the inconvenience of opening the fridge door. So if you still have that, like, are the advertisements interrupting your live feed of what's inside your fridge? Yes.
Starting point is 00:02:51 Yes, that is correct, Brian. That is the main feature of the Samsung, is that it keeps a tally of what's in your fridge so you don't have to open it. Well, because if you open it, you can't watch the adverts that are on the outside of the fridge. Yeah. So you want to keep that to a minimum, opening the fridge. You know, I was just thinking my wrist is sort of, I just am opening this fridge over and over. And the tensile sort of strain on my wrist from doing so, I was really hoping I could limit that repetitive action. The whole point is peak living is looking at the screen of your fridge, waiting for, like,
Starting point is 00:03:25 Because you haven't subscribed to fridge premium, waiting for the pre-roll ad to stop before you get to see what's in your fridge. You see that you have just a few sauces, and then you DoorDash a $60 burrito. That is, as far as I'm aware, looking at the consumer technology industry around the world, but especially in America, for as long as I have, I think that might be the actual American dream. A lot of people turning up at, like, Gamblers Anonymous, and they got into sports gambling because they bought a Samsung fridge. Are there sponsored fridge influencers that are going to be dedicated to fridge content appearing on your screen? No, yeah, I mean, yeah, not yet, but that is very much my plan if, like, everything else fails. Yeah, it was what, just get on there and be like, hey, you know, you can just eat mustard from the jar. Yeah.
Starting point is 00:04:11 You know, ketchup is a little nutritious. So, as part of this pilot program, Family Hub refrigerators, which cost like two grand in the U.S., will receive a software update with terms of service and privacy notice changes. Advertising will appear on the cover screen. The cover screen will display it whenever it's idle. So, like, if you bought this and you were like, they were like, don't worry, there won't be ads on it.
Starting point is 00:04:32 Now they're like, just joking. There are ads now. I bought this fridge, so that I wouldn't have to open it, and now I have to open it. Yeah. This feels like it should be illegal. I'm surprised that it's not illegal. To be like, oh, we've turned this thing in your house
Starting point is 00:04:47 into a thing that displays ads without your consent. You didn't buy that, but we've made it. It has to be always on. Yeah. That's the thing, though. You did give your consent, because you said on the terms of service when you first signed up to your fridge account, hey, yeah, you can do whatever you want with the fridge. I recognize the fridge is kind of yours. Also, if there's anything I learned from the Charlie Kirk funeral that I attended, it was that, like, really, what he died for was that everything.
Starting point is 00:05:07 Like, he died so that everything could be a billboard if you really worked hard enough at it. That man loved, that man loved America. And as far as I'm aware, being an American, loving America is ads. Will AI Charlie Kirk come into our fridges now? Yeah, there you go. You could debate your fridge.
Starting point is 00:05:26 Your fridge is like, prove me wrong. I think, look, the opportunity to engage in what can only be described as large language model enabled necromancy before, like, making your dinner. You have to stand and you have to, like, salute Charlie Kirk while he gives you a one-man struggle session about, like, everything wrong with your life before you're allowed to, like, take your $20 eggs out of the fridge. Charlie, I just wanted a spoon of mustard.
Starting point is 00:05:51 That's all I wanted. And now I have, I have to debate you for it. Samsung has created this wonderful opportunity. They don't see it, right? Samsung has created this wonderful opportunity to sort of put a bridge troll in everybody's house between them and their dinner.
Starting point is 00:06:05 This was literally the joke, right? Like, wasn't that, that Andy Samberg movie was, the joke was that there's, like, non-consensual music playing from your fridge or, like, an ad for the album. Like,
Starting point is 00:06:16 that was like the punchline of the, this was literally the parody just, like, five years ago, and they just did it. You know what I'm talking about? That, no, that movie where he's, like, Justin Bieber, the Justin Bieber parody.
Starting point is 00:06:29 There's a whole fridge plot line in that movie. You're talking about Popstar: Never Stop Never Stopping. That's right. That is right. Yes. An underrated film. Yeah, well, congratulations.
Starting point is 00:06:41 We live in the world of Popstar: Never Stop Never Stopping. Brought to you by Samsung and Turning Point USA. This is great. I love this. I love the fact that, like, we're going to reach a point where everyone in the United States has to officially care about the fact that Charlie Kirk died by saluting their fridge, apart from Donald Trump.
Starting point is 00:06:58 He's the only person who gets to openly not give a fuck about it. You see that thing today where he was like, it's a time of whatever. It's a time of great mourning and healing and all the other stuff you're playing with. It was the same as before he realized that he could use crypto to receive bribes, when he just thought it was kind of stupid. And when he was giving a talk at that crypto conference, he was like, have fun with all the stuff you're playing with or whatever, I guess. Yeah.
Starting point is 00:07:23 It's a time of remarkable, et cetera. At the risk of sort of evoking another very esoteric internet thing, one of my favorite comedy routines is by Andy Daly, and it's a character he does called Jerry O'Hearn, who is a stand-up comedian who never actually says anything. It's like 10 minutes of him just saying, so you got these people now. And I'm like, hold on. I didn't order that, like, for 10 minutes. And I feel like that is what he, that is what Trump was doing. I got to tell you, I don't know.
Starting point is 00:07:54 I don't know, you know. All this stuff that's going on in the world today. It's like, hello. I mean, come on. It's like, hey, wait, excuse me. I didn't sign off on that. Also, 1984 comparisons. I've grown to always associate someone making a 1984 comparison.
Starting point is 00:08:16 with someone who is of unsound mind and shouldn't be allowed to operate a motor vehicle. Yeah, David Baddiel. However, in this case, what we are creating is a version of 1984 where the telescreens require you to
Starting point is 00:08:32 engage with a raised from the dead AI Charlie Kirk once a day, three times a day, actually if we're eating three meals a day and you're cooking at home, in order to be able to make breakfast. As we're doing 1984 bits, can I, can I piggyback off that with a very mid-2000s comedy bit, which feels appropriate, which is that it's
Starting point is 00:08:51 typical that for the purposes of doing 1984 in America, they had to put the telescreen on a fridge, because it's the only thing that Americans check that many times a day. You've got to check your fridge. See if it's running. That's right. Is there going to be screen creep to other appliances? Is this like an untapped market? Are we going to have screens, like, on our trash cans, and will we have influencers? The kettle screen. Yeah. Are we going to do microwaves? That's another flat surface that I think could use a screen. Yeah. I feel like Cooler Screens, the company that I love talking about, that went under. That was just like, what if we replaced the freezer doors in, like, Kroger with TV screens that showed what was in the fridge and freezer?
Starting point is 00:09:34 I feel like if they just held on for a little while and then, like, bribed the government and then said, hey, in exchange for you supporting us financially and mandating Cooler Screens in every Kroger and CVS around the country, we will let you air, sort of 25% of the time, a custom Charlie Kirk AI avatar that, like, through iris scanning recognizes you from screen to screen so you can never escape this, like, beast that's coming after you like the Furies, anytime you walk into a store or through your own house or through the world. I think that could be a really, we've taken, we've gone from 1984 to Minority Report. But again, exclusively in terms of the screens. Well, it's the thing that a lot of people don't know about Orestes is that when he was being pursued by the, by the Furies for murdering his mother, he did eventually own them in the arena of facts and logic by proving them wrong. Hello, everybody. Welcome to TF. It is all of us. We're talking to Brian today about a few things, including his new series that he's working on on Blood in the Machine, which is all about people whose jobs actually have now been taken by AI and what the experience of that was, especially what I'm interested in talking about as well.
Starting point is 00:10:46 is the business side of it, like how well it worked, what they used it to do, what it could actually do, and so on. But I have a few more bits of news I want to get through first. Another, like, throwback. We're doing a lot of throwbacks. This is a throwback to, like, 2021 now. This is like best of TF before we get to the main segment. I want to talk about an old friend who's back in the news. A very old friend. Trevor Milton, the founder and CEO of Nikola, the famous lover of simple machines, which he used to make his electric trucks look like they were working, simple machines such as an inclined plane. Yeah, rolling it down a hill.
Starting point is 00:11:21 Yeah. I'm sure it got up there with a lever and possibly a pulley. Yeah. The man who turned a simple machine of an inclined plane into, you know, billions of dollars and then no dollars and then a prison sentence, you'll never guess what happened. Now that his sister, I believe, is married to Attorney General Pam Bondi's brother. Oh, my attorney general in law. Well, quite, is that the SEC
Starting point is 00:11:41 has dropped all civil charges against Trevor Milton. Wow. It's not, it's finally gone uphill for him. It's not gone downhill anymore. Brian, do you remember Nikola? I remember this guy. I remember this case. Real, real classic.
Starting point is 00:11:55 I feel like we haven't spent enough time on all of these things. All of these staged demonstrations, Tesla did it. Remember Tesla's autopilot? Oh, yeah. That was someone who was right past all this stuff. Yeah, this guy, real, real classic of the genre. And, yeah, the charges are dropped. I mean, he donated to the Trump campaign, right? That's it. That's it. That's it. Now, being a CEO in America now is a lot like Red Dead
Starting point is 00:12:20 Redemption, in that you can just go to the train station and pay your bounty and then you just get to keep doing whatever you want. Like, go by Mar-a-Lago with a little briefcase full of cash and you're good. That's it. Now, Arthur, if we can just get enough money together to give all of these crypto coins to President Trump, we'll be, we'll be sitting pretty. That's Dutch van der Linde right there. Yeah, there, I can't remember where he is. Yeah, so: the SEC has dropped their case against me with prejudice. Five years of outright lies by the media, corrupt prosecutors,
Starting point is 00:12:50 former Nikola executives and short sellers is finally over. They falsely indicted me, silenced me, and then my favorite one is deleted my followers on X. Oh, no, is nothing sacred? They shadow banned him. They shadow banned our boy. Everyone knows the SEC has a backdoor to X and they can shadow ban whoever they want.
Starting point is 00:13:08 They stole my company, bankrupted my company, debanked me, targeted my friends and family, stole most of my wealth and tried to put me in prison. But now it's over and our creator has made it right. That's right. God himself has come. Look, the Christian nationalist project everywhere is just such a natural home for scammers and flim-flammers.
Starting point is 00:13:29 And especially like American Protestantism, you could just say, yeah, I rolled the truck down the hill. I defrauded a bunch of investors. But like, I know I'm a righteous person, and I know God will not only forgive me, God actually wants me to do this so that he can work through me. And all I need is to get these, like, rubes' money into my company. God put hills there so I could roll the truck down them. It's like a greatest hit now. I think what was really striking about the whole statement is just, like, how unabashedly Trumpian the whole thing is, with its litany of grievances, that he has been wronged, and now they've all been expunged
Starting point is 00:14:08 again because of that briefcase full of cash. And then the God talk tacked on at the end is a gesture to sort of what you were just talking about. It's just the soapboxing. And they've all learned, they've all learned: you just say this shit, you just say it with a straight face, and you're exculpated. This is all you have to do. Yeah, that was, I mean, more than anything else of this era, that was the cheat code. That's the, that's the Trump wisdom. You just, nobody can tell anymore. There's too many posts. There's too much shit going this way and that. I mean, this is what, frankly, Elon isn't quite as good at. You just put a statement out like that. You wash your hands and
Starting point is 00:14:50 you move on. Yeah. God didn't even need to forgive me because actually, everyone else was wrong. You know, I forgive you for your suspicion of me, basically. And he says, I come out of this thankful to God for one more day in my life and for such a beautiful family, a wife who never backed down against the evil men behind this. And I mean, this matches up quite closely with a lot of the talk at the Charlie Kirk funeral, which is the constant invocation of evil men and the demand that the state be used to crack down on, you know, kill or expunge them. I mean, it's the same playbook. It's just invent whatever you want and then claim a sort of state of emergency that requires the full force of the state
Starting point is 00:15:31 to protect you, exonerate you, kill your enemies. Because that's the thing. Trevor Milton probably doesn't want to stop here. He probably wants to go after the short sellers. Maybe not have them killed, but certainly have them, I don't know,
Starting point is 00:15:42 harassed in some way or dealt with. Well, he wants to borrow the short sellers and then sell them to someone and then hopefully buy them back at a lower price. It has all that in there with a debanking,
Starting point is 00:15:53 the shadow banning on X. I feel like it is just like a direct transmission of sort of like the modern state of the modern tech billionaire. The Democrats want to make it illegal to roll a truck down a hill, okay, it's very sad. And we've seen that. We've seen, if a truck had rolled in front of Charlie Kirk, he'd still be alive. It would have stopped that bullet.
Starting point is 00:16:11 Soapbox racing is a proud American tradition. And Nikola, Trevor Milton, a very nice guy, loves a certain president. That's what he said. He was like, I don't know much about him, but I do know he's a very strong supporter of a certain president. And that's me. That's what he said. A certain president, I'm not going to say who. Also, like, no one, like, dropped the case
Starting point is 00:16:31 on its own merits. He was just pardoned. It's like, and also, yeah, and so his lawyer is Pam Bondi's brother. Excuse me. It's not a marriage thing. But he's not entirely out of the woods yet, though.
Starting point is 00:16:41 He's trying to get a bunch of money back from Nikola. And this is from Bloomberg. Judge Thomas Horan of the U.S. Bankruptcy Court for the District of Delaware overruled Milton's objections to approve Nikola's plan to liquidate its remaining assets. The judge rejected Milton's effort to stop the subordination of his claim, holding that pardon powers don't extend to, quote, innocence, and that President Donald Trump's pardon of Milton's criminal charges was unambiguous and
Starting point is 00:17:03 clear, but he is manifestly not innocent of wrongdoing. He's not able to claim back his legal fees. It's like, no, no, no, no. Anyway, I also wanted to note that he made, Nikola made a feature-length, like, documentary about how unfair everyone has been to him. So I'm going to put a pin in that for next time, when we're going to have a bunch of people on holiday. But there's one more thing, a couple of things I want to talk about, though, before we get to the sort of meat and potatoes here, which is, of course, it might seem like we're talking about the States, but given that these are global platforms, these decisions that are made in the United States affect everybody in the world, which is the ownership of major tech platforms, media
Starting point is 00:17:43 organizations, is coalescing into a smaller and smaller set of increasingly pilled billionaires. So this is, this is, of course, the purchase of TikTok, or at least the purchase of America's TikTok and the creation of the Great Firewall of America by Larry Ellison and Marc Andreessen and other venture capital investors after, of course, the Democrats, and Biden specifically, fought to force ByteDance to divest much of their American holdings in a way that, again, could you have predicted that creating the political space and the pressure to do this would have just, if you weren't holding the presidency forever, created the conditions where the next guy could just be like, I'm putting my best friend Larry on the case. He's going to make sure that no
Starting point is 00:18:30 sneaky Chinese algorithms come and, like, you know, turn our children into NPCs or whatever. I mean, Brian, you must have been following this as well, given your, given this is like deep on the tech reporting beat. Yeah, I mean, this has been a truly remarkable story. I mean, the pretext, as you're gesturing towards, for this move has always been so thin, right? Like, it's all, it's been all couched in sort of a geopolitical angle where the idea is that they're afraid that China can track our data. So they have control of the algorithm and they have control of user data and we don't know what they're doing with that data. And this somehow poses some, you know, nefarious and ambiguous threat to American citizens when literally every single one of the
Starting point is 00:19:18 apps that we use that are American made are doing the exact same thing. And all of this data is available for purchase through data brokers. China can get user data, as much of it as it wants; it just needs to call up Experian or any number of the other sort of legal data brokers. So the whole pretext has always been a joke. And now it's just, it's another, I think, example of, you know, consolidation and increasing state control over a valuable media platform, essentially. I mean, TikTok is more like traditional media than most
Starting point is 00:20:02 of interaction. So it's, you know, at this moment, it's, it, I think just the fact that it's being handed into Larry Ellison's hands and over to Oracle, Oracle is going to retrain the algorithm, whatever that means to give it a special, you know, U.S. sauce, maybe more fridge-related content for the Western audience. Smash cut to Republican Senator an inquiry going, my fridge has been showing me the merits of Deng Xiaoping thought. I wanted to know
Starting point is 00:20:32 what you intend to do about that. When can we finally implement socialism with Chinese characteristics here in Sarasota, Florida? My name is Greg Steube, and I have been brainwashed by my fridge. Now, my microwave has been telling me that I should build a pig iron furnace in my
Starting point is 00:20:48 backyard. Now, I think this thing you mentioned, Brian, the thing I think it's worth digging into more, is there is this, like, mythical status of the algorithm, capital T, capital A, where they're like, ah, TikTok must be so popular, it must be rotting so many, you know, children's brains because of capital T, capital A, the algorithm. It's rotting Christian nationalist brains as well, because most of them think that the time of the rapture is coming within the next 24 hours. They're, like, selling shit, you know. You know what? You know, you may be skeptical.
Starting point is 00:21:22 But, like, those Etsy witches really have, like, made me question, you know, all of the other, like, self-proclaimed mystics on TikTok. Yeah, but, like, there's no doubt about it that, like, the consequences of social media for people's ability to perceive basic reality have been disastrous. And that each iteration of those social media formats has ratcheted up the amount of engagement with the phone, which is, as I say, in inverse proportion to the ability to perceive reality, right? It's, it's, this is what makes people, like, ask how the mirror knows what's behind a cloth. How does, yeah, how does this fridge know how many, like, jars of mustard I have?
Starting point is 00:21:59 Yeah. This is what, that's what that makes you do. And TikTok is, you know, astonishingly dangerous for, like, for the sanity of anybody using it. But it's not because there's some kind of Chinese ploy to try to turn every American from, like, the big buff guy that signs up to the Marines into one of, like, the sort of couch potatoes from WALL-E. It's that they're following the same economic incentives as everybody else. An American version of that app would just have about
Starting point is 00:22:25 25% more AI generated Charlie Kirk. That's the difference. And of course, the other difference is, and this is already apparently happening, it would be infinitely more Zionist. In fact, what you are doing is you're looking at what you're thinking China is doing, which is using TikTok
Starting point is 00:22:40 as a tool to advance foreign policy causes. And you're saying, well, we should be doing that. That's it. I mean, and you're, I'm sure your listeners are aware that the whole reason that this really got started was because there was too much pro-Palestinian content on it in the first place. That's what even sort of got this into nominally sort of bipartisan spheres, because there were people who were broadcasting about the genocide in Gaza. And then that sort of like
Starting point is 00:23:08 started making even Democrats go, oh, like we have to do something about this. This is this sort of content. And then that fit sort of neatly into the narrative of xenophobia around China, where they have all this nefarious control, and it's an excuse to curtail speech. And so you're exactly right. That is, you know, it's just replacing sort of black-box, algorithmically driven content from a foreign country, and then just going to do the same thing, just more catered. They're going to attempt to sort of control more of what's seen.
Starting point is 00:23:44 I mean, obviously it's going to be messy still to some extent. But it's, yeah, that this is about anything other than control and trying to establish some level of censorship is a joke. That's what it was always about. Yeah. And like now you have the special hived-off American version of the internet, because most people experience most of the internet most of the time through individual social media companies who are able to unilaterally make changes in response to political pressure. Yeah. You know, this is what Mark Zuckerberg has shown over and over again that he's willing to do. Elon Musk has shown over and over again that he's willing to do that.
Starting point is 00:24:17 Now you, like Larry Ellison, you know, his son bought CBS. It is like, it's like, okay, Bari Weiss, you're now in charge of programming at CBS. So I can't wait to, like, you know, have, like, yeah, we're going to, it's going to be Borat 2, 24 hours a day. You know, that it's, it's just more of the same. Yeah. Right. It's all of these things are the same thing happening. It's just, you know, and also you go back to every liberal who was like, oh, I'm worried about the illiberal left.
Starting point is 00:24:42 Oh, I'm worried we're not catering enough to conservative voices. Oh, I'm worried that, you know, there's, oh, all this Gaza stuff is a little bit too far. Someone should do something about that. Oh, I'm worried about the Chinese. Guess what your worry did. It created the political space for this to happen. Yeah.
Starting point is 00:24:56 And I mean, it's quite literally now we're going to have a TikTok that is under state control. I mean, Larry Ellison is one of the most loyal Trump allies going back to the first Trump term. And yeah, with his son owning CBS. And with Disney and other, you know, firms that aren't directly under his control willing to sort of immediately bow to state pressure from the FCC. We could see where this is going. I mean, it's just, it's happening right now.
Starting point is 00:25:29 This is, it's, it's pretty remarkable how, how quickly. And it's just to all be catalyzed by Charlie Kirk ultimately is, once again. Yeah, what an embarrassing small domino that is. Yeah, what I'm so, God, like, the historian writing this down, a hundred years later, like, oh, God, just like, just. This guy? Like, really? The guy with really small teeth and really big gums.
Starting point is 00:25:51 The guy? It's like... The diaper? I have not seen the diaper going around enough. Do you just... You could have been saved by gun control. Could have been saved by gum control. Oh, fucking hell.
Starting point is 00:26:01 Yeah, it was... It wasn't him in the diaper, but it was his organization that deployed the man in the diaper. But before we move on, though, right? I go back to like this. So they say the House Select Committee on China says that any deal between Beijing and Washington must require TikTok to be divested from Chinese ownership
Starting point is 00:26:18 saying it wouldn't be in compliance if the algorithm is Chinese. And this is the thing that's been sticking with me is, like, I don't think that there is any clear idea of what the algorithm is. And I recall, oh, I can't recall who exactly it was, but one of these sort of conservative wingnut members of the House wanted to get firm commitments from some of the tech CEOs
Starting point is 00:26:42 It wasn't Greg Steube, it was someone else at that hearing, to promise to remove all algorithms. Amazing. Someone's been saying to me that the algorithm doesn't have any conscious idea what it's doing. They say it's a Chinese room. I say, well, that's exactly the sort of thing we're going to be getting rid of. Okay. Why does the room have to be Chinese? Yeah. Oh, I think it was actually from you, I think it was a UK minister. No, it was, sorry, you know what it was? It was Nadine Dorries. It was Nadine Dorries. Oh, finally. Bring her back. Yeah, it was Nadine Dorries, and she was talking to Microsoft, and she was like, when will you get rid of the algorithms? Because
Starting point is 00:27:14 the way that she saw it was algorithms are pieces of control that like discriminate against conservatives. She doesn't see that as like, she doesn't see it in any kind of neutral terms, which is half correct. But she's like, there should be neutral technology that doesn't have any
Starting point is 00:27:30 algorithms in it. Get rid of the algorithm. You know what? Based. That's what I'm going to say. Based. Microsoft spell check is biased against me. Basically, you know, the, um, this appears to be what is going on with the capacity to know things. Yeah, are we going to have a different version of TikTok in the UK than in the US?
Starting point is 00:27:48 Yeah, for now, it's been shown that, you know, that a government that is willing to just take that action unilaterally can just say, okay. Nigel Farage is saying, all right, well, we're going to have, we're going to have a British TikTok. We're going to have Brit talk. And, you know, it's just going to be videos of flags. The algorithm is going to show nothing but videos of St. George's crosses, like, cheap Amazon-bought flags raised on, like, lampposts in Surrey. It's going to be that, that video over and over again of Nigel Farage complaining about the milks at the hotel breakfast bar. That's all you can watch. Woke milks.
Starting point is 00:28:18 Yeah, exactly. All right. Look, I really want to get to this discussion of AI Killed My Job. So you might be saying, hey, are you going to talk about the UK-US tech deal where it looks like 150 billion pounds
Starting point is 00:28:30 of worth of private equity funding is filtering into the UK briefly, which is this is being touted as a benefit. But hey, what happens when private equity floods something with money? It gets better, right? It gets better? Yeah, every time.
Starting point is 00:28:44 Uh-huh. Yeah, this is being touted as 90 billion of this coming from Blackstone. No one quite knows how the money will be spent. But if you want to know what that money is going to do, just look at our water industry. A huge amount of private equity and sort of proprietary investment flowed into that. And now there's shit in all the rivers. So what is going to be the river shit equivalent of Blackstone's 90 billion pounds? Because for that 90 billion pounds, they want 900. They want 900 billion in a few years, right? That's how private equity works. What you're saying is, hey, I know that we've some issues with being hungry. And this bear, this living bear, contains an enormous amount of meat on its bones. So I've invited it to come live with us. So there's technically more food in the house. However, it may eat you. And the other thing by the numbers that they're doing is investing, of course, in huge amounts of data centers. Brian, again, as a tech reporter, data centers. It's good when those come to your town. Oh, that's a good thing. People love them. Everybody loves them. They create, you know, three to four good jobs.
Starting point is 00:29:44 have, they create a security card job. One technician, you wanders the halls to make sure they know, as it, as it ingests your entire local freshwater supply and sucks down electricity at sky high rates. Yeah, I mean, it's, we're going through this in the U.S. where it's the absolute madness. And there's a lot of, there's a lot of pushback where people say like, oh, well, it really isn't that bad. A lot of times it's that bad. You know, it's the easiest thing to do is to just plug it into whatever local.
Starting point is 00:30:14 lake or freshwater supply is there. And they are just using that to cool down these data centers that are, again, being built where there's no clear demand for these products at all. And yet the six, five or six biggest tech companies in the world are building as many of them as they possibly can, wherever they can. And, you know, the UK is, is enthralled by this. You know, old care loves AI. And it's all, yeah, it's expansion. And it's, you know, the thing about data center, once you built them, it's hard to justify not using them. So there's going to be, they're considered as this major sunk cost. So once they're there, it's going to be harder to get rid of them, although not impossible.
Starting point is 00:31:01 And fortunately, a lot of local groups really do hate them and have successfully shut down some of them. But yeah, that's the state of play right now. Yeah. Is this going to be my fate as an old man, being like, it's all data centers now, everywhere? You know, when I was a lad, this was a Sports Direct distribution center. Yeah, when we were a proper country. Yeah, they employed dozens of precariously employed people on zero-hours contracts. Mike Ashley would turn up and empty all the wads of 50-pound notes out of his pockets at the metal detector.
Starting point is 00:31:33 Don't get that anymore now because of woke. If Ilyas Satskever, you know, co-founder of OpenAI gets his way, there's this really remarkable. moment. I think it was Karen Howe, that's a tech journalist, uh, who wrote the great book, Empire of AI, Wood was talking to him. And he was like, almost gets wistful. He's like, yeah, in the future, the world will just be covered with data centers. Just like, no more, no more, no more people, just just, just data centers covering it all. It's, it was like legitimately his, his vision. Can I tell you what that reminds me of as I remember, because I read a lot of articles, you know, by AI boosters. Obviously, this is my job to do. Yeah. And one of the,
Starting point is 00:32:11 made the claim that, oh, you think that using AI to create art is carbon inefficient. Well, over their lifetime, a graphic designer produces way more carbon than a large language model. And it's like, I guess that's true. But so the answer is for them to die? Yeah, you kill the human graphic designers and replace them. Yeah. I remember that was one of those, I mean, all these papers in the AI world, like, they can just shit them out because they just post them right to arXiv or whatever. They don't go through peer review. So you get some of these real bonkers papers.
Starting point is 00:32:47 That was like an actual paper where they calculated the carbon life cycle of a human. We're like, yeah, we should just get rid of all the people. It's why we're carbon efficient. Says me, the slop art generator that you have replaced them with. Also, I think it's so funny, right, as well. It's like, there's hundreds of billions of pounds of investment, quote, unquote, investment. Again, asset stripping that's happening. But it's like, yeah, this is going to create seven
Starting point is 00:33:11 thousand jobs. And it's like, what? Sorry, 7,000 jobs for that many billions of pounds? It's because it's to build data centers, to buy property, to seek rents. Like, those 7,000 jobs are almost all rack installers who go away after you've installed the racks. Or security guards. And guess what? You're on a zero-hours contract and you work for G4S. And that's because the money that's actually being, quote-unquote, invested in the UK is going to buy chips manufactured in Taiwan, graphics cards manufactured elsewhere, and installed into data centers in the UK, but for the benefit of companies owned in the US. So there's going to be barely any of it staying here. What we're basically
Starting point is 00:33:45 doing is just renting out our land and our land, water, and power capacity to a bunch of third countries. And guess what the impact is going to have? Even the government says it's like, yeah, it's going to basically have like no major change. It won't do anything. Once again, the theory that the only thing you're allowed to do in Britain is be a landlord. It's fully vindicated. Yeah. But we talk about the AI being like, hey, we should just kill all the graphic designers. I want to talk about your series
Starting point is 00:34:10 AI Killed My Job. It's not just one article. It's a whole journalistic project. Before we go into some of the examples, can you just tell us about the journalistic project you're doing with AI Killed My Job? Yeah. Well, you know, listeners of this show will be well aware of all
Starting point is 00:34:26 of the tech CEOs who have sort of made the pitch for AI that it's going to cause a jobs apocalypse. It's going to, you know, replace 30% of all jobs. They're all selling enterprise AI software. You know, Anthropic's Dario Amodei has maybe been the noisiest here for a while, Sam Altman. You know, this is the message: our technology is going to just disrupt every job you can think of. It's just in OpenAI's charter: this is our project, building AGI, which we define as software that can replace, like, any economically valuable or meaningful work. So that's what they're pitching to,
Starting point is 00:35:05 you know, to investors, to Fortune 500 companies, to middle managers everywhere. And so there's been a lot of talk, you know, about what the AI jobs apocalypse will be. But it occurred to me, we're only listening to, you know, the class of people who are making these pronouncements, the executives, the C-suite. So the idea was, especially after DOGE started using this logic as part of their project of clearing out federal agencies, saying, oh, we're going to adopt an AI-first strategy. And then, you know, the AI isn't actually doing anybody's jobs, or they're using it as a justification to fire people, and sometimes explicitly so. And you'd see, like, Duolingo saying, we're AI-first now. And they
Starting point is 00:35:53 really did fire all of their human sort of content creator, not all, but a lot of them of their human translators and content creators, again, under this guise that, you know, AI is the future. And my project is to try to figure out what's actually going on. Let's demystify like AI as any kind of super intelligent force and look at it as just like a management technology, a productivity software that's being sold and deployed in these certain ways and how workers are actually experiencing that on the ground by talking to them, by hearing their stories and actually under, because I do think it goes both ways a little,
Starting point is 00:36:34 like, I mean, obviously the C-suite of the AI companies are the biggest jokes here. But I do think that there's something of a tendency on the left to say, like, well, it's all a joke and nothing's going to happen and because the software sucks. And a lot of times it does suck, but it's still sometimes it's enough for management that wanted to do all these things anyways
Starting point is 00:36:55 to make real disruptive. changes in a way that I think is historically predictable because automation technologies even if they're not great are used in in the same way in the same context a lot of times so that's the project and so for I've talked to tech workers I talked to translators and interpreters and I've talked to visual artists so free during you know that in graphic designers and illustrators as well each who are sort of experiencing AI being sort of shoved into their workplaces or their workflows or being embraced by their clients in different ways.
Starting point is 00:37:32 I think the one that I've, the story that I found, I think, the most chilling. And it's one that maybe people don't think about as much because people tend to, when they think about these kinds of jobs that get lost, they automatically think about the consumer facing ones, right? They think about copywriters who write the copy for ads they read or whatever. But one of the people you talk to is like, no, I translate instructions for nuclear power plants. Like, I translate instructions for heavy machinery.
Starting point is 00:37:56 And I'm being told, hey, you have to correct an AI's translation of the instructions for the nuclear reactor now. Good luck. You're being paid a pittance, by the way. So good luck to everyone. Yeah. Yeah. The Duolingo owl told me it was fine to put that many rods in the reactor, actually. The Duolingo owl told me what to do with fissile materials.
Starting point is 00:38:19 Yeah. I mean, again, it's horrifying in different contexts and for different reasons. One of the ones in a forthcoming installment I'm working on, and I think probably the most horrifying, is in healthcare, where hospitals have mandated that nurses and frontline healthcare workers who, you know, are dealing with patients have to turn to an AI system to diagnose patients first and foremost. They can override it, but it takes time and there's a process. So
Starting point is 00:39:08 No, you have to defer to an automated system now that is going to tell you the answer and we're going to create all these hoops you have to jump through. There are now automated suicide response like crisis hotlines at hospitals where there are people calling in like in the gravest mental state like are now being greeted by automated. systems that are telling them. And the therapists who in LA, I did an earlier piece on this too, where this is, they were on, they were on a hunger strike because of, you know, it was part of a contract negotiation. But this is one of the things that had gotten so bad that it just in purely for a cost cutting measure, it used to be a licensed, educated, experienced, uh, healthcare worker or therapist would answer that line and they can detect, you know, they can tell like somebody who's in a,
Starting point is 00:39:57 in a mental crisis, they're calling up, they're not, you know, they're not always going to, like, give you the straight answer. There are a litany of telltale signs and of, of, of sort of, you know, evasions or things that people are trained to look out for to say, this person's really in trouble. We need to get them some help right now. And now it's, now it's being dictated in a lot of cases, at least by Kaiser, uh, which is a big hospital in California and, and, and some other U.S. States, buy this automated system that is either like, oh, no, they're good. Like, let them, like, put this call outbound or, you know, whatever, they don't need, they don't need help. And a lot of the thinking is, like, there's a real chance that people
Starting point is 00:40:35 have already died because of this, uh, this adoption of, of, of AI and automated systems. So yeah, so there, that, that I think is like on the, on the, the, the extreme end of the spectrum. And it's, you know, there's different iterations of that. A lot of the tech worker stories I heard where either my boss is overzealous about this and it's an excuse to cut a whole department so me and like 500 of my colleagues at CrowdStrike were we're fired or it's like I'm a software engineer at Google and I can't believe the shit that they're just like pumping into crucial digital infrastructure because they're just odd they had this like pro AI mindset where everybody should be using AI and they're just like writing code with AI and like we can't keep
Starting point is 00:41:18 up with it there aren't enough senior engineers who are checking this stuff so it could just break crucial systems and features. And so, like, three, there are three things off the back of that, right? Number one, of all of these things that are having AI slop pushed into them, how many of those are crucial for, like, the running of modern society? How many, for example, I don't know, forms that you might fill to claim benefits might be hosted on Google Cloud or, like, other essential services, how that host on Google Cloud? What if that goes down, right?
Starting point is 00:41:47 And they can't figure out why. Second, in terms of having the AI. be your suicide, like, counselor, how many people are increasingly calling about the growing problem of AI psychosis? Yeah. It's just, it's the same thing that is tormenting you at every turn
Starting point is 00:42:04 where you go, where you want help, you pick up the phone and there fucking is again. Yeah. I love it when sexy Judy Dench talks me off the ledge, actually. That's really good. Yeah. Or sexy Judy Dench talks you on to the ledge,
Starting point is 00:42:18 and then you're like, I better call somebody for help. and then it's sexy Judy Dench at your hospital answering. You're trying to kill yourself. That's great. Here are some methods that other people have tried. Yeah, it doesn't. I do, I think there is like, and I think this is something that also doesn't get talks about a lot.
Starting point is 00:42:33 There is like great enthusiasm for AI amongst a lot of people because it is very poorly understood. And like, honestly, like, I mean, most of my like university friends have like what you might call professional jobs. And they have all told me stories about encountering people in the course of their work who do their entire job using AI. who like, oh, I don't write a single email, I get chat GPT to do it all. Isn't it amazing? And it's like, no, no, it's not good. I hear this stuff and it like,
Starting point is 00:42:59 it's like fucking like sticks their hairs up on the back of my neck. I'm like, sorry, you don't write an entire email in a day and your job is an email job. What the fuck is the point? Like, what? This is where we are headed. And why, I mean, why, yeah, exactly, why have that job? Why, why have the only jobs that can be like,
Starting point is 00:43:16 well, what is your job really? And what is it, what is it doing for, that for our society, if it can be automated with a, you know, an AI. And they're like, yeah, it's great because what I've done is I've demonstrated to my employer that my job doesn't need to be done by a person. No, that's not good. That's not good that you've demonstrated that. Do you not see how that's bad for you?
Starting point is 00:43:35 That job doesn't hurt any one if it's automated by AI, but I have the number in front of me when the Mental Health Association of San Francisco loses 80% of its state funding and lays off 200 of its 250 staffers and then says, well, AI is going to have to step in and fill the gaps because we can't have 50 people taking the calls of 250, then it's like, no, that job can't be done by AI because the other thing, the third thing I was going to say, is that in so many cases, and I think this is very true for all the creative stuff as well, it's not just the things on the page that matter, the words on the page. It's not just the words that are being said to you that matter. It's like, that's imagining that like information being
Starting point is 00:44:12 communicated between two people is a social. It wouldn't matter if it was coming from which person it comes from or who it's coming from or whether it comes from a person at all. there's just a magic series of words you need to hear that will then make you feel better as opposed to connection with a person. It's just, it's just abstracted away. Yeah, and I mean, yeah, and that's the point. And it's being done so in, you know, in the service
Starting point is 00:44:33 of sort of cutting labor costs to like the smallest degree, it's like the narrow. It's, I mean, as I point out in the latest piece, like the, you know, the promise was the AI was going to automate all of these like, you know, dull and dirty and dangerous jobs. That's always aligned with the automation theory. And it's not nearly good enough to do most of that. It can do some of those email jobs that Milo was talking about, but it could also, but
Starting point is 00:44:56 what, so what can it reliably automate? It's like, yeah, it's, it's, it's, it's, it's, it's, the production of images and art. And it doesn't matter if it's any good or not, but some corporation can save, you know, a few hundred bucks a pop by having a mid-journey subscription instead of paying artists or graphic designers and or using or using the AI instead of a paid translator or all of these jobs are things that you're creating art, translating between two languages and cultures, and actually trying to attempt to interface to create, like, this is the connective tissue that, like, makes a culture or a society. And the project of AI is actively attempting to
Starting point is 00:45:37 erode those, narrow them away. Yeah, it does, I think it feels like the, the sort of project of AI, like, sort of put in your terms is to get rid of any idea of, like, humans like, like the interaction between humans are sort of essential to kind of building anything that's worth preserving. You know, so all the things where it's just like, yeah, like, you know, you can get a computer to sort of like simulate a person to talk to you. But like, there's no sort of idea of like needing to, like for these people who advocate this to demonstrate like, okay, how is this better? Like, why is this better? Like, I was speaking, I met up a friend today who's like in a sort of in a situation where his job may, like, he may or may not be the replace of AI and like his employers are kind of like leveraging that to make him work. more. So they're not sort of like kind of saying, oh, you will be automated by AI or you won't. But it's like, oh, like a lot of the things that you did like you've been doing for like five years. Because like he works like he works like a think like a thing tank adjacent thing as like an like an editor of their sort of publication. So he can kind of say that he can kind of say that yeah, there's a reasonable chance. So they might just get like this AI to basically do like half my job. But the fact that they haven't but they're sort of constantly leveraging it is just diminishing his value. And then the result of the knock on effect.
Starting point is 00:46:48 to that is like that he has really bad relations with like his line managers. He has really bad relations with his colleagues now because it's coming out of the sense of fear and this sense of insecurity. And it's just like when he's telling me this thing, I'm just, you know, the whole, like, I keep going back to this thing about like, you know, why isn't anyone sort of saying to like the people who are kind of threatening other people with like, you know, to replace them with AI? Like, how does this actually make things better for anyone, right? Other than to sort of like, cut costs. But you're like, you're creating like a different thing. And that's the other thing too. It's like, if you're like making an.
Starting point is 00:47:18 academic journal. This is a very specific example, but I do think it translates to other things. If you are sort of making like a journal of ideas, which is what this institute is about, then like the reflection of it is supposed to be like these are ideas that are made within like this sort of think tank institution, whatever you want to call it, right? They are the product of conversations that people who work there have with like a network of, um, specialists and scholars and everything. But getting an AI involved in it, it makes it a different thing. And so many of this sort of AI examples, or like the examples of like industries are about to be disrupted or like about to sort of be like altered by AI.
Starting point is 00:47:50 The presentation is like, oh, you can continue what you're doing. We're just making it more efficient or cheaper. We're making it so that you are able to do more stuff by like us getting rid of all the boring bits. But like the truth is it's like, no, you're just like you're forcing everything to become something else, right? Yeah. And this is again, this is classic. Like this is what you would expect from an automation technology. It's being used to speed up work.
Starting point is 00:48:13 It's being used as leverage against workers where it's creating this sort of, You know, it could be, as in your friend's case, it could be like a real threat. Like, they may actually decide to use an AI instead of an editor because it's, we can do some level of copywriting or they may not. But either way, it benefits management. And this is what you get time and time again when you have a technology like AI that is an automation technology that is being deployed and controlled by capital, by management. And they are using it exclusively in these contexts. We see the same, less than full-out replacement, which can happen sometimes, especially in creative fields,
Starting point is 00:48:57 where it just is pretext to say, like, we don't need this anymore. Maybe we never did. But yeah, but to speed up work, leverage against workforces, data collection and surveillance, they can get a lot of data about people, you know, what people are actually doing by inputting it into those systems. Enterprise AI has features built into where, you know, where they're, they're collecting all that data as they would with any other productivity
Starting point is 00:49:20 software. So it has all these, as all these features that are exclusively tailored to management and it's being sort of, you know, deployed from the, from the top down with this thundering sort of mythology that like, well, it's going to happen. The jobs are all going to go away. You better get on board or not. And it's, it's a, you know, dumerism that's, you know, appealing directly to the executive class. So they're responding to it. And that's the world we're living in, where the executive class has responded to this mythology of AI and, you know, has been trying often very unsuccessfully to ram it down their organization and, and sort of extract what benefits they can. And if we want to talk about just sort of to wrap up as we're sort of coming to time, if we want to talk about like the AI industry, you never, you can never, ever, ever talk about this industry without without also recognizing that the whole thing is massively subsidized, enormously subsidized. And so when all of those models get more expensive, right, to run, right?
Starting point is 00:50:19 When compute gets more expensive for like the San Francisco Crisis Center, then what you've done is you've lost all that expertise because you've got rid of all those people. They're not training anyone new. And then even the shitty solution that you brought in to kind of look like the old solution, now it's really fucking expensive because VC money went elsewhere. So this isn't just making things worse. It's not just devaluing jobs. it's destroying whole categories of activity.
Starting point is 00:50:46 Yeah. Yeah. And it is. It's a time bomb. Yeah. I mean, it would be very, very well. I mean, it's not being price. Even people in the, like, in the business side of this say like, well, we're not pricing
Starting point is 00:50:56 things in like compute and energy costs. Like we, we're still, it's still entirely subsidized by, by VC by some of the big, you know, Google and Facebook. They can, you know, they, they, they can subsidize this, you know, for, for ages. It doesn't, you know, they don't have to produce, you know, a product that is going to, you know, make a return anytime soon. They can keep doing this open AI. I mean, that's why open AI has to keep going to soft bank and to the Saudis and making increasingly aggressive entreaties. So, yeah, I mean, open AI, I think, is probably the most exposed because it doesn't have these established business lines, you know, of ad revenue, basically, from meta and Google that are that are also competing.
Starting point is 00:51:41 here. And when it goes belly up, you know, I mean, who's to say? Who's to say? We're in this, I mean, it ties into the sort of what we were talking about at the top of the show with what's going on with like the media consolidation and the Trump, you know, that's a Trump administration loves AI, right? Like, I mean, that should tell you everything you need to know. Look at the White House Twitter feed when it's just like AI generated shit. There you've got their AI action plan. So I think one of the big questions is what happens if when the bubble starts to a burst and the Trump administration decides, you know, how much it wants to sort of preserve AI or how much it actually what it wants to do because then you could get into even more
Starting point is 00:52:21 interesting formations where you could clearly this state is no longer shy about taking stakes and companies or propping them up. So, you know, we got we got a lot of, you know, brave new frontiers in front of us here. Yeah. Well, look, Brian, a lot of people come on this show and make very well-reasoned arguments about why, you know, AI is at the heart of a brewing crisis in our economy and so on and so forth. But then equally, you know, most of the people who talk about AI make a convincing counter argument, which is, no, it won't. Yeah.
Starting point is 00:52:49 It'll be fine. Yeah. Don't worry about it. It'll be great. Anyway, look, I want to say, Brian, always a delight to have you on the show. Well, again, not a delight with what we're talking about, but a delight to talk to you in particular. Well, yeah, thanks for having me, folks.
Starting point is 00:53:03 When people want to read AI killed my job, they can go to Blood in the Machine, which will link in the show notes. Anything else you want to plug and promote? No, that's it right now. Blood in the Machine.com is the newsletter. You know, there's some interesting things going on where there's some efforts to sort of hedge against some of this stuff. My most recent one, there's what we should also point out is that there's like a kind
Starting point is 00:53:30 of like a growing and youth-led kind of movement against a lot of this shit, which is really refreshing to see. So there's a bunch of stuff going on in New York. And actually in London next next week as well, sort of the Luddite themed events and sort of protest against AI and then things that are percolating. But yeah, check out the newsletter. There's some more info on that. But it's been encouraging to see more sort of organized movements kind of getting together to oppose this stuff. So yeah.
Starting point is 00:54:02 Yeah, fantastic. Perfect. Also, some shows coming up Tuesday. Today, if we're listening to this day it comes out, I'm doing a taping warm-up show for Sentimental in London. Thursday, the 25th. I'm at the Fire Station, Oxford, doing How Revolting. 27th, London, taping for Sentimental. Few tickets left for that. 28th. I'm in Cardiff doing How Revolting. 9th of October. How Revolting in Brighton, 11th, Southampton, 16th, Sheffield, 18th, Liverpool, 19th, Manchester, 20th, Leeds. Yeah, those are some cities. More dates to come. There are so many cities. It goes on from there. those are the upcoming ones. Go to my website. Please come to the shows. Be great to have you there.
Starting point is 00:54:40 Yes. All right. Thank you, everybody. And we will see you on the premium episode in a few short days. Bye. Bye. Bye.
