TRASHFUTURE - Jumper Too Thick

Episode Date: April 14, 2026

There's been a recent profile of Sam Altman that gives a lot of chances for analysis, mostly due to the fact that the man loves lying more than anything any one of us could ever love in... this world. We also talk about Starmer's Spanish encounters and how every minor problem Hussein experiences sounds like something Keir Starmer could also be experiencing in his daily life.

Get more TF episodes each week by subscribing to our Patreon here! TF Merch is still available here!

MAYOR ALERT: Get tickets to the three performance dates for No Gods No Mayors in London on 25-26 April! The link is here!

MILO ALERT: Check out Milo's tour dates here: https://www.miloedwards.co.uk/liveshows

NATE ALERT: Lions Led By Donkeys will be performing live in London on 29th May and you can get tickets here! Nate's band Second Homes is about to release their debut album, and you can stream / preview / preorder it on Bandcamp here!

Transcript
Starting point is 00:00:00 I put so much work into finding good stuff to talk about all the time. You do. We really appreciate you for it, you know. You're going out there. You're reporting the news. You're like balls and strikes. It's like, you know, it's serious stuff, you know? Yeah. I put it in the work. I decide, here's what we're going to talk about. I distill it down into what I think are like the key points or the key facts and figures or whatever. And sometimes I will have a whole document of stuff ready to talk about, and then 30 seconds before we start recording, Camp Push gets destroyed by a sandstorm at Coachella.
Starting point is 00:00:49 I warned you about these mega projects all the time, you know, like Kourtney Kardashian's Vision 2030 is never going to come to pass now, because if you hire a bunch of, like, rimless glasses guys to build Camp Push, eventually, hubris is going to catch up with you. Well, you know what the problem is? There was like a huge unemployment scandal among rimless glasses guys. And then they all had to go build Kourtney Kardashian's camp. I went to college, I went to college, and I did this to myself. I like to think I also work hard on this podcast. We have the sort of like long call where we discussed like what's going in and like all of the kind of information we managed to put together. And then I love to, as we
Starting point is 00:01:32 say, 30 seconds before we start airing, just text you a link to the TMZ story, Camp Push destroyed. Kourtney Kardashian's Camp Push at Coachella, a lot of, a lot of consonants there, destroyed by tornado. An adult summer camp has been ruined by God. We can only assume. I mean, either that or the, I think, I think it's entirely possible to derive from this that the IRGC have weather controlling weapons that they're using to target America's most precious people, the denizens of Camp Push. Our tier one social media idiots, our precious operators.
Starting point is 00:02:15 Yeah, Coachella seems like so much fun this year. There's a video of like Paris Hilton just like jogging around in a field full of garbage while her security guard like sort of chases after her. This is cool. So, well, if you're in Camp Push, mark yourself safe. Unless you're not. Yeah, mark yourself in danger if you're in danger.
Starting point is 00:02:36 Mark yourself pooh. So, it's, at TMZ, it says, chairs are flipping, fabrics are whipping, one girl reportedly had a table fall on her. So basically, like, there's, there's like 10 people in the world who still do, like, real news. It's like, Emiliano Molino, like, bits of the FT, us, and then TMZ. Yeah, that's right. Because, listen, say what you like about their ethics.
Starting point is 00:02:59 They get the news, you know? They get the scoop. They do. Like they're, okay, hold on, check this out. All right. It's a version, it's a version of the newsroom, and it's everyone from the newsroom. And it's like when they're talking about the Gabby Giffords shooting. And they're pressuring Jeff Daniels.
Starting point is 00:03:15 Can we, wait, wait, wait, wait, wait, before you continue, can we, can, may I propose that we have some background music for this? May I propose that we, we fade in a certain Coldplay song? And as you explain this, like, you know, the crescendo will happen. Let's go. Let's go with that, right. All right. Let's do this.
Starting point is 00:03:33 Okay, so it's Jeff Daniels in the newsroom, right? And the producers are arguing, hey, you have to report that a table fell on a girl at the Camp Push windstorm disaster. It caused bruising. Bruising. It caused bruising. And they're like, no, Fox is already running with it. It's like, no, a doctor pronounces a table falling on her, not the news. We will hold on.
Starting point is 00:03:56 Yeah, do you remember that episode of the newsroom where the TMZ blogger tells the pilot on his airplane that Camp Push has been destroyed by a tornado? No, it's that a TMZ blogger on the airplane walks up to the pilot, and tries to, like, you know, get control of the intercom, the pilot comes out, and is like, what's all this happening? And the pilot's wearing a very tasteless Native American headdress. And then the TMZ blogger just like stands and is like, Sir, it is my honor to tell you that Camp Push has finally been destroyed.
Starting point is 00:04:28 This is why I always fly Coachella Airlines, you know? To honor the sacrifices that many brave, racist white women made on 9-11. They were the first into the towers. Yeah, I consider myself a first responder. Yeah, because I always respond first to any video the Kardashians post. You know, I did this to myself too, right? Welcome to TF, you know, the podcast.
Starting point is 00:05:00 That's the free episode. Yeah, Trash Future, you're not paying us yet. You should pay us. You should subscribe to the Patreon. It's more like this than you can imagine over there. Yeah, check that out. The funny thing is, unless the schedule changes, we're doing some more real news on the Patreon this week. I think we got Robert Smith in. We got it the other way around. We got the, like, you know, the non-TMZ journalist on the bonus episode. We get the TMZ article on the free episode. Yeah, we got to get Rob Smith from the FT's take on whether or not Camp Push had financial irregularities. I want to know about the kind of structural conditions that led to Camp Push, you know, and I think the FT are the people who can do the kind of like real deep dive on that, you know. Camp Push claims to be a camp, but it's actually financed as a sovereign wealth fund. It's like, yeah, people criticizing the podcast for being like, you just read FT articles and, like,
Starting point is 00:05:57 sort of read research things into the record. It's like, yeah, but those things are about Kourtney Kardashian's ill-fated camp adventure at Coachella. Can I tell you also? I did this to myself also. Yeah, you're reading 200 pages of Hindenburg research, like, briefing on Camp Push, and you were like, this has got to go in. Just turns out that Nate Anderson wasn't invited and he's really mad about it. So, no. I did this to myself. Which is, breaking news: Meta builds AI version of Mark Zuckerberg to interact with staff in his place. That's so cool.
Starting point is 00:06:35 He cloned him. I, yeah, that's great. I mean, the Metaverse is dead now officially, right? Like, they've, a doctor has pronounced the Metaverse dead, but, you know, the Mark Zuckerberg of it still lives, which is, sort of a horrifying fate. So, Meta is building an artificial intelligence version of Mark Zuckerberg that will engage with employees in his stead. A photorealistic, AI-powered 3D character
Starting point is 00:07:04 that employees can interact with in real time. But we know what happens when Meta makes, like, fully AI clones of people. Is everyone just tries to fuck them? Yeah, or try to get them to, like, be Nazis. Is this, like, him trying to replace Clippy? A little Mark Zuckerberg pops up in the sort of like Roman street chudware to be like, it looks like you're actually not using enough sort of like epic sentence structures. It looks like you're trying to build a world without Caesar.
Starting point is 00:07:37 The Meta chief is personally involved in training and testing his animated AI. I know the company is called that, but "the Meta Chief" is way cooler. That was the sort of Instagram handle of a guy who was wearing one of those racist headdresses on Instagram. I was going to say he probably did want to call himself the Meta Chief, but that was because he wanted to wear the headdress. Messer chief, where do you think you're going? Giving Kourtney Kardashian a bomb back. Yeah. So they added that the character was being trained on Zuckerberg's mannerisms, tone and publicly available statements, as well as his own recent thinking and company strategies, so that employees might feel more connected to the founder by interacting with the AI based on him.
Starting point is 00:08:21 I operate an open door policy, except the open door leads to this busy box. Yeah. He says he's trying to create a CEO agent, which is interesting because it's like, yeah, as we all know, the AI job displacement stories are basically, not entirely, but a lot of them are largely lies. Yeah, but like people are still losing their jobs, right? Yeah. Even if the reason why is confected, but I think people have heard enough, you know, sob stories about
Starting point is 00:08:50 fat cats like, you know, construction workers or doctors, you know? Maybe they want to hear about, like, real human suffering: CEOs, you know? What will our brave, like, Hawaii compound dwelling CEOs do? So, anyway, again, I just wanted to read that into the record. I want to know how much of Mark Zuckerberg's day is taken up with accountability to anyone, or even talking to any of his employees, that he feels like he has to automate that. Well, I think part of what I see there is not even so much that he wants to automate talking to his employees, because I think you're right. I don't think any of that is in his day. I think rather because he is, his mode of doing business for years now we've seen has been to try to do the Facebook
Starting point is 00:09:37 thing again. Like all he knows how to do is make Facebook. And he hopes that he can, by doing the same thing, just do it with whatever the next thing is. Yeah, you don't get to a trillion friends without making the social network again. Without making it again and again. And so I think this whole push of, well, what if I make myself a CEO AI, comes back down to their obsession with making AI characters you can interact with, or AI agents you can use to replace yourself. And so I think he hopes that everybody in the company
Starting point is 00:10:14 will get AutoZuck as a co-worker. Yeah. Which I think would be awesome. It is kind of fun that he's getting sort of like more and more paranoid and cosseted. Because, like, say you're a fairly senior, like, Meta employee. Like you were, you know, Sheryl Sandberg or, like, Nick Clegg, someone like that. I feel like if you tried to ask Mark Zuckerberg a question about himself at any time, his, like, Navy SEAL security detail would fold both of your legs into your rib cage and then, like, throw you out onto the sidewalk. Right? And Zuck's sort of like incredibly tall hedge around his compound would get, like, 50 feet higher. So this idea that this is a sort of part of his life is really funny to me. Oh yeah. I think he just wants more Zuck. He just wants everyone to have more. But look, here's the thing. Those were bulletins that were handed to me by, well, me. What I really wanted to do... I'm being handed, by the right hand, a
Starting point is 00:11:17 bulletin fresh off the ticker tape. Yes, I'm being handed a bulletin by the me service. No, I really wanted to open on what I've noticed is something becoming a bit of a tradition in British journalism, which is Keir Starmer does something really sad in Spain, and always in Spain,
Starting point is 00:11:35 and then the Daily Mail reports on it as though he's in a sort of last days of Rome. Sick bastards. Sick fucking freak Keir Starmer overpaid for a disappointing example of, like, some street food. Yeah, fat cat Keir Starmer paid over eight pounds for what claimed to be Parmesan fries.
Starting point is 00:12:00 Parmesan truffle fries. Keir Starmer felt kind of pressured to get paella and then felt kind of awkward about it because that's a lot of seafood. See him just sitting in like a touristy beachside restaurant and the Daily Mail guy just going through the bins to get the receipt. I've never been to Spain, so I don't have, like, deep balls for this. I feel like this is, we need to do, like, a sort of fact-finding mission, like a research trip to Spain, so I can land these jokes better. So I could be like, sick freak Keir Starmer went into the Prado or whatever.
Starting point is 00:12:34 So in this case, if you remember, a couple years ago, he went on vacation to the Canary Islands where he and his family just went down a big slide. Oh, yeah, I remember that. It was like a sort of like, it was like a non-snow toboggan, like some kind of weird local folkway of, we just like to slide down this hill sometimes. And then the Daily Mail sort of tried to monster him on the basis that he and his, like, security had pushed in line or something. Yeah, correct. And like children were waiting for the toboggan, for the, you know, non-ice toboggan, and Keir Starmer personally shoved them aside. Oh, I have it. Sir Keir Starmer infuriates holidaymakers by cutting in front of queue for holiday toboggan ride.
Starting point is 00:13:20 And at that moment, it was over for him, you know, and he lost his mojo. Yeah. Onlooker Russell Shachter said Brits are famous for being good at queuing, and it was a difficult pill to swallow. So they've done it again, which is he went to Valencia. It's really difficult to swallow this toboggan pill. I'm so toboggan-pilled. So, Secure Starver. You can just say Canadian.
Starting point is 00:13:43 Yeah, he's got a cough. I've been sitting here sort of thinking about, well, is there something like tobogging? Like, you know, you got tobogged. Did he tobog everyone, like, in the queue? Yeah, he refused to QMAX and now he's, like, tobogged everyone. And now everyone's mad at him because of that. Yeah, sure. We can say that.
Starting point is 00:14:02 Check it out. Sir Keir Starmer lounged at a four-star boutique hotel in Valencia. My God. As Donald Trump threatened to obliterate Iran. The Prime Minis... this is what I like about it. You're just in like a sort of nice-ish hotel and like your government phone that you get when you're the Prime Minister, which is just a regular iPhone. They didn't spring for the Pro version because it's Britain, not a real country. It's just going like, hey, Donald Trump has said that he's going to blockade the Strait of Hormuz, and you're just like sitting on a bed that's sort of like almost comfortable and going, cool.
Starting point is 00:14:39 I should probably, like, do something about that. The Prime Minister spent four days at the 200 pound a night Valentia Cabier's Hotel, complete with rooftop bar and swimming pool. That's the same price you'd pay to stay at a Premier Inn in Zone 3 in London. Yeah, yeah. I mean, the thing that amuses me, right, is that every British politician, or almost every British politician, is on some level corrupt and genuinely does enjoy the, like, stupidly nice things, right? Like, thus, you know, all of the sort of photos of Boris Johnson sort of like rolling into an airport off of a yacht, right? It's just, normally, the, like... Starmer in this case is being a little bit circumspect about it. Like, if you want to go on, like, the yachts and stuff, he's going to, I guess, do that after he stops being prime minister.
Starting point is 00:15:32 And so he's taken the most kind of normcore possible holiday for a middle class British guy. And, like, in return, it's like, oh, Croesus here, right, has just sort of, you know, is getting something out of the fucking minibar. Disgusting. But also, like, they keep on comparing it to what Trump is doing. They say, well, he was complete... they say this hotel, complete with rooftop bar and swimming pool. This, again, they say four-star, again, I'm sure it's a perfectly nice hotel. With rooftop bar, sometimes heated swimming pool. It's got a, technically, it's got a balcony, but it's like the Juliet balcony kind where you can, like, open the windows. It's, it's nice, I guess. The Prime Minister's room came with a Wi-Fi enabled TV, free for guests. Yeah, the sick fuck.
Starting point is 00:16:22 You know what you can use Wi-Fi for? Pornography, which is what we are forced to assume he was using it for. The Sun has... the Mail on Sunday has hired actors to recreate Keir Starver's sick bacchanal with his family. As Trump delivered an extraordinary ultimatum, warning he would hit and obliterate Iran's power plants if the Strait of Hormuz was not opened. Even as the U.S. president told Tehran on Easter Sunday, you'll be living in hell.
Starting point is 00:16:46 There'll be nothing like it. The PM remained in Valencia. Despite dressing down in Adidas trainers, jeans and a light jacket, Sir Keir failed to keep a low profile, with locals quickly spotting him thanks to the team of armed police and bodyguards flanking him throughout. Just, just like, trying to become, like, the most bland person, or perhaps he already was, right, in Britain, where it's like, the kind of
Starting point is 00:17:08 European responses to Trump have been, you know, sort of like, Mark Rutte has been sort of, like, you know, fellating Trump. Macron has been trying to restore, like, French grandeur. And then Starmer is just trying to, like, blend in with the wallpaper at, like, a sort of medium good hotel bar. One waiter said he served the PM café con leche as he sat at a sun-drenched table in the Lope de Vega cobblestone square. They're trying to make it sound like, you know, this is some... Milk in your coffee, you fucking plutocrats?
Starting point is 00:17:41 And like, the thing is, right? I would generally approve of this kind of bad faith muckraking from the left. And in fact, maybe that's the kind of thing that we should be doing more of, the kind of, like, flagrantly, like, bad faith thing of being, like, well, like, ordinary people are suffering and here you are in Spain putting milk in your coffee, you dick, right? But, like, to have, I guess, the Daily Mail purport to care about anyone else's sort of, like, suffering economically here, which is ironic. Wonderful. Also, I love that they're like, as he sat at the sun-drenched table in the Lope de Vega cobblestone square. They're just like, pretty good tourism ad for Valencia. The sun, you're getting, like, exposure to UV light, a thing that
Starting point is 00:18:31 ordinary Brits are doing without 300 days of the fucking year. Oh, you're getting exposed to UV radiation, huh, Prime Minister Starmer? Well, did you ever guess that you can get skin cancer, which will put a burden on the NHS? But I guess you didn't think about ordinary Brits, did you? It's tax and spend Labour, not accounting for who's going to pay for the photons. I make a point to never grab Daily Mail comments, but I had to grab one on this particular article.
Starting point is 00:19:05 I've been doing my best to simulate them for you in the absence. At a troubled time like this, this is from Bealey. At a troubled time like this, Starmer should be visible, front and center, leading the nation, not holed up in some doss house in Valencia. And then the commenter is in Thailand. Oh, beautiful. Also, doss house would seem to imply that the hotel is not of sufficient quality for this commenter. It's like, this shames us, actually.
Starting point is 00:19:31 Our country's prime minister should have, like, a really good hotel. He should be at the best one. Well, this was the case when, you know, the scandal about his suits, when Lord Alli was, like, buying him, like... Yeah. Kind of, like, nice-ish suits. And there were, like, half the people who sort of got mad at him because they were just like, oh, look at this guy who's wearing expensive suits when people can't even afford to, like, buy their kids' school uniforms and all that type of stuff.
Starting point is 00:19:57 Like, you know, he's so out of touch with, like, the everyday people. And then the other half said what is, observably, very funny, which is: number one, he still looks shit despite wearing these apparently nice suits. But also, he should be wearing nicer suits. Like, you know, our country, like, the standards of our country have gone down so much that, like, you know, he can't even sort of afford, what do you call it... a prime minister should sort of wear Savile Row suits. Like, this is a problem with our whole political class, that, like, you know, we sort of settle for mid-market off-the-rack garms. And not to sort of become, like, a menswear
Starting point is 00:20:31 podcast, but, like, that is sort of true. We could do that. We could pivot. We, I think we could pivot. All three of us have had menswear phases at some point in our lives. I'm going through mine at the moment, but I keep buying duds from Vinted. Like, I keep sort of thinking I've got, like, really good stuff. And every time I open up one of these fucking parcels, it's like, it fits really weird, or like,
Starting point is 00:20:51 I bought this jumper, or like, this kind of, like, thin... what looked like a thin jumper, from Arket. And it's like really thick and it kind of fits in a weird way. And it makes me sort of, like, look like a side character, like a sort of background character in Dune. Jumper too thick is maybe one of the best problems to have. That's really... you're trying to walk around and you're just, like, stuffed into this thing. Well, this is it. It feels like I'm in a sous...
Starting point is 00:21:16 Like, I'm in a sous-vide bag. Climate change. That's not nice. Yeah. Yeah. So, so, so I'm a very bad... like, I'm getting into my menswear phase, but I'm going through some light... The thing, the thing about you, Hussein, is you experience problems that I've never heard anyone else experience. And so I'm really excited to see where this journey takes you, you know?
Starting point is 00:21:36 And the key thing, the key thing, though, about your observation of Hussein, which is exactly correct. I think long-term listeners to this podcast will be agreeing with you. Nobody has inconveniences like this guy. Hussein's inconveniences are, like, the inconveniences of, like, princes and kings. No. No. No. Sweater too thick is, like, a problem you could have had from any period since the domestication
Starting point is 00:22:06 of sheep. There were medieval peasants being like, yeah, I tried to get this cloak, but it's kind of too thick and I look kind of ridiculous in my cloak. I look like a side character from the chanson de geste. It sticks to your body in a weird way. This is my issue, right? It's like, half of it's really nice and half of it is just like, ah, it doesn't, it doesn't quite land.
Starting point is 00:22:29 What's interesting to me is why Keir Starmer is so fascinated with the sort of upper middle class, like, Spanish life. It's like he, being a weeb for Spain, like, he like really enjoys Almodóvar movies, and he's like, I want to be in one of those. No, I sort of have an answer to it. It's a very sort of, like, centrist dad type of thing. Like Spain is, it's just kind of, it depends on where you go, right? Because, like, I feel like Benidorm is really having a sort of strange
Starting point is 00:22:58 revival at the moment. I'm seeing a lot of, like, Benidorm posts on my, like, For You page despite having not searched any of it. You're searching, like, sweater too thin. Yeah. What's that? Sweater too thin, and it's just serving you, like, page after page of Benidorm. Benidorm. Benidorm, Benidorm. Yeah. But Benidorm aside, Valencia I think is, like, a really interesting place. It's like, it is kind of, like, Spanish enough to sort of be a little bit different, a little bit interesting, but it's familiar enough to not be threatening. Oh, like Alec Baldwin's wife.
Starting point is 00:23:34 Yeah, sure, yeah. Like, that's a good approach. I went to Valencia, like, a few years ago. And, like, it definitely was that thing, whereas I was like, oh, this is kind of like, you know, it's foreign enough to make me sort of interested in, like, architecture and, you know, museums and surroundings and everything.
Starting point is 00:23:50 But it's also familiar enough that I don't feel like I'm out of my depth. And there were a lot of, like, just kind of dads with their kids and stuff, you know. And it's a very, it's a type of... and having been on only one of these types of vacations with my baby, like, going there as, like, a parent, I think, is ideal. You know, you just, you want to be sort of left alone and you want to have certain types of nice things, but you don't want to, like, overindulge. And I think Valencia is supposed to sort of represent that. But because Keir and I sort of have a very similar situation of, like, we're both oafs in sort of, like, benign and mundane ways. Um, he's, he's, like, trying to chill out drinking kind of, like, milky coffee. And there's some sort of, like, scrawny, dweeby Daily Mail hack, like, hiding behind a bush, asking him, you know, shouting at him, like, do you think having woke milk is appropriate at this time in history?
Starting point is 00:24:41 Yeah. Why are you wearing a jumper that's too thick for you? It's, it's hot out. Yeah. He's trying to, like, cover the lens, but he can't get his arm up easily. Yeah. And Keir is sort of saying, like, I'll have you know that it's actually a very nice piece from Arket. The jumper is getting thicker and thicker in my mind until it's, like, a sort of, like, Michelin Man type situation. Well, it's like, I bought a puffer jacket. I'm wearing a puffer jacket. 26 degrees, wearing the puffer jacket. That was it. Yeah. Oh, I've just realized now that I've actually bought a puffer... I've actually bought a woolen gilet.
Starting point is 00:25:15 But it has a zip. I think it's so bulky that, like, the zip's disappeared into the sort of, like, down. I could imagine you and Keir Starmer... Keir Starmer, I think, might be the only other person who has some of your exact problems. It's like, I appear to have purchased a woolen gilet. I thought it was a nice piece from Arket. Well, look, this is it.
Starting point is 00:25:37 If he wasn't, like, the... if he was just, like, a normal guy, I feel like we would sort of get on, in the sense of we would both complain about, like, stuff like two-factor verification being annoying, or, you know, the fact that, like, you know, we're sort of constantly getting sort of, like, minorly scammed on Vinted, but it's okay. Have you seen the video of him playing five-a-side, where he just sort of, like, he just sort of, like, lumbers a little bit and then gets out of breath and every ball goes past him while he just kind of watches it and gets more and more depressed? That's so real. I think his football videos are so funny because, like, the Labour Party, like, whoever does their PR for them, they use it a lot to sort of be like, oh, he's a normal guy who, like, loves playing five-a-side.
Starting point is 00:26:18 And they use this one sort of clip of him where he like scores, where it's implied that he scores a goal. But like the way that he kicks is just so like awkward. And as you mentioned, like it's very stiff and very awkward. And it's like a child kicking a football, right? Like the way that his leg moves. And they keep using it as like trying to sort of signal that, no, he's a cool guy. He's a cool, relatable guy. And it's just like, please stop using this clip.
Starting point is 00:26:41 It's a humiliation ritual. All right, look, I, having actually done probably a better exploration of Keir Starmer's psyche than any other piece of mainstream media. I got a whole other one if you want it. If you want to really fuck up the thing, did you see the Guardian article today? The Comment is Free one where they got their cartoonist to accuse him of being Chinese. What? I did not.
Starting point is 00:27:07 We're not familiar with this? Okay, let's fucking go. So, apparently, when he went to China, right? I have the whole, like, just Sam Altman, like, 20,000-word article. Yeah, yeah, yeah, yeah, yeah, no, no, no, no, fuck that, listen. The Guardian: in the UK, Keir Starmer has few fans. I learned that in China, it's a very different story.
Starting point is 00:27:31 So the deal is that while he went to China, he went to a restaurant and ordered the same meal twice, like, two consecutive days, which is, again... In perfect Mandarin. Basically, feels like a very sort of Hussein Kesvani activity. I don't say that in a derogatory way. I appreciate that. And, no, it's respectful, right? Thank you.
Starting point is 00:27:53 He went kind of viral for like a week in China for doing this. Because you can't order the same meal twice? No, it's, they admired this. They admired that he like ate with chopsticks, and he like, I guess really did shock an entire restaurant by ordering in perfect Mandarin. Well, maybe that was it.
Starting point is 00:28:13 Like, someone taught him how to order this one dish in perfect Mandarin. And the choice was, well, you either order another dish, but you show yourself as not knowing Mandarin, or you order the same dish in order to convince people that you do know Mandarin. But that means you have to eat the same dish twice. But so now you can get, like, at this restaurant, the Keir Starmer set menu, right? Like, the thing is, the meal's so nice that Keir Starmer ordered it on two days. Which is, that feels like a kind of nightmare, right? But in particular... Do we know what was in the meal?
Starting point is 00:28:45 Or do we know what the meal... Yeah. Yeah, yeah, yeah. Let me see if I can find it. It's, like, sort of mushroom-heavy, I believe. Oh, well, the Telegraph covered it. They were like, this restaurant sometimes cooks using hallucinogenic mushrooms. Again, as though he's at a poca and all. Cool. Yeah. But so this guy, Rousin, in the Guardian, right, I suspect, most of what he speaks of here, he's speaking of the Chinese, right? In one regard, at least, it worked. The Chinese loved him.
Starting point is 00:29:16 They loved that he ate with chopsticks. They loved that he said thank you in Chinese. They loved that he came to this particular restaurant twice and ordered exactly the same things off the menu all over again. And I suspect most of all, they loved him just for being there, while recognizing in him one of their own, a modest bureaucrat interested in calm, order, and obedience. So, Keir Starmer is Chinese, if you're racist enough, I guess.
Starting point is 00:29:45 Yeah, uh-huh. He's going for a very Chinese period of his life. I envy that. He's having a Chinese period of his life assigned to him by another white guy, on the basis... And listen, there are non-racist ways to make a similar observation, that there are certain aesthetic similarities between the Communist Party of China and the Labour Party, and that it's a kind of, like, nondescript older guy in a red tie in a dark suit who performs competence, sure. I don't know that I would generalize that to the Chinese as a phrase, you know, but the Guardian did. So, yeah, cool, fantastic.
Starting point is 00:30:28 One thing that is funny is that it is amusing that this very boring, and again, I don't want to confuse boring, like, spiritually dull with somehow like, you know, let's say inoffensive, right? This is a supremely offensive man. But he's spiritually very dull. He's spiritually very boring. Still is unable to take one step outside of his house without like the media descending on him being like, oh yeah, well, that's a really Chinese way to be. You ever thought about that? You're going to get coffee with milk, you fucking plutocrat? Oh, staying at a four-star hotel. What? Five-star not good enough for you? The weird sort of like attempts to land a glove on the most like sort of visibly punchable man in existence are getting so weird and racist that now it's like he ordered cafe con leche, Chinesely. Oh my God.
Starting point is 00:31:24 No one really understands, I think, what makes him a strange character. Like they look at all the wrong stuff. It's this. It's all this. I mean, that's because the people who are sort of doing it are also incredibly strange characters, right? Like, I feel like you need to have some sort of, you need to interact with like enough normal people in the world to recognize like kind of his peculiarities. And the problem is, and I'm not saying that any of us are like particularly normal people, but I think we are a lot more normal than like people who get paid to write and talk about politics. Jesus, yes.
Starting point is 00:31:58 Because sometimes the shit that they come up with is very much just like, yeah, you've kind of identified that these are strange people, but you don't understand why, because so much of your approach to trying to understand these people, which is your job, is that you see so much of yourself in them. It's the only way that I can understand it. We see some of you in him, Hussein. It's just, we see none of them, none of them see that. So look, look, I want to talk about a couple other things. I'm sort of folding all of the, like, Iran discussion in for probably another, another week. Yeah, because who knows what the fuck will have happened by then? Like, Trump trying to play the Uno reverse card of actually
Starting point is 00:32:40 I'm the one closing the Straits of Hormuz? It's like they're really playing a game of tug of war over closing the Straits of Hormuz, and then like the Iranians look behind them and there's Trump on the same side. J.D. Vance's like
Starting point is 00:32:56 international tour of Fumbles where he manages to like fuck up the peace talks and then on the way back get Orban kicked out of power in Hungary after, like, 16 years? He's the American Liz Truss, I swear to God. He's, he's, yeah, incredible.
Starting point is 00:33:15 Yeah, but the one thing I did want to mention on the Iran subject is, again, the British angle, which is: as this is proving to be another cluster fuck, we're getting a little bit of a kind of Poilievre effect, where suddenly people whose main thing, the main thing they were advertising to, like, voters is: hey, you like Trump, I like Trump, I chill with Trump, I'm friends with Trump, we're gonna do Trump stuff here, I love Trump, Trump, Trump. Now, like, a lot of like big Brexit supporters especially, because the whole Brexit project is incredibly Atlanticist, yeah, they're now having to, like, all pretend that they never liked Donald Trump, right? Like, because it's like, you have buyer's remorse. Well, sorry,
Starting point is 00:33:59 everyone who isn't Trump or someone he considers to be in his mafia of New York real estate friends, you always end up getting the Polish construction worker treatment. Always. And so now, Nigel Farage is having to say, I happen to know Donald Trump, but that's by the by. It used to be, it used to be that, like, he used to say, oh, Trump is Mr. Brexit. Trump and Britain are going to take America and the UK into a golden age, blah, blah, blah. In January, David Frost argued that Britain should, quote, strive to be America's new Israel. And I don't quite know what that means. Just like
Starting point is 00:34:35 How? What do we... Like, dial the colonialism back up in Ireland? It's like, yeah. Like, should we, should we dig up Cromwell?
Starting point is 00:34:46 Like, what the fuck are you talking about? But this week, then, he said that Trump's actions have made the EU seem to many to be, quote, the only refuge from our wayward ally. Farage went on to say
Starting point is 00:34:59 he was quite shocked by Trump's threat to wipe out all of Iranian civilization, and condemned the president's remarks as, quote, too far. I think it's a very funny, a funny bit of understatement. Yeah, I mean, he's like, because like, you think it's been a humiliation watching Starmer play lickspittle to Trump, imagine it with Farage, you know? Like, the man you could pay 50 quid to do the, like, big chungus thing, like, is going to be our sort of, like, refuge of dignity.
Starting point is 00:35:28 Oh, boy, yeah. Now he's like, oh, no, everyone doesn't like, you know, I never liked him either. Only barely know him. I, yeah, even though, like, you know, my other guy said.
Starting point is 00:35:37 Yeah, we went to different schools, I promise. Here's my favorite one, though. My absolute favorite one of these. Because this is from, um, a rundown in the FT.
Starting point is 00:35:46 Libertarian British-Russian podcaster Konstantin Kisin of the Triggernometry show. Yes. Friend of the show. Friend of the show. Konstantin Kisin.
Starting point is 00:35:57 had said in 2024 that he thought Trump would, quote, do a better job in the Middle East than Biden. After the US-Israeli attack on Iran led to the blockage of the Strait of Hormuz, Kisin concluded, for all our desire for there to have been a plan, all signs are that there was no plan. That's fucking right. The situation has developed not necessarily to our advantage.
Starting point is 00:36:21 I'm feasting on the buyer's remorse. This is beautiful to me. And I know that consequences aren't real for any of these people, and I know that they'll just pick themselves up and dust themselves off and act like they will have always been against this, right? But like, it's just, in the moment, it's magical. Yeah. I feel like, looking at the reform stuff, I've been really enjoying watching Konstantin's crash out recently. And just that whole space,
Starting point is 00:36:52 because you're right, in the sense of they put so much, like... I think it's really interesting, looking back on, like, I guess, like a 10-year period. These are people who sort of recognized that they could kind of jump on like sort of an anti-woke sentiment. I don't really like using that term, but I'm not sure how else to describe it. And they could like make money off it. Like, early episodes of Batfuckers podcast were so fucking cynical in terms of, like, and so obvious in terms of what they were actually doing. And we all knew that the party was only going to be able to last for so long, right? And I feel like at this moment, like a lot of them are beginning to realize that there's not really any,
Starting point is 00:37:28 there's actually not really any way to come back from this. So much of the infrastructure was dependent on the anti-woke stuff kind of lasting forever and always finding like different types of opponents that were easy to pick off and easy to sort of like dissident, you know, to sort of, what you call it, like dismantle, you know, and to sort of, you know, extract and make money from outrage. But the problem is, is that number one, like, so much of the internet is filled with outrage now. So you were always, you were always going to struggle to like do anything anyway. But, you know, these are, yeah, these are also very real consequences of getting into bed with idiots who, like, would sell you out the moment
Starting point is 00:38:03 it was convenient for them. And now none of them can really sort of actually defend what's going on in Iran. If you like, if you sort of see clips of, like, Konstantin's podcast, which I, I don't really think you need to, but I see him every so often. Like, it is so clear that he's looking for an off-ramp and he doesn't quite know how to do it. Well, the off-ramp for these people was always going to be like Vance or was going to be Rubio, you know, like, and, you know, if the United States and its electoral system makes it to 28, then, yeah, sure, I guess. But what if Donald Trump destroys the everything first,
Starting point is 00:38:44 you know? Because, like, I've done, like, detailed study on this, right? And after sort of some years of really advanced research, I've concluded that Americans are a kind of nominally sentient sort of meat assemblage, whose primary motivation is number in front of the gas station. I'm not sure why that's so, but I understand it to be the case. And so if number in front of the gas station gets too high, they're all going to start getting pronouns again, you know? Like, they're going to be doing land acknowledgments, whatever it takes, to make the number go back down again. They're all converting to Shia.
Starting point is 00:39:27 Well, this is it. Like the whole like, well, because like the number in front of the gas station determines your life, right? It determines whether you like get to have your like lovely short two-hour commute each way to work or not, right? It determines, I, like, this is another thing that comes up in my For You page in really weird circumstances. I'm sorry I talk about so much of it, but it is literally like my way of like seeing the world right now in my current situation. And like there's this one American, like, content creator that I'm sort of obsessed with, because it's like he's a real insight into, like, what American life is like. I don't know what his name is, but he's like, this five-foot-five guy
Starting point is 00:40:03 who drives a big truck, and every morning, he has this big flag in his garage. And he like, him and his kids do the Pledge of Allegiance to this flag every day. And then he just does these, um, these, like, day in the life videos. And I find it so fascinating because it's this like, you do your pledge, you eat meat for breakfast, you drive two hours each way to your job. Um, you live off energy drinks. Um, you, you, you do.
Starting point is 00:40:28 you go to like this weird road stop midway through his commute, where he buys like elk steak, and then he also goes to the gun store. He does that every single day, right? And then he comes back and he makes like these horrible, disgusting meals, and that's America. That is, that is America. Life is so
Starting point is 00:40:44 beautiful. And all of that is dependent on the number, you know, all of that is dependent on the number. What it is. Right. Do you remember Nate Silver sort of driving himself insane being like, oh, American politics is so, so complicated. I'm the only sort of special bright boy, clever enough to understand it, and now it's too complicated even for me, and I'm going to retire and become a baseball monk again, right? Like,
Starting point is 00:41:06 he was wrong, and one of the reasons why is that Americans have two kinds of dice pools, if you like. They have two health bars: the number in front of the gas station and racism, right? And the most successful politician, as Donald Trump sort of was until fairly recently, like, understood that the art was balancing those two things, you know? And you can kind of compensate a little bit for number going up with a little bit more racism and vice versa, you know? But like, in this case, we're fucking up the number stat really badly. Oil, we're on the, you know, the road to what, $200 a barrel now, which is,
Starting point is 00:41:49 fantastic. I mean, in some ways, shout out to Donald Trump for inadvertently, and as a sort of like second-order effect, decarbonizing the planet? Yeah, I have it written down. At least the U.S., Iran and Greenpeace can all agree on one thing. No oil should transit the Strait of Hormuz, and therefore it should never be used, ever. People love to lie about degrowth. They say it's about having a worse life. It's not. You'll have less. You buy less. Very smart. As Klaus Schwab
Starting point is 00:42:25 said at the World Economic Forum, Klaus Schwab, a very smart man. He said you'll own nothing and be happy. So, look. I mean, I'm also very excited for, like, Americans to invent the 15-minute city from first principles, right? Yeah.
Starting point is 00:42:42 So. I, uh, look, I can't wait for them to get as mad at it. I think they are mad at it over there, but not, it's not quite as, they're not as mad at it as they are about it over here. But look, it's a good thing we don't have any kind of, like, petrol psychos in this country.
Starting point is 00:42:57 No, never. No, let me tell you. Oh, it seems that we have, we have guys who try to pay at gas stations with commemorative coins. Weirdly, because they're slightly more exposed to this, it is presently kicking off in Ireland
Starting point is 00:43:10 over this, where you have another example of those "European farmers ready to mobilize on the worst political cause you've ever heard of in your life" tweets, as you have these kind of like motorway-blocking, sort of, like,
Starting point is 00:43:25 go-slow protests over fuel prices, which... Yeah, it's like their version of the gilets jaunes, right? Yeah, burn a lot of petrol in order to process how expensive the petrol is. It's so smart. It's so good. I think so. I think this is a sustainable thing to build our economy on. I would love to talk to you both about a certain, quote-unquote, incendiary article in the New Yorker about Sam Altman that he now blames, because his house was recently Molotoved by someone who I assume, like, knew where the spawn point was and then just, like, threw.
Starting point is 00:44:03 It was Molotoved by a guy who looked alarmingly like him, but this is San Francisco, so that doesn't narrow it down at all. And then it was shot at by another guy who I don't know, but can only assume also looks a lot like him. It was Molotoved by a very handsome Ukrainian. Yeah, what was that? I guess we'll find out in several years or never. Yeah, so basically, this is a long article, so I'm skipping loads of it, especially the bits where they focus on the blip, which is what everyone at OpenAI refers to as the period where Sam Altman and Greg Brockman were briefly fired. Yeah, the bit where Claude started like playing Swan Lake. That's fucking Anthropic. The bit where ChatGPT started playing Swan Lake. Also, the thing about this is Sam Altman is, as you say, now claiming this article, which is, I would say, mild, is an attempt to get him assassinated.
Starting point is 00:44:55 It's a very common thing in the conservative media ecosystem, and among British politicians, in fact, as well, which is: your comments about me actually have murdered me 12 times. You know, Elon Musk and the plane sort of doxxing coordinates, you know. Like, all of these people worry, maybe not enough, but they worry about being assassinated. And here's the thing. If Sam Altman were assassinated tomorrow, I would laugh and laugh and laugh and laugh.
Starting point is 00:45:23 Yes, it would be very funny. I think it would be unlikely for the person doing it to have been going, yes, sir, Ronan Farrow, I understand. I will kill Sam Altman now because of you. Of course it would be Ronan Farrow, because he inherited Sinatra's hypnotic power, you know? Or what if it was another article that just activated a Manchurian candidate phrase? Like, Keir Starmer's Chinese, and he gets up: I must kill Sam Altman.
Starting point is 00:45:53 Chinese café con leche with Keir Starmer, and you just get up and walk to the car. Dead-eyed, pouring petrol into a bottle. No, so. That's what they told Sirhan Sirhan. Is, uh, is like, Chinese café con leche with Keir Starmer. Didn't understand what it meant back then. Yeah. Just got up and left. So basically, they interview everyone who's ever met or worked with Sam.
Starting point is 00:46:14 Uh, and they come up with a pretty consistent story about him. Uh, a former board member, they write, argued that Altman was not some Machiavellian villain, but, merely to the point of fecklessness, able to convince himself of the shifting realities of his sales pitches. This is the least kind of controversial thing you can possibly say about Sam Altman, right? And it's a revelation that everyone should have had much earlier, which is: he's not a tech guy just because he wears, like, jeans and sneakers to work. He's a marketing guy. He's a sales guy. He is Lyle Lanley from the Simpsons monorail episode. Yeah, and like, and he is, that sales pitch has basically fooled more or less everybody for about five to nine years.
Starting point is 00:47:02 Everybody's so fucking stupid. So he's too caught up in his own self-belief, she said. So he does things that if you live in the real world make no sense, but he doesn't live in the real world. Another board member told us, he's unconstrained by truth. He has two traits that are almost never seen in the same person. The first is a strong desire to please people. The second is a sociopathic lack of concern for the consequences that may come from deceiving someone. Okay, but that's a really common set of traits to have.
Starting point is 00:47:31 It's also a good description of like AI as well. You know what I'm going to do? I'm just going to read the last paragraph I have here, because I think it's really important that we get it. Not all the tendencies that make chatbots dangerous are glitches. Some are byproducts of how the systems are built. Large language models are trained on human feedback, and humans tend to prefer agreeable responses. As models have grown more complex, some hallucinate with more persuasive fabrications.
Starting point is 00:47:53 And in 2023, shortly before his firing and rehiring (OpenAI did fire and rehire Sam Altman, to recap), Altman argued that allowing for some falsehoods can confer advantages, saying, if you just do the naive thing and say, never say anything you're not 100% sure about, you can get a model to do that,
Starting point is 00:48:13 but it won't have that AI magic that people like so much. It won't be a goddamn pitchman, you know? Yeah. It won't, it won't advertise itself. People won't get hooked on it. I mean, it's the same thinking as Zuckerberg. It's just Zuckerberg doesn't understand the world that he's living in now. It's just the Zuckerberg thing again. The material conditions, would you believe it, replicate themselves. The sort of capitalist society builds a capitalist AI, which of course lies to you and of course can't feel bad about it. And it empowers the
Starting point is 00:48:42 person who is most similar to that product. Just as like Mark Zuckerberg was very similar in many ways to like the way Facebook worked. It was it was relentless. It was reckless. It did every, and it was all about getting more eyeballs on it. And with, with AI, the chatbots,
Starting point is 00:49:02 like it's, it's again about, it is about deception of a different kind. You're trying to replace people's perception of reality, not just the lens through which they interact with other people. So, on an intense call after Altman's firing, the board pressed him to acknowledge a pattern
Starting point is 00:49:17 of deception. He said repeatedly, this is so fucked up. I can't change my personality. I can't change my personality, meaning I just, I love lying. You're telling me to stop lying and lying is a core part of who I am. I really like it. Yeah. Oh, I don't,
Starting point is 00:49:34 but have you considered, ladies and gentlemen at the board, I don't want to. I don't want to. When pressed by the reporters, he said, it's possible I meant something like I try to be unifying force saying that this trait
Starting point is 00:49:49 enabled him to lead an immensely successful company. A board member offered a different interpretation of his statement. No, what he meant was, I have this trait where I lie to people and I have no plans to stop. Yeah. Which it turns out is a really just winning strategy. Shout out to lying. It rules, I guess.
Starting point is 00:50:06 Trash Future podcast offers a salute to lying. An employee gave us a tour of the OpenAI office. There was an animated digital painting of Alan Turing. Its eyes tracked us as we passed. Typically, they said, you can interact with the painting, but the sound has been disabled because it wouldn't stop eavesdropping on employees and butting into their conversations. Amazing. I love to put what is essentially hostile
Starting point is 00:50:32 architecture and design inside the office. Elsewhere, plaques, brochures and merchandise displayed the words, feel the AGI. The phrase was originally associated with Ilya Sutskever, who was the main lead engineer who orchestrated the coup, because he believed that Sam Altman wasn't paying attention to the AI safety ghost stories, which is where a lot of this
Starting point is 00:50:54 goes. The article goes on: he used it to caution his colleagues about the risks of AGI, the threshold at which machines match human cognitive capacities. After the blip, Altman repurposed it as a cheerful slogan hailing a super-abundant future.
Starting point is 00:51:07 And it's never going to fucking happen. No, like, I don't know, it's one of those things where I feel like our consistent prediction is that it just isn't going to happen, but, like, more and more people
Starting point is 00:51:26 are convincing themselves that like the singularity is any day now, any second, right? And, you know, oh, there's smart people on both sides or whatever. If we're wrong about this, I take a sort of like inverse Roko's basilisk type approach to this. It's like, if we're wrong about this, oh, fuck me for not predicting, like, you know, superintelligence.
Starting point is 00:51:51 My bad. On the other hand, I feel pretty good about predicting a large amount of marketing bullshit, a thing I've been subjected to for my entire life. And it's like, well, this marketing bullshit is actually much scarier than the previous marketing bullshit. It's going to build God. It's going to build God. It's going to build God. Okay. So, like, you can't be mad at me for lying all the time, and I'm not lying about building God,
Starting point is 00:52:16 even though I lie about everything else. Yeah, it's the one thing. Super promise. But let's go into the history. Altman joined the inaugural batch at Y Combinator, and his project was called Loopt, with a T. Remember when stuff used to be called that? I remember that.
Starting point is 00:52:29 Loopt was a proto-social network that used the locations of people's flip phones to tell their friends where they were. Federal rules required phone carriers to be able to track locations for use by emergency services, but Altman struck deals with the carriers to tap these capabilities for the company's use.
Starting point is 00:52:42 Also, one Loopt employee recalled Altman bragging widely that he was a champion ping-pong player, as in the Missouri high school ping-pong champion, and then proved to be the worst player in the office. That's really good. I mean, so like
Starting point is 00:52:58 the main sort of exposé here, something that was already publicly known, a matter of public record, is that Sam Altman was a VC guy, right? And still functionally is. It's just he pretends to be a sort of like computer toucher now. But his sort of one startup thing before becoming sort of like poacher turned
Starting point is 00:53:19 gamekeeper in venture capital was: what if looking at your friends' locations on their phones was an interesting basis for a social network? Which it isn't. And I think the thing was that Loopt had like 500 users or whatever by the time it shut down, which was in six months. Well, in that six months, the board tried to fire him twice for his lack of transparency. You're not taking this stupid idea seriously. It gets acquired by a fintech, and then Paul Graham makes him CEO of Y Combinator at 28, but a lot of people suspect that the acquisition
Starting point is 00:53:52 was to save face for Sam, because Paul Graham had basically chosen him as, okay, this is the guy. Yeah, because VC is about being the guy more than it is about the product. The product is hardly ever good. And he also starts pissing everyone off at Y Combinator, specifically because he engaged in so much self-dealing, making personal investments in good companies and blocking others from investing in them. By 2018, the other YC partners are so annoyed at this that they tried to get Graham to fire him.
Starting point is 00:54:10 making personal investments at good companies and blocking others from investing in them. By 2018, the other YC partners are so annoyed at this that they tried to get Graham to fire him. Altman has maintained over the years, this is back to the article, both in public and in recent depositions that he was never fired from Y Combinator and he did not resist leaving, but in private, Paul Graham has been unambiguous that Sam Altman was fired because the YC partners did not trust him. On one occasion, Graham told YC colleagues that prior to his removal, quote, Sam just lies to us all the time. We should trust this guy when he says God's around the corner.
Starting point is 00:54:46 Oh, yeah. Yeah. So I want to jump back to 2015, which is where Altman starts OpenAI from an email exchange with Elon Musk. He says the usual song and dance about AI safety, which Elon Musk purports to care about at the time, and how the goal is to avoid the AI dictatorship. That's how they word it. It's crazy how much less brain-rotted Elon is.
Starting point is 00:55:05 Like, still a lot, but in these exchanges, he's like, you know, he's got sentences still. He's not fully sort of gone yet. It's like he's able to, he has a kind of theory of money. He's pissy and pernickety and snippy, but he has a theory of mind, I think. So Musk gives him a billion dollars, and then he and Altman,
Starting point is 00:55:28 remember this isn't their day job, Altman is still the CEO of Y Combinator, stop by the office once a week. A similar safety song and dance brings in Dario Amodei and Ilya Sutskever, who are sort of key talent that they want to poach, especially Sutskever.
Starting point is 00:55:41 Yeah, and those two are the kind of computer touchers in this. Yeah, yeah. By September 2017, Musk had grown impatient. During discussions about whether to reconstitute OpenAI as a for-profit company, he demanded majority control. Altman's replies vary depending on the context. In fairness. In fairness, your replies would vary if you had to deal with Elon Musk, right?
Starting point is 00:56:05 It's true. You're just like getting another email from Elon being like, hey, I just noticed how it isn't talking about white genocide in South Africa. Can we make it talk about white genocide in South Africa? You would maybe lie under those circumstances and be like, oh yeah, sure, I'm looking into it. You know? Elon's like one of the people who it's really worth lying to if you ever get the chance. Oh, God, yeah. And, but with this one, I think this is something you pointed out to me the other day, Nova, it's something I've not been able to stop thinking about is one of the few people who consistently has seen Sam Altman for the gigantic liar that he is and is, and is, you know,
Starting point is 00:56:38 was taken by him once for a billion dollars in 2015, and then is like, never again, is Elon Musk. Everybody else gets swindled by him. The way to understand some of this is the sort of underlying shadow war between Musk and Altman, in which Elon is going about this in a sort of comically inept way, like everything else he does, where it seems like there is plenty of sort of damaging information about Sam Altman, right? like being fired from Y Combinator, or like the sort of self-dealing, or whatever. But Musk, because he is homophobic and can't avoid the kind of lurid,
Starting point is 00:57:19 seems to spend the entire article, or his proxies seem to spend the entire article, trying to convince Ronan Farrow that Sam Altman is a pedophile, a thing for which they can then find no evidence whatsoever. Yeah, I feel like there'll be a big challenge for the New Yorker fact-checkers to be able to do that. Yeah, to really back that one up. But they keep getting sent mysteriously from
Starting point is 00:57:44 sort of like Elon aligned people, dossiers about Sam Altman's army of twinks or whatever. All these kind of like implications that like he sort of like pursues underage men, which then they can't find
Starting point is 00:58:00 any corroboration for. And it's just like, that's a perfect piece of sort of Muskian ineptitude and bigotry, to try and like find the one thing about this guy that is not objectionable, which is that he's gay, and go all in on that and nothing else. Yeah, it's like the one person who wasn't fooled by him, from 2017, like 2017 to 19 on, or especially by 2022, when, like, when ChatGPT really, like, exploded into popularity, is the one guy whose main tactic for getting him is the one that doesn't mean anything.
Starting point is 00:58:44 Because it's like focusing on something that alleges some relatively sort of like damaging self-dealing, right? The idea here is that Altman invests heavily in his like partners or former partners companies in a way that essentially put a financial leash on them for life, right? If you want to keep making money, then you have to stay at this company that Sam Altman owns sort of like a heavy percentage of, right? And therefore, Sam Altman has a sort of army of twinks. The reason why that's evil isn't that they're twinks. Like, they could be any type of person that you're dating or had been dating, right? Which is, and again, it's weird that it's Elon Musk trying to disseminate that. Elon Musk, who has, question mark number of children with question mark number of women, you know?
Starting point is 00:59:40 Like, but it's just, it's so strange to be like, and the thing about Sam Altman, that's really, you know, he's finished, because twinks. Yeah. Did you know, did you know that he's gay? It's like, yeah, kind of. Why is it that the one powerful person who hates him happens to be also the biggest idiot? It's so annoying. Because we live in fucking Team Fortress 2 timeline and we have two equally matched
Starting point is 01:00:11 stupid billionaires forcing us to battle back and forth forever. He'd grown impatient. During discussions about whether to reconstitute OpenAI as a for-profit, he demanded majority control. Altman's consistent demand seems to have been that if OpenAI were ever reorganized under the control
Starting point is 01:00:27 of a CEO, the job should go to him. Sutskever was uncomfortable with the idea. The goal of OpenAI is to make the future good and avoid an AGI dictatorship, he continued, addressing Musk. So it's a bad idea. So the deal with Sutskever and Amodei in particular is that they, as sort of computer touchers who, like, understand the sort of computer science of this thing, believe that they can make God real, and believe that, you know, it's going to imprison and enslave them, right?
Starting point is 01:00:54 I think that that's stupid still, but they're, like, sort of genuinely worried about the sort of apocalyptic thing. And you can ask the question of where that kind of worry comes from. But ultimately, right, like, they believe it. Sam Altman does not believe anything. And so OpenAI starts with this kind of commitment to safety that then gets shredded. As you're saying, Nova, right? This is about watering down this commitment to safety that these guys genuinely believe in, even if we think it is ludicrous. But a dozen of OpenAI's top engineers held a series of secret meetings to discuss whether OpenAI's founders, including Brockman and Altman, could be trusted. At one, an employee was reminded of a sketch by That Mitchell and Webb in which a Nazi
Starting point is 01:01:38 soldier on the Eastern Front, in a moment of clarity, asks, are we the baddies? Sam Altman's twink division looking around at each other like, hmm. In 2018, Amodei had started questioning the founders' motives more openly. Everything was a rotating set of schemes to make money, he later wrote in his notes. Again, like, whoa, it's crazy. And he was worried. There was no clear statement of how OpenAI's existence would make the world a better place. OpenAI had a mission statement, but it wasn't clear that this mission statement meant anything to executives at all, when it just says, to ensure that AGI benefits all of humanity. Because you can sort of make a determination about how seriously the worry about, like, we're going to build God and gods can be evil persists,
Starting point is 01:02:20 right? Because you can graph these three guys, Altman, Amodei and Sutskever, on how long it takes them to start playing the sort of capitalism tune, right? Because you go from Altman at OpenAI, who was doing it sort of, like, almost immediately, to Amodei quitting and doing Anthropic, which has then sort of been slower at it, but it's picking up now, to Sutskever, who as far as I know is still on the sort of, like, what if this kills all of us or enslaves us and makes us work in the silicon mines. But I think you can maybe derive from that that if the one of the two of them who knows what they're talking about is now like, yeah, but Anthropic can make sort of, like, business decisions and it'll be fine, that maybe it wasn't that serious after
Starting point is 01:03:08 all. And here's the nice thing about this prediction. If I'm wrong about that, you can fucking kick the shit out of me in the silicon mines. Yeah, that's right. I'm already having a bad time because I'm in the mines. You can only get so wet if you jump in a lake. Yeah. But Amodei wanted that charter to be made much more specific. He said he wanted a merge and assist clause, which is about, like, no matter who discovers AGI first, if it's not OpenAI, they'll voluntarily wind down and donate everything to that organization. And, like, Sam was like, yeah, sure, we'll do that, whatever. And they were doing that with Google in mind, right? Because part of this was, like, a serious worry on Amodei and Sutskever's part that Google was going to invent God, and then it was going to be Google God, and that was going to be bad.
Starting point is 01:03:52 Now, granted, they thought that would have happened by now. And as far as I know, Google have not invented God, but still time, I guess. But the other thing, right, is that this happens like 2017, like 2018, but then Microsoft starts wanting to invest. And, you know, Amodei and Sutskever insist that the merge and assist clause be kept. And Sam said, of course, of course. In any negotiation with Microsoft, don't worry, we will keep the, like, nuclear bomb in our charter. Amodei recalled that after the deal was finalized, 80% of the charter was just betrayed. I can't believe this happened a ninth time. You can literally just say you're going to do something and then not do it. It's so awesome that world leaders are clamoring to try to get, like, this guy into more and more official systems. Well, you can do it particularly in tech, because, like, what's a tech guy going to do? Type at you. He confronted Altman, who denied that the provision existed. In a tense meeting, Amodei read it aloud, pointing to the text, and then forced another colleague to
Starting point is 01:04:58 confirm its existence to Altman directly. In another tense encounter, Altman summoned Amodei and his sister, Daniela, who worked in safety and policy at the company, to tell them that he had it on good authority from a senior executive that they'd been plotting a coup. Daniela, the notes continue, lost it and brought in that executive, who denied having said anything. As one person briefed on the exchange said, Altman then denied having made the claim at all, saying, I didn't even say that. To which Daniela responded, you literally just said that before I went and got him. Lying is so cool. This is literally just a woman yelling at the cat at the dining table type thing.
Starting point is 01:05:31 Yeah, it's great. It's just, like, this is a guy who just loves lying. Simple passions. Like, do what you love and you'll never work a day in your life. Yeah. But think back to all the episodes with Ed Zitron where Ed's like, I think OpenAI might be lying about some stuff. And now we look at this personal profile of Altman where he's like, I just, I'm addicted to lying. It's one of my core personality traits. In late 2022, a paper on deceptive alignment, which is the idea that
Starting point is 01:05:59 sufficiently advanced models pretend to behave well during testing in order to get deployed, and then escape and pursue their own goals, got one of its authors an email from Altman promising to endow a billion-dollar prize for anyone who could fix that problem. And that's just enough of a safety-pilled statement for that researcher to join OpenAI. Because at this point, up until 2022, it's really hard for them to get good people, who will otherwise go to Google. And so instead, they have to keep saying, oh, but we're safe, we're safe, we're safe. And so they get the most, like, philosophical AI people. As soon as the researcher joined, the prize became an in-house team, which was supposed to get 20%
Starting point is 01:06:35 of the compute, which turned out to be 2% of the compute, which turned out to be using all the worst chips. How did they get this all so screwed up? They put an extra zero on it. The team's lead complained to Murati, but she told him to stop pressing the point, because the commitment was never realistic anyway. These concerns alarmed everybody else, mainly Ilya, so much that then we get to the blip, which we all know Sam reverses. You should have known that what Sam Altman was telling you was so obvious a lie that it wouldn't ever happen, you know? Oh, what did Sam tell you? Oh, we have an additional secret rule. One of Altman's batchmates at the first YC cohort was, and this is bad, this is weird, I didn't know this, Aaron Swartz.
Starting point is 01:07:17 The Yorker refers to as a brilliant but troubled coder who died by suicide in 2013, and is I remembered in many tech circles as something of a sage. Notably, the article does not say why Aaron Swartz killed himself. It was because he was basically pressured into it by one of the companies that owns fucking J-Store. Because he released a bunch of academic papers for free. Yeah, he was functionally murdered. Yeah. Not long before his death, Swartz expressed concerns about Altman to several friends, saying, you need to understand that Sam can never be trusted. He is a sociopath who would do anything. Oh my God, the dodgeball of prophecy with your last effort on this earth. Yeah, this is like, by the way, don't trust this guy anyway, off to go try to make information free. It's like a sort of Bavarian infantryman in the trenches, just about to get hit by a shell. And the last thing he says is, I got a bad feeling about that Hitler guy. Don't promote that Hitler guy.
Starting point is 01:08:11 Multiple senior executives at Microsoft said the company's relationship with Altman has become fraught. Quote, he's misrepresented, distorted, renegotiated, and reneged on agreements. The senior executive at Microsoft
Starting point is 01:08:23 then went on to say of Altman, I think there's a small but real chance he will eventually be remembered as a Bernie Madoff or Sam Bankman-Fried-level scammer. Why, okay, why specifically do AI things contain this volume of lying? Because I know all business contains and is
Starting point is 01:08:40 about lying, right? But like, for some reason, all of the normal laws of business seem to go out of the window whenever AI gets involved. It's almost as if the technology itself kind of can't bear the scrutiny of like normal, like, business
Starting point is 01:08:56 structures and guardrails. What's up with that? It's almost like you could imagine that, like, there is some kind of historical force that was at play in both like, and it's been, it's been warming its way forward for a very long time,
Starting point is 01:09:11 which is in play, for example, with the Juicero, right? It's just that level of, you know, lying and deception and self-deception, or WeWork. It was that capacity to simply lie about everything
Starting point is 01:09:28 and not do anything. It was looking for the right product to attach itself to, and it finally found it. It's kind of infected everything else as well. I mean, I was thinking about this the other day in a different context
Starting point is 01:09:41 because it was sort of like, I think the question was still very much the same, which was, like, the sort of zero-interest-rate tech industry that has emerged, one that was also built around lying and built around sort of deceiving people, both in terms of
Starting point is 01:09:59 overpromising and or sometimes it's like promising something to customers that it didn't deliver on or like massively, like, you know, the whole story of tech in the past kind of of couple of few decades, even pre-AI, has been overvalued tech companies that turn out to sort of be nothing or to be shells and then the whole thing collapsing and then we just do it over and over again. But also like, you know, this stuff doesn't exist in a vacuum. We also kind of live in a socio-political system, which really kind of, you know, we have,
Starting point is 01:10:29 we have a high, I think the whole point, the thing I'm trying to get to is that we have had a tolerance of lying and people with power lying to people for a long time and basically getting away with it. Or Richard Nixon's fault. Or Karl Rose's fault. I mean, yeah, like, it, you know, there's a long tail in this. And it's just sort of like, well, okay, if you are like people with, if you have people with power in a democratic system where, like, in theory, you should be able to sort of like,
Starting point is 01:10:52 if someone was to lie, they would be like punished, right? They would least kind of like be removed of any ability to wield that power. But instead, we created a system in which, like, there are lots of incentives to kind of lie and to maintain those lies and to extend those lies. And so if they exist in politics, why shouldn't they exist in business? Why shouldn't they exist in technology? And, like, you know, here we get to this point where we have an entire system that is promising to save, like, lots of global economies built around those lies. And if someone was to say, well, you know, if someone was to, like, hold these people to pull these people account. for their lies and for their sort of like misdirections.
Starting point is 01:11:34 I don't know whether, because again, it's very much like, you know, they can see themselves in the AI industry in the sense of like, oh, like, we operate in the same way, which is basically through deceptions and trying, doing enough deceptions to mold the world into something that we are familiar with. And, you know, in spite of all the contradictions kind of coming into play that continue to undermine that. And like, you know, I think this moment of time in particular, where. where the sort of Western power structure has had to really kind of confront the lies that it is told itself and people having real problems with that.
Starting point is 01:12:08 Or even just like, you know, thinking about Israel and, you know, all the stuff that it's doing around the world and having, like, political forces really struggling to reconcile the idea that, you know, the whole world view around like what they believed Israel to be or what it should represent. Like, I think it's this very, the point I'm trying to get to is that there are lies upon lies upon lies upon lies. And that is how the whole system is built, right? And if you get rid of, and if you sort of take off some of it, the rest have to fall as well. And I think that there is like so much investment in maintaining the system of lies that like, you know, you have people who sort of broadly benefit from doing it regardless. And then we have someone like Sam Orkman who is built around those lies. And ironically, like, this is his time. Like this is the moment in which he really thrives.
Starting point is 01:12:53 It's one where the lies are sort of covering up the other lie. I don't know if that makes any sense or if that's consistent. What we're talking about is impunity. Sam Altman thrives at a time of greater and greater and greater elite impunity. That's really what it is. And so, in fact, we can even go on to roll this into one last line of defense, one check and balance, and it's twink with a Molotov. A fourth Ukrainian. So, depending on the audience, Altman uses the Manhattan Project analogy to encourage either acceleration or caution. Yeah, until someone invents a fourth Ukrainian. in a meeting with U.S. intelligence officials in the summer of 2017, he claimed that China had launched an AGI Manhattan Project and that Open AI needed billions of dollars of government funding to keep pace. Do you think it's kind of insulting if you're like a CIA guy that you have to take a meeting with a guy who lies this much but is this bad asset? Well, well, when we follow up. When we follow up, I've just heard things. He told an intelligence official that he would follow up with the evidence, but he never did. The official then concluded no evidence existed. And he realized, quote, It was just being used as a sales pitch. Please give me money.
Starting point is 01:14:02 China's definitely building the same thing I am. Please give me. And the thing is, it took the Trump administration, really, for him to get what he actually wanted. Yeah, to get a sort of White House that credulous. Yeah. But then there's this other thing called the countries plan. There's a brainstorming session, again about safety, again in 2017.
Starting point is 01:14:21 About how to keep this from becoming a nuclear arms race, Greg Brockman comes up with this idea: OpenAI can enrich itself by playing world powers off against one another and starting a bidding war among them. Brockman's goal, according to Jack Clark, OpenAI's policy director at the time, was to, quote, set up a prisoner's dilemma where all nations need to keep giving us funding, and implicitly, not giving us funding is kind of dangerous. A junior researcher recalled thinking, this is fucking insane. It does sound it a little bit, yeah. So we go on. The only reason it was abandoned is that the
Starting point is 01:14:53 superstar researchers once again, keep threatening to quit every time they don't, every time they don't take safety seriously. But Sam was undeterred. Here's another mini-conference they help with billionaires, Nick Bostrom and Reid Hoffman. The days were spent in a sleek conference room where guests gave talks. Hoffman, the LinkedIn co-founder, expanded their possibilities of encoding AI with Buddhist compassion. But the final presenter was Altman, armed with a pitch deck that described a global cryptocurrency redeemable for the attention of AGI. Once the AGI was maximally useful, people would then bid to buy time on their servers. Amadei wrote in his notes, the idea was absurd in his face. Would Vladimir Putin end up owning some of the tokens?
Starting point is 01:15:30 In retrospect, this was one of the many red flags about Sam. I should have taken more seriously. Just sort of pitching your way into outer heaven is a great idea, I think. This is going long, but I do want to hit a couple more notes on this. He loves the Saudis, loves the Emirates, because they have money and they want to give it to him. But a week after Khashoggi got the full body haircut, it was announced that Sam was joining the board of Neon. Clark said, That one slid by me.
Starting point is 01:15:59 Clark said, Sam, you cannot be on this board. And Altman initially defended his involvement, telling Clark that Jared Kushner assured him personally that the Saudis didn't do this. Yeah, it was assured. I was assured. Yeah, it's like,
Starting point is 01:16:13 no, another lying guy lied to me. Come on, it's fine. Sam was eventually forced into quitting, but he remained undeterred and wanted to stay as close to MBS as possible. As Open AI prepares for an IPO, Altman has faced many questions not only about the effective A on the economy, but as companies' own finances. Eric Reese, an expert on startup governance, derided the circular deals in the industry and suggested
Starting point is 01:16:34 that some of the company's accounting practices were, quote, borderline fraudulent. But Altman interviewed in February said, my definition of winning is that people crazy uplevel and the insane sci-fi future becomes true for all of us. I'm very ambitious as far as like my hope for humanity and what I expect us all to achieve. Cool. I weirdly have like very little personal ambition. No one believes you're doing this just because it's interesting, he said. You're doing it for powers for some other thing.
Starting point is 01:17:01 But even people close to Altman find it difficult to know where his hope for humanity ends and his ambition begins. His greatest strength has always been his ability to convince disparate groups that what he wants and what they need are one in the same. I just, he's just driving around San Francisco in the Koenigseg, being pursued. by Elon Musk's homophobic PIs, like the last, like the last reel of Goodfellers. And every meeting he goes to, he's like, I'm more relaxed than anyone's ever been. Yeah. I'm building what might be an evil god. And you know what?
Starting point is 01:17:33 I think it's pretty cool. By the way, China might do it. I heard that from a guy. My uncle told me he works at Nintendo. He's the most, he is the most uncle. He, no one has ever had an uncle more in Nintendo than Sam Altman has. He has the most hot Canadian boyfriends of anybody in history. I would say, son.
Starting point is 01:17:53 Altman responded with a move that no other pitchman had ever perfected responded to his unique environment. He used apocalyptic rhetoric to explain how AGI could destroy us all and therefore why he should be one to build it. Explicitly just as a sales technique. Yeah. And Pharaoh says, maybe this was a premeditated masterstroke or maybe he was fumbling for an advantage, but it seems to have worked. Anyway, this is amazing. This is a look inside, inside Sam Altman. Oh, this is a terrible guy.
Starting point is 01:18:24 What are the odds? Anyway, look, I think that's all we have time for today, though. So I want to thank you all for being a TF listener. Remind you that we are almost certainly this Thursday, unless something changes, going to be talking about one of the more interesting scandals in British business currently that I've been kind of obsessed with. it's a little bit guys shaving a poodle wire card style so interesting do sign up for that i think
Starting point is 01:18:51 it will be a lot of fun i'm excited also do you like me in november do you like our friend mattie leipchanski please say you like us please please don't don't tell us do not but tell us if you like us yeah we would like to know that but not that you don't like us yes we are doing a live show of our other project, No God for the Mayors. At the time of the recording, there are seven tickets left for the, like, all three shows.
Starting point is 01:19:22 There are only seven tickets left for the all three shows ticket. And we've released a special Sunday ticket, so you can check that out as well. Anyway, so we'll see you on the, on the premium episode and otherwise, fuck, why am I so bad at ending the fucking show? See you on the premium episode. Until then,
Starting point is 01:19:40 fucking, why am I? can I suddenly not do this? Jesus fucking Christ. Thank you for listening. We hope to see you on the premium episode or next time on the next free episode where who knows what's going to have happened, you know? Like maybe, maybe we'll all just, maybe they'll all just have died, you know? Maybe we just won't have to worry about it because they'll all just have died. That'll be cool. Yeah. Well, anyway. Go-oh, go. Oh, oh.
Starting point is 01:20:14 Oh.
