TRASHFUTURE - Edict of Brainworms

Episode Date: March 7, 2023

We take an in-depth look at an article by Henry Kissinger, Eric Schmidt, and Daniel Huttenlocher about how AI is going to re-catholicise the entire world, and turn Patagonia vest nerds into a class of priests. We also turn to a stupid little startup before we get into the weighty stuff. If you want access to our Patreon bonus episodes, early releases of free episodes, and powerful Discord server, sign up here: https://www.patreon.com/trashfuture *NEW SHIRTS ALERT* We have a reprint of the Flying Lada shirt and an all-new design from our friend Phoebe Paradise available for preorder! https://www.trashfuture.co.uk/shop *WEB DESIGN ALERT* Tom Allen is a friend of the show (and the designer behind our website). If you need web design help, reach out to him here: https://www.tomallen.media/ *MILO ALERT* Check out Milo’s upcoming live shows here: https://www.miloedwards.co.uk/live-shows and check out a recording of Milo’s special PINDOS available on YouTube here! https://www.youtube.com/watch?v=oRI7uwTPJtg Trashfuture are: Riley (@raaleh), Milo (@Milo_Edwards), Hussein (@HKesvani), Nate (@inthesedeserts), and Alice (@AliceAvizandum)

Transcript
Starting point is 00:00:00 And here we are, live from the Trashfuture studio, you're listening to Riley and Brian Carrs. All right, all right, I want to start with Barber Complaints. Barber Complaints? Yeah. I have no Barber Complaints at all. You know, I think Hussein's Barber Complaints are interesting. I just want to start with like the Barber Zone. Hussein went to one of those Turkish barbers.
Starting point is 00:00:38 You make it look like they're going to give you the haircut and then they don't. No, like my experience was really nice, in the sense of like I got three hot towels, nice little Turkish coffee to go with it. Great. Like the guy's usually quite good. Don't call it a Greek coffee. They'll throw you right out. Yeah, they will.
Starting point is 00:00:52 They will throw you right out. You mean a Turkish haircut? But it's kind of, I don't know, I've been finding it very difficult to go to any barbers that I sort of pay like I think between 25 to 40 pounds for a haircut and a beard shave, right? And I'm just like, look, I don't want anything too fancy. I'm like an old guy now. I just sort of want the haircut I have, but shorter. But they just, they insist on no matter what you say, no matter what.
Starting point is 00:01:18 I keep getting fades. That's right. You were in a prison of fades. He's in the fade of family. Yeah, that's right. He's drinking and puking. But I do actually have this feeling. I don't, I don't know.
Starting point is 00:01:31 I just feel like the only thing that lots of barbers know how to do at this point is fades. So when I asked him, like, do not give me a fade. You'd be a return guy for any other hairstyle. This is true though. This is what they took from you. This is true. The barber on my street literally only does fades, only has fades in the window, but somehow in the course of about six months,
Starting point is 00:01:50 changed from Turkish barber to Chechen barber back to Turkish barber without any of the other signage or the people inside changing. That's right. We rebranded to Chechen barber. We were trying to get the ginger crew in. Can I say something else? Because like, whenever I've gone to barbers, you know, they have like a little, I don't like describing, I cannot describe the type of haircut I want to a barber.
Starting point is 00:02:12 I've never been able to do it. I feel like it's just really weird to do it. But so usually it's sort of you find a picture, right? And I was looking at like some of the kind of fades. We're going to do Ramzan Kadyrov. Yeah, exactly. I would like to look like Ramzan Kadyrov. I mean, say what you will about him.
Starting point is 00:02:26 He has a fantastic hairline. Yeah, no fade. But you go through like their look books and their look books are all like these kind of really strange 90s haircuts, right? They all have like, there are like lots of ones in the barber I went to where it was like guys with frosted tips and like, you know, and obviously like, you know, I'm not going to go for the frosted tips, you see. It's just like all four guys from Blur and you have to pick one.
Starting point is 00:02:47 But I was thinking to myself, what if I just showed this guy, but he would still just do a fade anyway? Like what if I asked for faded tips? And he was like, yeah, yeah, absolutely. I just heard the first syllable. That's it. Well, fade say no more. I know how to do that.
Starting point is 00:03:01 Yeah, I understand that you wanted a normal haircut. I've given you a 0.5 around the sides and there's nothing you can do about that. So I think that we can say that this is yet another symptom of capitalism flattening everything, in this case. That's right. Capitalism giving everything a 0.5 around the sides. Hello, everybody. It is TF.
Starting point is 00:03:20 It is Riley, Milo, Hussein and Alice. It's the free one. And before we get into all of the shucking and the jiving, the yuck-em-ups and so on, I wanted to do some announcements right off the bat. There are shirts. Oh, there are. Do you have a chest?
Starting point is 00:03:37 Would you like to cover it? That's right. Legally, you may have to. Small naturals, small naturals or no naturals at all or even unnaturals. It doesn't matter to us. You can put a shirt over them. Put a shirt on your eldritch tits if you want, you know? For God's sake, cover them up, man.
Starting point is 00:03:56 Is nothing sacred? There are three shirts, I believe. No, there are two. I'm sorry, there are only two. There is one less than you thought, but there are three options. Because we have a new shirt designed by Australian artist Phoebe Paradise. It's called What If Your Telephone Was Portable. And it's like a sort of Tomorrow's World type thing.
Starting point is 00:04:17 What about the phone? It's a fun little thing that we've done as though What If Trashfuture was around in the 80s. It's a fun shirt. I think you'll enjoy it. The artwork is very good. And also, we've reissued the Lada shirt, which you all know and love. What if the Soviet Union was shitting expensive flying Ladas, et cetera, et cetera.
Starting point is 00:04:36 That was so popular that we decided to reissue it. Both these shirts are available for pre-order, which means that we will let you order them for about a week, and then we will close the orders, and then we will order the shirts, and then the shirts will come. So all of this takes some time. So if you order them, please bear in mind that it will probably take a few weeks. And then we'll probably order a few extra.
Starting point is 00:04:56 So there might be some additional sale after that. And we also have special exclusive shirts, which are going to be on sale at our live shows. Only the live shows. So if you are in other parts of Germany, for example, Frankfurt, Hamburg, Munich, there are others, then you can come to Berlin, and you can purchase one of the exclusive live show-only shirts. Take advantage of your slightly better-than-ours public transit.
Starting point is 00:05:25 Come to Berlin. Oh, yeah. Also, the third option... Don't do that. The show is sold out, I believe. The third option of the two online shirts is the usual one, which is that you can buy both shirts together and get a bit of an additional discount on top of it.
Starting point is 00:05:37 If you're a patron, you get discounts on the shirts, look on the Patreon, the codes are on there. We've not forgotten you. It's all good. Also, are you in Perth, Australia? Fucking buy a ticket to my show. This is the kind of plug that goes at the end of the show. Ignore him.
Starting point is 00:05:52 No, no, no. Don't buy a ticket to Milo's show in Perth. You dogs in Perth. Until you hear the end of the episode, when we will be telling you that. We've been messing about for a while. It's time to get into the stuff. So, we finally...
Starting point is 00:06:06 A little bit of the UK politics first, and we're going to mostly spend time in the world of people who asked... In the world. People who drew a gun on an etch-a-sketch and then freaked themselves out. Ah, but then they freaked out so much that they shook and then the gun was gone.
Starting point is 00:06:21 They were like, oh, thank God. Oh, I hope that doesn't happen again. We need to create a global government around this etch-a-sketch. They can generate the picture of the gun. That's right. But we have to do a little bit of UK-ing at first, because I just want to say congratulations to the Conservative Party.
Starting point is 00:06:40 They finally solved Brexit. It's never going to be around again. We've agreed a Northern Ireland protocol. Yeah, we've done the Windsor framework. They wheeled Charles out to shake hands with everyone. And now we've come up with an acceptable fudge, right? Because the sticking point was that Unionists didn't want there to be a border in the Irish Sea.
Starting point is 00:07:04 And so what we've done is we've put a border in the Irish Sea, but we've painted it green and we've put a big tick on it. And we're hoping that that will convince them. That sounds good. Big tick. What could be bad about a tick? Well, Lyme disease, arguably. But other than that...
Starting point is 00:07:19 Remember, it's moments like this when it's fun to remember all of the hand-waving that took place a few years ago. And they were like, oh, blockchain will mean no border because blockchain will solve it. And all of what it came to was, I guess we have to have a border that we just agree to not enforce. Yeah, or like agree to pretend to not enforce. Like the lane is going to be green.
Starting point is 00:07:41 There'll be fewer checks, but not no checks. And so this sort of like hedge is very funny because essentially it's presented the DUP with like the central question here. Do you want to go back to Stormont? Do you want to play in the sand pit again? Or do you want to sort of exile yourself to the land of wind and ghosts and have no interest in power sharing forever? And so the people who would be most hostile to this,
Starting point is 00:08:10 which would be the European Research Group in Westminster and the DUP, have both been outplayed by Rishi Sunak to the extent that they've gone... Right, we're not saying no, but we're going to get the lawyers in and we're going to take two or three months to really go through this to make sure it says here that we're not owned. And once we've established that it says we're not owned, we're going to sign it. I love the name of the European Research Group. So yeah, we've been researching Europe.
Starting point is 00:08:36 We don't like it one bit. The more we look into it, the more we don't fucking like it. Gathering data on various bastards. What's funny to me about this, right, is the... You can notice immediately that the same kinds of media attack dogs who are now shouting about a betrayal of the ideals of Brexit or whatever, again, like, yeah, that's true. We were supposed to be one country that had no sort of...
Starting point is 00:09:01 That no other organization had sovereign claims over. Like the maximalist version that has been sold for the last five years as the only game in town has not been carried out. And the people who are like... Again, the attack dogs who've been loudly yelling about that for five years are continuing to loudly yell about it. And yet, oddly, it seems that every single columnist is no longer lining up to amplify them
Starting point is 00:09:25 and amplify their voices via the television. No, curiously. Curiously, the narrative on this one seems to be... Rishi Sunak, isn't he a clever boy? Hasn't he, like, done well to square this circle? Mainly for my own edification this week, I did a Twitter thread where I tried to encapsulate what every prevailing position on Brexit had been at different times
Starting point is 00:09:44 because you're not allowed to remember what any of the previous positions were. And it's fascinating to, like, go through the chronology of, like, when the leave campaign said, obviously we won't leave the single market, that would be insane. And then Theresa May went, well, obviously Brexit means leaving the single market, despite prior to that having been a Remainer. And then various people had their careers destroyed
Starting point is 00:10:06 by trying to envisage what leaving the single market would look like by doing things like, well, if we had a blockchain laser grid in the Irish Sea and so on and so forth, which made everyone look insane because they were. And then now you get the point where Rishi Sunak gives a speech in Northern Ireland where he says, don't you see you're in the best position because you have access to the British market and the single market.
Starting point is 00:10:25 How good is that? And it's like, no, you're supposed to believe the single market is bad. Have you forgotten about the lie that we all had to pretend to believe to make this whole project seem not completely deranged? Like, they've just started saying the quiet part loud because they've realized that no one cares anymore. Like, yeah, it's all made up. No one cares.
Starting point is 00:10:43 The public are no longer interested. We're just going to, like, stop with the charade that any of this was good or was in anyone's interests. Like, it's maddening. Yeah, the contradiction at the heart of the Tory party has finally been solved for a while. That's right, for a while. What's really funny is that, like,
Starting point is 00:11:03 we've managed to, the Conservative Party have managed to successfully contain one opponent of this and one big Brexit booster, Earth's weepiest man, Steve Baker, the Northern Ireland secretary, whose job it has been to sort of, like, parade this deal around with tears in his eyes, begging people to accept it.
Starting point is 00:11:23 And it occurs to me, right, that between Steve Baker and Alistair Jack, maybe giving the sort of, like, devolved administration ministries to just whichever asshole you feel like that day on the basis that, like, it's, like, important but not interesting or sexy. Well, you know what it is?
Starting point is 00:11:44 It's that they're too important to not give a job to, but too stupid to give an important job to. Yeah, and so now both of those have turned out to have actually quite serious constitutional implications, which is great for, like, fans of sort of middle-aged men being in over their heads, which we love on this podcast. In many ways, Alistair Jack is the perfect choice
Starting point is 00:12:05 because he has two names that are in themselves reasonably serious, but when combined, have a slightly comedic quality. Like an adult man called Alistair Jack, just rotating him in your mind. No, no. I want to move off of all of this Brexit nonsense because quite frankly, it's boring to talk about
Starting point is 00:12:23 beyond a few minutes. So I want to move instead to a start-up, a start-up company. A company that has been started up. If you live in London, you will have seen ads for this on YouTube. I enjoy his lines. London ambivalence noise is a great deal there.
Starting point is 00:12:42 If you live in or near London, you will have seen these. Is it Skull Shaver? I'm so fascinated by Skull Shaver. It's not Skull Shaver, I'm afraid. Andrew Tate's company, Skull Shaver. I see some pictures of him. He could use the Skull Shaver in Romanian prison.
Starting point is 00:13:00 No, it's not Skull Shaver. If you have long hair, that's free. Don't you buy a Skull Shaver like a rich guy and shave your hair off? Also, if you watch Channel 4, you may have seen ads for it. It is called We Are Eight. That's the digit, eight.
Starting point is 00:13:17 We are eight. What do you think it is? Milo. We are eight. What's the eight of? Legs on a spider? Is it a start-up for spiders? They put spiders in your house?
Starting point is 00:13:28 You got other insects that need eating? A little spider in there? It's a start-up for the second stage of that old lady's issue. Yeah, that old lady's issue. Yeah, Angina. Hussein, we are eight. I'm sorry to say this, but I know what it is.
Starting point is 00:13:44 I know what it is because I did a whole episode of my podcast on We Are Eight. So, yeah, I'm going to... But I can pretend I don't know, because I've actually forgotten how a lot of it works. Okay, well, that's good. We're going to skip you and go directly to Alice. We are eight.
Starting point is 00:13:59 Is this going to be like some version of one of the like 50 different weird mask-wearing councils the Venetian state governed itself by? Like the Council of the 20, the Secret Council of the Eight, we're going to do one of those. Is that it?
Starting point is 00:14:16 Yeah, the Council of 20 men. I would really enjoy if we did incorporate... You want to be on a council of number, like so badly. Oh, absolutely. And in Venice, if you were like of a certain social standing, you were probably not just a member of one, but a few.
Starting point is 00:14:32 Yeah, municipal councils. Yeah, that's where you want to be. And mostly all you did was talk about who should be on other councils. Yeah, but they were incredibly secret. And like everyone was sort of knifing each other constantly. So, it is social media
Starting point is 00:14:47 for a better world, and is promoted heavily by Rio Ferdinand. And if you just go down... That's a good sign. Go down the escalator in the tube. You will see Rio Ferdinand's face staring back at you from a We Are Eight ad. Rio Ferdinand's weirdly rectangular mouth
Starting point is 00:15:05 staring back at you. It's like social media that like, you know, claims to do some kind of mental health for your mental health. So, I'm going to actually turn to Hussein. What does... Can you tell the other children? What does We Are Eight claim to do?
Starting point is 00:15:19 So, honestly, I've sort of forgotten like a lot of how it works, but what it's supposed to do, it's supposed to be like a social media platform that is supposed to tackle hate online. So, it's supposed to be like this platform where like, you won't... Yeah, it's supposed to be like this kind of... Oh, so it's like Cockney acting like,
Starting point is 00:15:36 We Are Eight. You think you ate? Nah, we fucking ate, right? We are bastards who ate back. Rio Ferdinand explains Karl Popper's paradox of tolerance to you. Yeah, so it says that it's a platform that's free of hate,
Starting point is 00:15:50 and it also sort of like is financially... Its kind of tagline is that it also sort of puts money in your wallet. And that's sort of where I've forgotten how it works, but what I do remember is reading a lot of reviews of this feature and being like, I earned nothing from it, or worse, I earned like a pound fifty
Starting point is 00:16:10 after like four months on it, I'm never going back. Because it's... Rio Ferdinand works out how racist you are, and if you're like sufficiently not racist, you earn one pound fifty. One thing to bear in mind about Rio Ferdinand is actually like, even though he's sort of like the face of the brand,
Starting point is 00:16:23 but he doesn't actually really know how it works either. And so he's sort of been used as like marketing, and this is also part of Rio Ferdinand's like post football career, where he has like tried to kind of be a tech investor. Amazing. So I think he's invested in like a few tech companies, none of which have gone particularly well. But this is the first one where he's sort of been like
Starting point is 00:16:47 the face of the brand. What a lot of people don't know about Karl Popper is that he was actually the man with the most dilated asshole in history. It's social media for a better world. Join the We Are Eight community, and together let's add some goodness to the world of social media. It's social media with a purpose.
Starting point is 00:17:05 We give you a share of our revenue, as Hussein said, a very small share. For any advertising you choose to watch in the app from brands such as Nike, Rexona, Virgin, and more. Is it all like woke advertising? Yeah, you can pick your favorite woke ads, and you can watch them, and you know, 20p at a time, great.
Starting point is 00:17:23 It's like a picture of like a cat boy, like doing drone strikes for Raytheon, you know. And then it's like Raytheon is an equal opportunities employer. That's absolutely correct. It says, so you get a little payout, but then they forward a bunch of other money to their charity partners, some of which I looked up and do not have websites.
Starting point is 00:17:43 At least at this point. Well, they're not wasting your money on a website. They're spending all of that money on guns for children, which is where I want my money going. Guns for Tots program. That's right. You can be an eight citizen, and just be a user of the app.
Starting point is 00:18:01 You can be inspired following creators on the eight stage. You can share your world connecting with family and friends, support causes, and be rewarded. Yes, you can make upwards of 10 pounds a year. Whoa. Just by watching. How do I start? Just by watching like 10 minutes of ads every day on your phone.
Starting point is 00:18:23 This is the premise of the website Lockerz. Do you remember that back in the day? Don't. It was a website where you could like, yeah, you could like log in and like watch ads and stuff, and then you would get given points, and then you could use those points to like buy shitty stuff. And this was in like, we're talking like,
Starting point is 00:18:39 this is like MySpace era. But the difference is that this has the same premise, exactly the same premise. Uh-huh. However, it's also going to save the world, and was started by an Australian. Oh, yeah.
Starting point is 00:18:55 Fuck yeah. Sick. So you can also be. It was started by an Australian, Lex Green cell. You can earn rewards like this fucking sick Holden Commodore. You can do fucking sick burnouts. You can do so.
Starting point is 00:19:10 I would love to be a creator on We Are Eight who just does hooning videos. I just, I just found like my old notes from when we did the We Are Eight episode. And I found that the reason why Rio Ferdinand invested in this was because he wanted to become Manchester United's like executive director. And he lost that position because basically no one liked him
Starting point is 00:19:28 and they didn't really find him a particularly affable person. So then his like tech investor sort of persona has basically come about because he was basically very mad about not getting to be Manchester United's executive director. And this is how we now end up at We Are Eight. He took his revenge on the world by creating a kind of version of Instagram that gives you a pittance
Starting point is 00:19:47 while claiming a kind of global mission of salvation. So as an eight creator, you decide where your content goes. Our curated eight stage feed is the window to the world. You can share your best content with the wider eight community. I love being part of an eight community. His name is Rio and he invests in startup brands. You're ready to be an eight creator. There we go. I could be.
Starting point is 00:20:10 Well, here's something insulting. They received 13 and a half million euros in Series B funding for their mission to, quote, unite advertisers and people. So much of this is just nothing. It's just absolutely nothing. It's like preschool shit. It's like, oh, do some coloring to fill in the time.
Starting point is 00:20:28 It sounds a bit like a startup, but you know, in the same way that that sounds a bit like schoolwork, but it's just all this is, is like, I honestly like, I'm going full Elon Musk on this. I would believe that like, I got hit by a car on the way in to do this. And like my brain is sort of like spooling out extra content while I'm like dying.
Starting point is 00:20:49 And it's like procedurally generating. Oh, here's a startup. It's got fucking Rio Ferdinand. It's like going to do mental health, social media. It's like, no, that's nothing new. Venture cap guys need to listen to Trashfuture. That's why I've, because I have now heard about, so I don't even read about the startups, right?
Starting point is 00:21:07 Riley reads about the startups and Riley tells me about them. Okay. So from that extremely dumb guy perspective, I've now heard about enough startups to predict when a startup is not something you should invest in. And if it's anything to do with mental health, charity, the combination of watching ads and being rewarded, do not, I know this.
Starting point is 00:21:29 So people in venture capital should be able to know that. Like I hear about this and I'm like, there's no way that will ever make money. It will lose its investors a huge amount of money before being embarrassingly wound up as quietly as possible. That is, that is obvious to me. That's as obvious to me as like looking at a cube and someone asking me what shape is that.
Starting point is 00:21:48 I'm like, that's like, it's that obvious. So why would anyone invest millions of dollars in that? It just makes sense to me. The reason is VC people are smarter than us and that's why they have control of those hundreds of millions. That's why they're in Sushi Samba right now and we're recording a podcast in a basement.
Starting point is 00:22:09 Yeah. All of them. It's a very nice space. Better than our basement used to be. That is, that's true. It doesn't smell of sewage. The funding was led by Channel 4 Ventures. Yeah.
Starting point is 00:22:20 Maybe they should have fucking privatized them for that. No, it's the British government that is funding this. Yeah. The British government is. Okay, wait. No, now it makes sense. If there's anyone stupid enough, it would be the British government.
Starting point is 00:22:32 Yes. So the idea, that's what we're going to do. We're going to address environmental challenges by looking at ads. This is, I mean. Here's an environmental challenge, Rio Ferdinand. I've trapped you in a paper bag. How are you going to get out?
Starting point is 00:22:48 Have you seen the merchandise? Because it was genuinely one of the funniest things that I saw when I was researching this. Hit me. So the fact, like, when, when, it's worth noting that a lot of the investors in We Are Eight, or like a lot of the people involved in We Are Eight
Starting point is 00:23:02 are all just like management consultants, right? They are kind of like. A bad sign. Yeah, exactly. That is a bad sign, right? They're former like, you know, lots of different management consultants. And this is like the most kind of like,
Starting point is 00:23:12 the best way to describe it. Oh no, they've taken down their web store. The best way to describe it is, I mean, I would just type in like, We Are Eight, and it's on like the images. And I just want to get your reaction to it while I sort of explain what's going on. But if you were on The Apprentice,
Starting point is 00:23:28 or if like there was an Apprentice thing, where it's like, you have to invent your own social media network, We Are Eight would be one of the things that they would come up with. And the merchandise would be part of that. So when, yeah, so he's a,
Starting point is 00:23:40 Rio Ferdinand is wearing a hat that says The Eight. It used to be eight. The hoodie that says I rep my people. So the hat used to be, the hat used to be called, on the web store, our bespoke limited edition. There was like a kind of like, they were trying to sort of like present it
Starting point is 00:23:56 as the sort of like very high-in-demand streetwear product. Man, Rio Ferdinand is looking old. Like his mouth used to be like rectangular in a useful way. And now it's just, now it's just rectangular. It is prodigiously rectangular though. I mean, incredible.
Starting point is 00:24:15 So, We Are Eight inspires people with exclusive content around themes such as the planet, mental health, International Women's Day. And every time a user watches an ad, they are paid, donations are made to a charity, and the content will help underpin. We Are Eight is a hate-free social app
Starting point is 00:24:30 on a mission to inspire, inspire and unite millions to solve the world's biggest problems. Again, by watching ads. So it's like a Clockwork Orange thing where the more ads you watch, the less hateful you're able to be because all of the ads are like about corporate
Starting point is 00:24:42 social responsibility, Black Lives Matter. So I suspect so. Yeah, I mean, again, the thing to remember is this is basically a place for, think about sort of social media advertising in the broader sense, right? We know that a lot of it doesn't really work. It was kind of just a way to fund things
Starting point is 00:25:01 that are more or less public goods, but not run them as public goods. Yeah, it's busy. Think of it like a lot of those marketing budgets were a kind of tax paid to a company to provide a service, even then you had to pretend that you were getting something in return for it.
Starting point is 00:25:17 In this case, what we've done is we have taken the form of that, right? The form of that, but we have disconnected it from the only thing that makes it valuable, which is a bunch of people are already there. So the only people who are on We Are 8 are people who are basically just trying to make an extra 10 pounds by putting
Starting point is 00:25:36 an advert on and then going somewhere else while it plays. Yeah, management consultants. Yeah, so they say through We Are 8's sustainable ad-buying engine, advertisers... and this is, again, if you want to know the service it actually offers, right? It's not a social media platform, nothing like that.
Starting point is 00:25:53 It's that if you spend money, because We Are 8 is a B corporation, if you spend money advertising with We Are 8, then you can say that you're delivering against some kind of a sustainability goal. Ah, okay. Yeah, so you can take your marketing budget that you were going to waste anyway,
Starting point is 00:26:09 and you can put that up against the money that you're spending on sustainability, which is considering the urgency of the problems that sustainability is solving more broadly, I would say any diversion of money that is spent on sustainability that is doing anything other than, I don't know, getting cars off the road, building trains,
Starting point is 00:26:29 building seawalls at this point, is a crime against humanity. Cool. Well, that's good. Yeah, but this one has a bespoke hat. You know? Yeah. That's what it was.
Starting point is 00:26:40 Yeah, a bespoke hat. What if we put that hat on the sun? If we made a really big hat, and put it on the sun, then that would reduce the amount of, you know, heat radiation that came into the earth, and then that would solve climate change. My only question again,
Starting point is 00:26:54 and I asked this the week we covered it on the other show, which was like, what does The 8 stand for? It's not clear what The 8 is for. Why is it called 8? I don't understand, because it was like, Oh, that's one of those like central mysteries of the thing. We are 8.
Starting point is 00:27:09 Like, if it's supposed to be like an anti-hate social platform. Shouldn't it be We Are 8? Yeah, exactly, right? It's agnostic relation. It doesn't make any sense. Where does The 8 come from? Why is it there? Rio Ferdinand, if you're listening.
Starting point is 00:27:21 If you're listening to this podcast, because you're looking at what you want to invest in, like, please let us know what The 8 is for, please. Let us know what The 8 is for. I'm going to call it 8. Doesn't mean anything. People are going to wonder what it means. Before we move on to our sort of main dish for the day,
Starting point is 00:27:37 and yes, we will be talking about sort of Elon Musk's, let's say, dramatically terrible Tesla Investor Day in the bonus episode this week. I want to read you the script from a We Are 8 ad that I found on Fiverr. How much are you going to pay me to listen to this? I will pay you one-fifth of the Patreon. Oh, okay.
Starting point is 00:28:01 Sounds like a good deal. Can you throw in a bespoke hat? You will have to do a number of other things. Wait, what? No. Okay, so this is from Fiverr, and they were looking to hire a female VO artist with a British accent to voice a TV commercial poem. Hello, hello, hello. It's a poem about 8.
Starting point is 00:28:25 It's really virtual, no. You don't have to be 8 to work here, but it helps. I mean, you fit the brief. If you're not perturbed by a rectangular gub, I'll tell you it'll probably make your life a bit easier. So, here's the script, and I'm going to read it now. A simple tap on an app. Great, I love it already.
Starting point is 00:28:49 Riley, can you do this poetry slam style? We'll all click our fingers at you. I think it would distract from the important content of the poem, communicating the values of We Are 8. A simple tap on an app has a profound possibility to benefit the world with a simple capability. No, you can't. You can't do that. The capability and possibility are too close in meaning there.
Starting point is 00:29:13 So far they've said nothing. By using We Are 8 for just two minutes a day, the impact you can make will blow you away. What's the meter of this poem? It keeps shifting between couplets. Boo, We Are 8. Money to the planet, money to charity, money to you to bring unity and prosperity.
Starting point is 00:29:35 Even I could see that didn't scan. It's also slow, right? Unless you're pronouncing it really Americanly. Prosperity? Yeah, but they asked for a British accent. They asked for British. No, no, it's prosperity. To bring unity and prosperity.
Starting point is 00:29:51 We are looking for an ordinary Irish VO artist to read out this lovely poem about our new venture, We Are 8. Is anyone interested in reading out this new poem about We Are 8? I think there is another Irish one here. Riley's dying. We're social with purpose, so download the app and feel empowered with the amount you give back. That doesn't rhyme.
Starting point is 00:30:19 Again, that's a slant rhyme. It's not. Empowered with the amount you give back. Before we leave off, their CEO, Sue Fennessey, is a prolific blogger. Sue Fennessey is a great name. She loves to blog about changing the world, things of that nature. And has blogged.
Starting point is 00:30:41 Our platform and movement. Yes, a movement of people getting a ping to watch an ad. It's a movement of management consultants who are paid to be there. We love it. Our platform and movement is about changing the way we interact with each other. Social media has fostered a place where hateful communication can thrive. Unlike our place where we can watch ads. I come from a place where hate has never been allowed to thrive.
Starting point is 00:31:03 You know what's really funny about this, though, is that this comes on the same day that Ozy, Ozy Media that we've talked about, you know, late of this parish, has wound down, as in, you know, ceased operations. So we're operating sort of a one-in-one-out policy for social movements for marketing consultants. Social movements that are really kind of just websites.
Starting point is 00:31:24 Yeah. We want to restore a vision of a happier, healthier, and more connected planet, again, by the medium of watching ads. Yeah. Or via going on the computer. You can't do anything now unless it's going on the computer. On this international day of peace, we know it can sometimes feel like we're headed for a worse place than we started,
Starting point is 00:31:41 but we're asking you to join us at We Are Eight and decide together that we will not give up hope by watching ads. Cool. This is genuinely one of the stupidest things I think we've ever had on this show. I know it's a crowded field. I know we say it a lot. But like, fuck me. Like, the disparity between the stated like,
Starting point is 00:32:02 saving the world, saving the planet, we're going to revolutionise the way people relate to each other. We're going to stop online hate somehow. It's not really clear. Even as much as all of our aims are extremely abstruse and none of them can really be achieved by watching ads, this really is the aim which has nothing to do with watching ads whatsoever. Like, at least with the climate thing,
Starting point is 00:32:23 some of the money from the ads that are watched, you know, goes to Hugo's startup which gives books to trees. But, you know, it's doing something, but you know, what is any of it doing about hate? That's unclear other than just like, we're so committed to like, good vibes. Like, honestly, the vibes I'm sending out, they're fucking stellar. I feel like if you gave a book to a tree,
Starting point is 00:32:44 it would interpret it as a threat. See what we did with your brother? Don't be fucking dropping your deciduous leaves onto my car. All right? Or next thing you know, you're going to be fucking Mills and Boon, son. You're going to be the fucking Stormbreaker series with Alex Rider. All right? I am not fucking about.
Starting point is 00:33:07 Listen, if I see one more leaf on my car, the next thing you fucking know, you're going to be a complete anthology of Andy McNab. And I am not pissing about. All right, we're not going to get any better than that for We Are 8, right? Man threatening a tree. It's great, man. So, so.
Starting point is 00:33:28 Do you want to be Eat, Pray, Love, son? Because you are testing my patience. Do you want to find yourself pages 100 to 400 of fucking Middlemarch? Because that is where you are headed, sunshine. So, okay, okay, okay. That was We Are Eight. And more importantly, that was man threatening tree. But now we're going to, we're going to go into our final,
Starting point is 00:33:50 our final segment here, which was an article that was released in the Wall Street Journal. Something that used to be a tree, a tree that fucked up. Yeah. Do you want to have the words of Henry Kissinger, Eric Schmidt and Daniel Hootenlocker printed on you? You, tree. Daniel Hootenlocker. Sorry.
Starting point is 00:34:09 Excuse me. I am now done with man threatening tree. I want to talk about Daniel Hootenlocker. I think it's, I think it's Hootenlocker. Sorry, everyone. No, no, I'm sorry. We've renamed him. We have shoved him into the Trashfuture locker
Starting point is 00:34:22 and renamed him. He was, he was named after the German version of the film The Hurt Locker. Yeah. So this, this article is one that I think is quite revealing about how a certain class of elite is thinking about AI. It's also full of, you might say, a lot of lazy thinking, a lot of eliding of different technologies together
Starting point is 00:34:45 to make a big scary point where they basically asked a Speak & Spell to say the word gun and then got scared when it said the word gun. Yeah. This article is like 15,000 words long and me and Riley have been like sending each other insane with it, bouncing it back and forth for the last week or so. Love that.
Starting point is 00:35:03 Love that for you guys. Yeah. It's, it's been an experience. And now we get to share it all with you. So the article is entitled ChatGPT Heralds an Intellectual Revolution. And this is by Henry Kissinger, of sort of one-of-the-latter-part-of-the-20th-century's-greatest-monsters fame.
Starting point is 00:35:23 Eric Schmidt, the former CEO of Google, and Daniel Huttenlocher, or Hootenlocker as we have now renamed him, who is, again, like a prominent sort of professor of computing at MIT, but who we are less concerned about directly here. So Hootenlocker is wearing the people's German clown nose. The intro, the intro or thesis statement of this article, is as follows.
Starting point is 00:35:47 A new technology bids to transform the human cognitive process as it has not been shaken up since the invention of printing. As its capacities become broader, they will redefine human knowledge, accelerate changes in the fabric of our reality and reorganize politics and society at a fundamental level. And they spend the next 15,000 fucking words elaborating on this statement, which again is autocomplete.
Starting point is 00:36:14 It is elaborating on a big and complex enough autocomplete. Cool. And I think the thing that I keep coming back to, right at the top of this article, is the comparison of AI to printing in terms of knowledge generation. Yeah, right. Because one thing you can say about the printing press is that it did work.
Starting point is 00:36:36 Right. That's one of the key things. It didn't add things to things, typically. Yeah. It's one of the key things about it. The other thing about it is that it did also like save labor, like, legitimately, whereas a lot of sort of uses of this kind of AI, in sort of heavy air quotes, as we've seen,
Starting point is 00:36:54 have just turned out to be, like, guys. You know that this is sort of the equivalent of like pointing to a room that is like a sort of a locked scriptorium full of monks with no windows and being like, this is a printing press. This is a machine that produces illuminated manuscripts. You have to like shove some food into it now and then. But like, you know, as long as you don't look into like any of
Starting point is 00:37:16 the functions of how this works, this is a printing press. Maybe we should go back to monks. Yeah, maybe. You know, monks, they'll do it for free. They're not allowed to accept money. Maybe we should go back to monks is kind of one of the arguments that these guys end up making in a very roundabout and different way than you're talking about in a more sort of
Starting point is 00:37:36 social way, let's say. It says generative artificial intelligence, meaning sort of these algorithms that will essentially take a beginning series of values, which are usually text, and then generate what it thinks is the next logical steps of those values, whether that is going to be the next pieces of text in a string of text, an image that will follow from it, et cetera, et cetera. Generative artificial intelligence presents a
Starting point is 00:38:05 philosophical and practical challenge on a scale not experienced since the beginning of the enlightenment, the beginning of the enlightenment when all of Europe killed the rest of all of Europe as our mode of production was shifting from feudalism to bourgeois capitalism. Yeah, pick your teams now, right? Ghibellines or Guelphs, you're going to have to get in on one of these on the ground floor because pretty soon we're all going
Starting point is 00:38:29 to be cracking open each other's skulls to feast on the goo inside. That's right. Yeah. I mean, imagine if Descartes had ChatGPT. The printing press enabled scholars to replicate each other's findings quickly and share them. An unprecedented consolidation and spread of information
Starting point is 00:38:44 generated a scientific method and the medieval interpretation of the world based on faith was progressively undermined. Similarly, ChatGPT enables you to jack off to images of women that aren't real. Can you imagine what that's going to do? So the argument that's being made by Kissinger and Schmidt and Huttenlocher is that this process, the process by which knowledge was primarily gained through revelation and that
Starting point is 00:39:09 access to that revelation was controlled by a, let's say, tightly walled, sacred, one might say, cathedral-like institution that had a priestly class and was deeply bound up with the political rule by the military aristocracy, that that process of that breaking down, of that being opened up as new economic actors were empowered, as new ways of seeing the world were coterminous with that, as the knowledge itself democratized as power spread from these very small number of
Starting point is 00:39:45 warrior elites to burghers, right? And that this process by artificial intelligence... Trying to explain the early modern age to an American. Imagine a burger. Power spread to burger. This process through the process of artificial intelligence, through ChatGPT, through autocomplete, is going to be thrown into reverse.
Starting point is 00:40:06 Yeah, we'll get into this in detail, but essentially what they're saying is this is going to fuck with sort of like our foundations of knowledge so badly that it must be controlled by us. It must be controlled by a priestly class of people who will have privileged access to knowledge. And it must not be interfered with by people who can sin, which in this case is misinformation.
Starting point is 00:40:40 What we're looking at is we are looking at all of the social conventions of medieval Catholicism with none of the mystification, or rather the mystical experience attaches purely to what you know to be an algorithm, what you know to be autocomplete, what you know to be predictive text, but that has now been conceived as so gigantic that you must worship it as a God. We've encountered a lot of tech people who are trying to build
Starting point is 00:41:09 God recently on this show. Yeah, but the thing is though, there is mystery, but the mystery arises from these people becoming extremely stupid, and we'll sort of see that. They'll say as much. It's amazing to see people fail to build a working self-driving car and then immediately set about building God. It seems really putting the car before the horse.
Starting point is 00:41:31 Why don't you keep cracking on at the self-driving car for a while? That's probably simpler than God, I would say on the whole. You'd say first car, no, first autocomplete, then car, then God. Yeah, that's pretty much the obvious stages of a career to me. They say generative AI will open revolutionary avenues for
Starting point is 00:41:54 human reason and new horizons for consolidated knowledge, but there are categorical differences. Enlightenment knowledge was achieved progressively, step by step, with each step testable and teachable. Again, that's a deeply ahistorical way of viewing enlightenment knowledge. It's just it is the way that we look at it now, having really smoothed all the hard edges off of it.
Starting point is 00:42:12 It's Whig history, you know. A lot of the enlightenment was about being concerned that you're being fooled by an evil demon. People forget about that, but it was. And now we've programmed the evil demon and it lives in your phone. That's right. AI-enabled systems start at the other end.
Starting point is 00:42:32 They can store and distill a huge amount of existing information. Again, the fact that all of this information does have to exist first is not something that they deal with in the article. Of course. And in ChatGPT's case, much of the textual material on the Internet and a large number of books, billions of
Starting point is 00:42:46 items, holding and distilling that volume of information is beyond human capacity. Raising my hand here. No, it isn't. No, it isn't. And the reason why we can tell it's not beyond human capacity is because that information was held and collated and distilled by human capacity.
Starting point is 00:43:02 Yeah. I mean, is it beyond the capacity of any individual human? Sure. But that's more of a distribution problem. Go and read every book ever published. Like, you know, there is a sort of an upper limit to how much you personally can like conceive of or whatever.
Starting point is 00:43:20 Fine. Sure. I mean, you do have to wade through a lot of dinosaur erotica before you start. That's probably not what you're expecting. This is like pointing to a library and because there are more books in it than you can read in a lifetime going, this library is like sort of fundamentally unknowable.
Starting point is 00:43:39 There's shit in here which we can't even like conceive of. Yeah. It's cosmic horror. Like it's like the obelisk from 2001: A Space Odyssey. You go down to the public library and you're like, oh, my God, there's more Biff and Kipper in here than any man could read in a lifetime. And I think the argument they would make, right, is well,
Starting point is 00:43:57 unlike a public library where that information is static, all of this information is presented dynamically. But there's a very large leap between that and knowledge is now unknowable and we must return to faith in a revelation, but from the computer, basically. By what process the learning machine stores its knowledge, distills it and retrieves it remains similarly unknown.
Starting point is 00:44:20 No, it doesn't. No, it doesn't. No, it doesn't. Okay. Right. Like a large language model, right? These motherfuckers are scaring themselves with a graph. This is a complicated graph.
Starting point is 00:44:31 All it's doing is it's like understanding like patterns and like replicating patterns in a way that requires no cognition, no sort of like fucking thinking about shit or whatever. It's just trying to suggest to you what it thinks the next thing in a sequence is going to be in a slightly complicated way. And the fact that it's a large model and you may not be able
Starting point is 00:44:53 to like individually tease out what made it pick that one thing at any one time doesn't mean that it's like an unknown process. That's a known process that you just can't specify. These people are so stupid. Alice, are you suggesting to me that Henry Kissinger, the world's oldest man, doesn't have a good working understanding of computer science?
Starting point is 00:45:16 I don't know how much he was involved with this and how much it's just sort of like elder abuse at this point. But let me tell you, if any elders deserve to be abused, Henry Kissinger is one of them. Someone showed him that 8008, if you put it in the calculator and reverse it, it says boob and he got really scared. I think that's what's important to think about here, right, is to ask, well, what are these guys, the people who are
Starting point is 00:45:39 writing the article and the people who they represent, what are they setting themselves up for? Which I think is that they're setting themselves up to make the argument that this is not understandable fully, and that what we need to do is create a priestly class of graph tenders. Yeah. I mean, this was sort of like, it's a cringe version of
Starting point is 00:45:59 nuclear priesthood, which is at least a sort of interesting speculative fiction idea, right? But like, it doesn't hold up because nuclear energy is something that is very complicated and is potentially very dangerous. Whereas this is, as of yet, slightly complicated and slightly dangerous. And crucially, like, more well understood than these people
Starting point is 00:46:24 are suggesting it is. They say, AI's capacities are not static, but expand exponentially as the technology advances. And this is where, again, there is this... Speculative. Well, this is where there's the sleight of hand. The main sleight of hand in the article is that they're talking about the thing we have now, which is things like
Starting point is 00:46:42 large language models, generative AI, and so on. And they're comparing that to what you might call AGI, artificial general intelligence. That is a thing that is, for all intents and purposes, meaningfully conscious, whatever that means. And there's a lot of getting into that, by the way, that could be interestingly done at another time. There's this whole thing called the philosophy of mind.
Starting point is 00:47:01 Actually, some guys came up with it in the Enlightenment. But suffice to say, they are eliding these two things. Okay, let's say there's an evil demon. Therefore, generative AI systems have capabilities that remain undisclosed even to their inventors. And with each new AI system, they are building new capacities without understanding their origin or destination.
Starting point is 00:47:24 As a result, our future now holds an entirely novel element of mystery, risk, and surprise. Only if you build them, first of all, is the main thing. But second of all, I picked out that word mystery, too, because that's a very Christian theological concept. God contains mysteries. There are fucking Christological mysteries and stuff, which you're not meant to understand.
Starting point is 00:47:54 And in fact, your lack of understanding of them is the thing that you have to deeply ponder, if you're part of this priestly class or whatever. Great, sure, which we're going to apply to the thing that can't draw hands good on the basis that one day it might be able to draw hands good. What if that hand is giving you the finger? Yeah, and so therefore, it contains the possibility of
Starting point is 00:48:20 deep cognition of a kind we can't yet understand. Well, why would it? Well, I think this is also building on that theme of mysteries. This is what we're talking about, about the return to a religious revealed text as opposed to one that is sort of centered, you might say, in people. Instead, the source of knowledge, the epistemology, is pushed out of humanity and into the God box.
Starting point is 00:48:50 And what we're doing, really, is we're putting the Protestant God, that is, the scientific method, back into the Catholic revelation chamber. And so we have to have a mystical relationship with knowledge. Just as you would have had a mystical relationship with knowledge in, say, the 11th century, what is knowable is the things that are told to you by, say, the cardinal, the priest,
Starting point is 00:49:16 whatever level of society you are, however you interact with the source of knowledge, that is what is knowable to you. We're doing Midden's Advent again. Yon Lord hath revealed his new enterprise, and we shall commentate. Like, these people sort of like, you know, I heard the deep fake of Joe Biden talking about, you know, Bhutanese dragonweed and decided, okay, it's time to roll
Starting point is 00:49:44 back the enlightenment now. And the core contention, the other core contention of the article, in addition to AI is unknowable and we must have a priestly relationship with it because we must interact with it as unknowable eldritch mysteries. The other contention is that AI has already, or will very soon, slip out of human control as it develops and nurtures capabilities that are beyond autocomplete.
Starting point is 00:50:09 But it's making people with as many fingers as it wants. Well, this is interesting to me, because like these, it speaks to sort of a philosophy of technology, right, which is that it's going to get better, it's going to get better in an uncontrolled and uncontrollable way, because that's the way we've been doing technology for our entire lifetimes, right? And so, you know, we assume that like any possible system will behave as stupidly and as irresponsibly as we have.
Starting point is 00:50:38 And therefore, it's just an inexorable fact that like the technology just does this, if we don't do it, someone else is going to do it, China's going to do it or whatever. And also that there's no people involved, right? The best people can do is sort of like hang on to the side. It's this weird sort of like negation of responsibility and negation of labor and all of these other things that we're so familiar with from like all of the capitalism we're used to.
Starting point is 00:51:04 It's just now these guys have like got religion about it. The religion stuff to me is like really interesting only because it sort of feels like, I've sort of seen this a lot even like with AI guys and just like tech guys who are sort of like trying to use Catholic or like kind of other sort of religious aesthetics. And it feels like if you kind of look at it from the outside, it feels like they're sort of having a meltdown.
Starting point is 00:51:28 But I think like what's happening here is really just, and this article kind of is a really good example of that, trying, on the one hand, to sort of like reaffirm why these people who are kind of already elite in society should sort of remain in their positions, but also to kind of do so in a way that tries to make sense of it. And especially like with like Eric Schmidt as well,
Starting point is 00:51:53 like trying to kind of work out how to assert the importance of it, or like why this technology is necessary, despite how massively anti-human it actually is. And like all the examples that we've seen of AI are incredible ones on a very basic level. Like the advantages and the sort of like pros of it promoted by these tech guys are like, oh, you'll need fewer humans to sort of do this stuff.
Starting point is 00:52:19 And you can automate all these things, and you can make people, or you can make stuff, more productive, and they don't really elaborate on that. And I guess the problem that emerges out of that is, well, okay, if this thing is designed primarily just to eradicate the use of people, especially when it comes to building tech products, how do you justify its existence, right?
Starting point is 00:52:41 How do you justify the sort of havoc and the chaos that this would bring about if it were applied in the way that these guys would like it to be applied? How are you going to justify that? And I wonder whether that's where the kind of pseudo-fascination with religion comes in, the idea of, oh, you can't question AI as a system because it has this sort of god-like character.
Starting point is 00:53:03 And so the only way to treat it is not with suspicion or with criticism, but to entrust it to a smaller set of elites to manage it. It's almost like a rarefaction of AI as a whole, without them maybe realizing it. I don't know. AI is a way of knowing, but the thing is, like,
Starting point is 00:53:23 when you look at a concept like God, the concept of God was useful when it was last used. Yeah, no, it's useful as a self-driving car though. It is useful for a society to keep itself together, to allay people's concerns about what will happen that they can't quite answer yet. It's also useful for promoting social bonds, both horizontally and vertically.
Starting point is 00:53:48 We have our society of the orders: it was ordained by God; our oaths to one another, our relationships of tribute, those are ordained by God. It's also ordained by God that we're Christendom and not, you know, the Fiendish Turk. You know, these various things are. Yeah, he's busy giving you a fade. The Fiendish Turk is too busy giving you a fade to ever develop
Starting point is 00:54:10 generative AI. But these are useful ideas. And if you want to talk about, right, what happens when we make more people surplus to requirements? Well, we know what happens when you make more people surplus to requirements, which is what's been happening in the UK for the last 40 years, when huge swaths of the country were just consigned to managed decline, which is a polite way of
Starting point is 00:54:32 suggesting that people just die, right? But there was a contradiction there, because you were trying to do that while contending with something that descended from a basically liberal philosophy, one that asserted, sort of protestantly, the fundamental, in some ways equal, moral worth of people.
Starting point is 00:54:58 So people objected to being told to, well, just die. If you have a God, though, that you can talk to, that you control access to as a priestly class, that is much more germane to a much tighter, hierarchical, closed society, like that of, say, the high medieval period, than it would be to one where the scientific method came out as, again, when I say egalitarian, I don't mean
Starting point is 00:55:24 like actually egalitarian, but it did represent an opening of power from its closed feudal halls. What this is, is bringing back God so we can go back into the closed feudal halls, but without any of the actual mysticism. What's curious about this to me, though, is that all of these people have been very successful, very powerful people ideologically in their society.
Starting point is 00:55:49 All of them are sort of good neoliberals in their own way. It doesn't get more successful than Henry Kissinger and Eric Schmidt politically. And this is over the course of decades. And previously, their way of shifting that responsibility, and of justifying those authoritarian things, was the market, right, the line, our beloved line. And now we've found a sort of replacement for the line,
Starting point is 00:56:12 which is even more explicitly unknowable. And it just strikes me that we've ended up with this ideology in neoliberalism that's just sort of walking around looking for something to surrender to. To do all of the things that you wanted to do, that coincidentally happen to allow you to have your very nice house and everything else and immiserate everyone else. But you have to be seen to be handing over
Starting point is 00:56:36 control to something else, something that's fundamentally out of your hands, out of your control. And the ultimate expression of it is these guys going, yeah, no, it's not me putting you out of a job. It's the Etch-a-Sketch, right? And to be honest, you should not feel hostile towards me about that, because I'm really more scared of the Etch-a-Sketch than you are, because I understand it on more levels.
Starting point is 00:57:02 It's the objective and rational God that also means-tests everyone and everything. So this is also a Democrat God. I hate not being able to get into heaven because I haven't, like, opened a small business in an underfunded area for at least two years. So this actually takes us quite tidily into our next section, which goes from recreating God to the flock,
Starting point is 00:57:24 the naive user, the lamb. This says: on the receiving end, generative artificial intelligence appears superhuman or at least greatly enhanced cognitively to the naive user. This means you or me, or anyone who isn't in the priestly class, or who goes innocently trying to jack off to an image of a woman with eight hands. It seems like a supremely fast and highly articulate librarian
Starting point is 00:57:47 scholar coupled with a professorial savant. No, it doesn't. It's wrong about shit all the time. It's just confident. It's, like, confidently wrong. That's one of the things it's built to be. It's one of the things we understand about it. And it's like you only come away thinking,
Starting point is 00:58:01 Oh, this thing's so smart, if you are extremely easily fooled. And I think this is one of the central things about this: again, these are vastly important people. Everything has gone their way. They've gotten everything they want out of politics, and it has made them so much dumber, so much more insular, to the point that they are able to talk to a chatbot.
Starting point is 00:58:24 And they interview fucking GPT in this article and ask it whether it thinks it has a soul. And it goes, well, I can't really talk about that. And they go, holy shit, it's like experiencing existential doubt. Being this rich, being this successful is legitimately bad for you. It's bad for your brain. It goes back to the thing where we were getting the AI
Starting point is 00:58:48 to generate Keir Starmer speeches. And it was doing brilliantly. And then we tried to get it to generate a Trump speech and it couldn't do it. And then we realized that the AI isn't good at sounding like Keir Starmer. Keir Starmer just sounds like an AI. So the reason why they're so terrified is because when you talk to the ChatGPT,
Starting point is 00:59:05 it sounds like every politician and person that they respect. It deflects every question and talks in completely meaningless nonsense. Which they have been raised to believe is, like, the absolute zenith and summit of being a smart guy: not having any kind of substantive opinion, but just saying things like, well, I think this is a multifaceted issue. And then they're like, God damn, this thing's smart.
Starting point is 00:59:29 We've hacked liberals. If we can get into these large language models and we can tell them, okay, you've got to give confident but kind of non-committal answers in the direction of, like, socialism. You know, these people, they've got to throw up their hands and go, well, fuck it. I guess we've got to do some, like, socialism then.
Starting point is 00:59:49 Because, you know, it... ChatGPT told me that in the future all vapes would be olive flavored, and I lost my mind. Look, look, that's what God said to us. It goes on, though. All these qualities encourage unquestioning acceptance of whatever GPT generates in a kind of magical atmosphere of operation.
Starting point is 01:00:05 You're the ones who are accepting it. You're the people who are just accepting it unquestioningly. You're the people who said, well, to other, less developed minds, it would come across as an articulate librarian scholar, but it's coming across that way to fucking you. You wrote that. God damn.
Starting point is 01:00:20 But at the same time... Some old men get terrified of the computer. One excellent article. At the same time, it possesses a capability to misinform its human users with incorrect statements and outright fabrications, which is where we come back to the Catholic concept of sin.
Starting point is 01:00:35 If you interact with God poorly, if you disobey its rules, then your soul will be polluted and you will have experienced misinformation. Imagine if Donald Trump got a hold of Chat GPT. That would be really scary. I think it'd be fun to watch and play with it. What if some Macedonian teenagers hacked Chat GPT
Starting point is 01:00:55 and then it stopped Hillary Clinton becoming president like a third time? You know? I think the idea is that these people understand misinformation as the greatest possible danger to social cohesion. In so doing, they've managed to perfectly replicate
Starting point is 01:01:15 Catholic scholastic theology of the Middle Ages, where the idea is, not only is it fine that you can't read the Bible, it's like that's actually good because there is a spiritual hazard involved. If you, the naive user, start thinking about this stuff on your own without some sort of intercession,
Starting point is 01:01:35 you can badly misguide yourself. And therefore, it's necessary to have this highly educated set of doctors of the church to interpret this for you. My other example I was thinking of was also just creating a protected class of ulama. In Iran, the actual ulama
Starting point is 01:02:00 are treated as kind of modern, well, contemporary, royalty is not the right term, but a very elite protected class. Every Shia Muslim in the world has to pay, I have to pay, a portion of my income to this group of people. Because they're the elite.
Starting point is 01:02:21 And that's part of it. If you don't pay... Someone's got to kill Salman Rushdie. That's not free. Look how badly they've done it so far. It needs more money. You would think that, based on the amount of money I've given them over the years,
Starting point is 01:02:34 they would have invested at least some of it in these things. You should demand to be on an investor call for the attempt on Salman Rushdie's life. They want to create an elite ulama class that has political privileges and has lots of access but doesn't ever get questioned,
Starting point is 01:02:52 questioning them is akin to questioning God. And also, the greatest fear is that you read incorrect or falsified hadiths, right? And so, yeah, I think what they're trying to invent is an ulama of tech guys that you're not allowed to question and who should have as much political influence as possible.
Starting point is 01:03:14 Henry Kissinger is wearing the big turban, as we speak. Yeah, and they get to wear the robes. Honestly, so when I was thinking about like training to be like in sort of Islamic seminary, one of the things I was thinking about is I would just love to wear robes. You can just do that after having a shower. I can actually just wear robes.
Starting point is 01:03:32 I do like wearing my bathrobe, but I feel like I should be allowed to wear robes. You want to wear your bathrobe in a more official capacity? Yeah, yeah. But it's just very funny to think of the idea, like, oh, it's very annoying that my local graph-tender has taken a secret wife and I'm not allowed to do anything about it.
Starting point is 01:03:49 I'm telling positive interpretations of AI prompts around the town. Yeah, I mean, this is the thing: these people are so stupid and so corrupt that if they try and do this, and they try and set themselves up as AI clergy, they are going to start the AI Reformation. Within, like, two years, they're going to start it.
Starting point is 01:04:06 But also the funny thing is, right, this is in many ways the product of the scientific method, of the idea that through the application of individual rationality, we can create tools that will allow us to understand the universe better than we can understand it ourselves. And the idea that we create a tool that understands and interprets the universe so well
Starting point is 01:04:27 that we cannot ourselves understand it fully means that what's happened is that Protestantism has created the Catholic God, which has now supplanted it. Awesome. But you know what that means, right? It went Catholicism, Islam, Protestantism. That means we now have Catholicism 2.
Starting point is 01:04:48 What's next logically? It's time for Islam 2. I've been soft-launching this one for a while. Very proud to be part of the launching. That's what they're building in the big cube. That's right. There's a new cube in Saudi Arabia. It's going to be the Kaaba 2.
Starting point is 01:05:07 It's going to be even bigger. Yeah, they're investing a lot into AI, and they're, like, very into... a holy place that's genuinely enthusiastic about crypto and blockchain. This is how you get the fucking Dajjal, dude. First of all, this is so obviously idol worship. Second of all, it's so obviously false prophecy.
Starting point is 01:05:27 Third of all, we had Seamus on relatively recently and he ended up talking about fucking Qiyamah. So, like, it's going to keep coming back to this. This is now a sort of Islamic eschatology podcast. We're talking about the end of days, because it's soon. MBS dropping hints about Islam 2 the way, like, Elon Musk does about new Teslas and stuff, just being like, yeah, yeah, we're toying with seven pillars.
Starting point is 01:05:50 We're considering seven pillars. Just putting that out there. It's going to be big. Very big things happening in Saudi Arabia. They're doing a new Islam. It's going to be bigger. It's going to be the biggest in the world actually. It's going to be one of the greatest things, better than ever.
Starting point is 01:06:06 I see that as the only possible outcome, which is we're going to get cyber Islam. Hadith, I don't like Hadith. I like stuff God actually said. So, like, Houellebecq was right, but he just didn't include enough, like, robots. Yeah, that's alright. The ultimate impression of a human conversation
Starting point is 01:06:24 is that the AI is relating stationary collections of facts into dynamic concepts. Again, no, no, that's to you. You think that. Even though the model is incapable of... Because they're saying that other people will think that, but then they're saying, well, obviously this is how it's going to be soon, so we might as well act as though it is. Do you think Kissinger, like, asked ChatGPT when his birthday was
Starting point is 01:06:45 and the ChatGPT, like, obviously found out. He was like, holy shit. Oh my God, what are you getting me? How did you know what to get me? How did it know? He's Arnold Schwarzenegger. I'm coming right now. It must be God.
Starting point is 01:07:01 I'm Henry Kissinger. To me, starting a war is as satisfying as coming. So it says, even though the model is incapable of understanding in the human sense... but that implies that it's capable of understanding. Yeah, that's a beautiful sort of elision, right? Like slipping from one thing to the other, because there's no understanding happening. There's no cognition there. As to where, you know, cognition arises, that's a whole different thing.
Starting point is 01:07:23 But I'll tell you where it's not: it's not in the fucking chatbot, right? It's not there. You are deluding yourself if you think that this thing is understanding anything, instead of plotting out points on a graph that correspond to which words it's, like, detected in a shitload of Google Books and 4chan posts (a toy version of which is sketched below). I was reading about this thing called the Chinese Room, and now I understand that the ChatGPT has some very smart Chinese guys inside there, and they understand everything. They are masterminding the end of the world.
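(Purely as illustration, and our own toy rather than anything from the article or the episode: here is a minimal Python sketch of the "plotting out points on a graph that correspond to which words it's detected" idea. The corpus is made-up filler standing in for the shitload of Google Books and 4chan posts, and a real model fits a neural network over billions of documents rather than a tally table, but the objective, predicting the next word from what came before, has the same shape.)

    from collections import Counter, defaultdict

    # Made-up stand-in for the training data.
    corpus = "the model predicts the next word and the next word follows the model".split()

    # Tally which word follows which; these tallies are the "points on the graph".
    followers = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        followers[prev][nxt] += 1

    def predict(word):
        # No understanding anywhere in here: just return the most often observed follower.
        return followers[word].most_common(1)[0][0]

    print(predict("next"))  # prints "word", because "word" always followed "next" above

Scale the tallies up by a few hundred billion words and soften the counts into probabilities, and you have the rough shape of the thing being worshipped.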
Starting point is 01:07:52 So I think this is also where we say, right: throughout the article, they acknowledge that this is communication that is rational without being reasonable, that this is a new kind of intelligence, but they keep on sort of trying to deflect that blow at the same time, saying that it is doing a kind of understanding, which means that those two things are in contradiction. You can't have both.
Starting point is 01:08:20 You can't say it's rational without being reasonable, and also that it's understanding. It is completely impossible and intellectually lazy. It says, its outputs reflect an underlying essence of human language. In the same way that if you take a complete hamburger and you put it in a blender, that reflects an underlying essence of the total hamburger, right?
Starting point is 01:08:46 Imagine a burger. Yes. I'm legitimately trying to get through to the listener now. This is not an underlying essence of human language. This is human language, graphed and, like, modeled and then predicted, like, fucking exponentiated out, right? There's nothing underlying there, for fuck's sake. It's like drawing a guy and going,
Starting point is 01:09:10 Holy shit, that looks like a guy. Maybe he's real. Yeah, literally. No political or philosophical leadership has formed to explain and guide this novel relationship between man and machine, leaving society relatively unmoored. To which I again ask the question: hey, how come all of those societies are so relatively unmoored, Henry Kissinger and Eric Schmidt?
Starting point is 01:09:30 Oh, that was because of the previous God. That was because of the line. Now we've got this new locker. You're alright. We've got no beef with you. But Kissinger and the other guy, you're on notice. You know, it's like, well, yeah. Okay.
Starting point is 01:09:44 They're saying that this new revelatory epistemology will lead to social dislocation. But how come the previous one, the human-centered one that we kept on trying to do, fucked up so badly? It could have nothing to do, of course, with the people writing this article. Yeah. I wouldn't worry about it. And we should put them in charge of being sort of prophet, seer, and revelator of whatever the next thing is.
Starting point is 01:10:09 The truth of generative AI will need to be justified by entirely different methods to Enlightenment science, which was trusted because each step of its replicable experimental processes was also tested and therefore trustable, as we attempt to catch our understanding up to our knowledge. There's a beautiful little attempt to sort of murder the scientific method with a couple of quick stabs there, which is to be like: no, no, don't try and apply this to this.
Starting point is 01:10:30 It's too hard, so we shouldn't try. Yeah. No, no, you're never going to understand it, because one person can't understand it. Again, there are whole fields of science that one person doesn't understand, but they're understood by humans working together. Well, you think Socrates could have understood going on the computer?
Starting point is 01:10:47 Also, not to be too reductive about this, right? But these large language models, they were built and trained by humans. They were designed by humans. These are things with which, you know, human labor and the human intellect are deeply connected. And like, I know that these people's special move is abstracting the labor from things, they've been doing it their whole lives, right? But it doesn't work that easily.
Starting point is 01:11:14 This thing isn't just magic: just because the model now runs by itself doesn't mean that the data involved in training it, and the design involved in creating the architecture of it, don't matter anymore, right? And this is a whole area of philosophical thought that they completely refuse to engage with, right? Because this is quite a famous argument in the philosophy of mind, which is that you can't build something that's as smart as you are.
Starting point is 01:11:42 Because you can't build up to the limits of your own intelligence, because you have to be a bit smarter than a thing to make it. This is a popular philosophical theory, and it's borne out by AI, because AI all sucks. It's really bad. It's nowhere near as smart as we are, because we made it. And so the idea that AI could then make a smarter AI
Starting point is 01:12:04 is even more incorrect. It's like, well, AI is dumber than we are, so the AI that it would make would be even dumber than it is. Wait, isn't this just the plot of Multiplicity? I don't know what that is. So the other thing is, like, Windows 95 did not build Windows 98. It did not. That did not happen.
Starting point is 01:12:24 And again, this is another place where they elide what actually exists with what they think will exist at some point in the future. And again, there's this premise that what exists now necessarily implies what will exist in the future, like, that large language models necessarily imply the future existence of artificial general intelligence. I think that's probably far from certain. It's at least far enough from certain that I'm not ready
Starting point is 01:12:50 to have a priestly class monopolize knowledge creation again. I don't know. Call me old-fashioned. Well, you're old-fashioned enough. Yeah, exactly. It says: what about the machine is not yet being revealed to us, and what obscure knowledge is it hiding? It's a fucking Google Doc.
Starting point is 01:13:12 It's a fancy Google Doc. You put stuff in it and then it keeps hitting the predict-the-next-word button (something like the toy loop sketched below). And so it's the fucking viral tweets where someone goes, oh, start with "I think women are" and then let your predictive text fill in the rest, but on a large scale, for search engine optimization. Can we please take the torch out from under our chin? It's too late. You know, we've found this weird thing. And now we're pretty determined that it's a good idea to worship it like a God.
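(Same caveat as the sketch above, a toy of our own and not the article's: the "keeps hitting the predict-the-next-word button" loop really is just a loop. The NEXT table here is hypothetical filler standing in for whatever the model would guess.)

    # Hypothetical next-word table standing in for the model's guesses.
    NEXT = {"what": "obscure", "obscure": "knowledge", "knowledge": "is",
            "is": "it", "it": "hiding"}

    def keep_hitting_the_button(prompt, presses):
        words = prompt.split()
        for _ in range(presses):
            # Press the "predict the next word" button again and append the result.
            words.append(NEXT.get(words[-1], "..."))
        return " ".join(words)

    print(keep_hitting_the_button("what", 5))  # prints "what obscure knowledge is it hiding"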
Starting point is 01:13:37 And now we're pretty determined that it's a good idea to worship it like a God. I think this is also just like the fact that it's and again, like there are lots of parallels to the whole sort of like the kind of crypto blockchain thing that, you know, see people these guys seemingly just have kind of forgotten about or like pretended to have forgotten about. But it's very much the thing of just like, oh, we kind of this thing has been invented and like those those got one point five. It exists and like, you know, it has lots of pictures of haircuts
Starting point is 01:14:08 when I ask for haircuts, and many of them are not fades. And therefore it must be a lot smarter than everything else, and we have to valorize it, and we have to give as much money and resources as possible to the people who are making it, and not question it. And I think ultimately that's what it comes down to. Whether they're advocating the creation of a clergy
Starting point is 01:14:35 or an ulama or not, the impression that I got was: you should not question the utility and the place of the AI, you should just let it happen, because ultimately it's going to happen anyway, and it's going to establish itself on its own. And when I was reading this, I felt like I was going insane in certain areas, because it felt very much like what they were trying to say was
Starting point is 01:15:05 this is going to happen whether you like it or not, and you can't stop it. And we don't even like it. And we actually don't like it either. So we have to reluctantly say that our mates have to be the clergy, et cetera. And also, you can't question it, because would you question God? No, you wouldn't. You'd feel very guilty, like a bad boy, and, you know, you'd be very sorry about that.
Starting point is 01:15:29 You know, ultimately, I think it's just sort of standard tech bullshit in a lot of ways. And I see this in other corners of the sort of ChatGPT enthusiast world. And again, it mirrors a lot of the blockchain stuff, because a lot of the blockchain advocates were like, well, this is going to happen anyway, and if you don't invest in it, then, like, you know,
Starting point is 01:15:48 have fun staying poor, and all that stuff. And, you know, it's not to say that the AI will see the same fate as crypto and blockchain, for reasons that we've covered on the show. But to me, it feels like the best way to describe this is as PR, and ultimately what it's trying to say is, you know, we want this to happen.
Starting point is 01:16:13 We would very much like our friends to be in control of it. And if you say anything else, then you are committing, like, sins on a moral level. The piousness of this is sort of curious to me as well, because I did wonder, as they're writing it, how much is cynical power-seeking and how much is legitimate, like, I am scared by the Etch-a-Sketch.
Starting point is 01:16:43 And I think they genuinely are scared by the Etch-a-Sketch. I think they are afraid of this thing. And, again, it's an ideology seeking stuff to surrender to. But imagine if this was applied to anything else. Imagine if fucking aliens landed on the South Lawn of the White House during, say, the next administration, and they had sent Henry Kissinger out to talk to them. This is exactly what he would have said:
Starting point is 01:17:12 please don't harass us, first of all. Second of all, you should leave us in charge of everything. Well, I'd actually like to move to the political implications they talk about. We can sum up all of their political discussion with a couple of sentences, and this is what they write: The question remains, can our leaders learn quickly enough to challenge rather than simply obey?
Starting point is 01:17:36 Or are we in the end obliged to submit? Are what we consider mistakes part of the deliberate design? What if an element of malice emerges in the AI? Oh my God, these people have no epistemology, they have none. Like, okay, I'm not going to say that nothing interesting comes out of AI. It recombines existing stuff, and sometimes, in the course of doing that, it presents a new or interesting thing, right?
Starting point is 01:18:01 But it depends on the user, the person, the human at the end of it, to determine which of those is useful and which of those is garbage, and to be able to interpret that and say, that person's got too many fingers on their hands, or that molecule you've just tried to synthesize won't work. And we're just, again, writing that labor completely out of the equation here.
Starting point is 01:18:24 And they go on, they say, the potential for group obedience to an authority whose reasoning is largely inaccessible to its subjects has been seen from time to time in the history of man, perhaps the most dramatically and recently in the 20th century subjugation of whole masses of humanity under the slogan of ideologies on both sides of the political spectrum, and then they write my favorite sentence in the article,
Starting point is 01:18:43 perhaps a third way of knowing the world may yet emerge, one that is neither human reason nor faith. What becomes of democracy in such a world? To which I say: you have invented a third way, ChatTB. If Tony Blair were not made, it would be necessary to invent him. And, you know, again, this comes back to: if the entire foundations of your society
Starting point is 01:19:09 have been shaken by the autocomplete, the thing that actually exists as opposed to the thing that you're positing might exist in the future implied by the autocomplete, then the people who are going to believe that, whether you're common people committing the sins of misinformation, or you're an elite who's decided to have an Avignon AI papacy because maybe you have some disagreement on some interpretation of who should be where or whatever, right?
Starting point is 01:19:35 That is the capstone on a lot of other social decay and a lot of other unrepresented conflict. That's a lot of contradictions building up, and then you're just looking at the AI and deciding, because it happened most recently, because it was proximate, because it's the thing everyone's talking about, that it was the real cause of the social breakdown that you were imagining.
Starting point is 01:19:56 If the video of Joe Biden talking about the Bhutan ditchweed leads to social breakdown, then perhaps your social fabric was not so strong, for reasons that are not related to the graph. It couldn't possibly be related to the people who wrote this fucking thing. Don't talk to me about breakdowns in society when you've been breaking it down on purpose for, like, 250 years, or however fucking long you've been alive,
Starting point is 01:20:21 you old, dead bitch Henry Kissinger. Well, also on the Henry Kissinger point, he is someone who could reasonably be considered to be kind of a master of the universe in terms of the fate of the world in the 20th century, and he's now kind of feeling what it would be like to live in the world he created, where all of your ability to decide what you're going to do,
Starting point is 01:20:43 how you're going to live, especially if you're like, I don't know, a Central American farmer or whatever, is taken away from you by systems that are deliberately made difficult for you to access or understand and where judgments of what to do are sort of, let's say, handed down from far away authorities or something that doesn't feel so good when the boot's on the other foot, does it, Henry?
Starting point is 01:21:06 What is it you said to me? Henry Kissinger is trapped in a room with a copy of Capitalist Realism for five minutes and then just decides that we need a new god. Yeah, we have imprisoned Henry Kissinger inside an invisible web that constrains his action, and it's like Google autocomplete. You know, and again, like this,
Starting point is 01:21:29 this is what I go back to as well: neoliberalism is the process of surrendering to automatic scripts, right? And those scripts, we deny that they exist; you know, we remember a lot of people saying, oh, neoliberalism, it's not a real thing. Also, these scripts are things you cannot question,
Starting point is 01:21:47 there's no alternative to them. We've been governed by AIs for 40 years; it's just that they're AIs in human bodies. And it speaks to... Metal Gear Solid 2 real, Arsenal Gear real, Hideo Kojima is a genius. Yeah, and I think this process, right, this process of turning everything into an AI
Starting point is 01:22:09 has already happened within the realm of elites, you know, and this is one of the reasons why they're so scared as a ruling class, because this speaks to their ultimate desire to continue surrendering stuff on our behalf to forces that they say are beyond their control. There is one thing that I want to say about this that is really funny, which is, put it in perspective, right?
Starting point is 01:22:32 Think about what these people have done with their power over the time that they've had, and the time that we as a species have left. Given what we've done to, for instance, the climate, or any number of other things, right, worrying about Skynet is sort of like: these people have driven the car that we're all in off the cliff,
Starting point is 01:22:53 and about halfway down, they're like, oh, shit, what if the turn signal starts controlling my mind? Yeah, yeah, maybe, I guess, but there are kind of some other problems you already caused that are about to come due any second now. We could describe it as moot. Yes, yeah. It's the process of what happens when,
Starting point is 01:23:14 just like in Britain, our elite got too insular and weird, and now you can't get tomatoes. Well, the solution for this is for us to get more insular and weird by all becoming monks, like AI monks. Like, oh, you're looking at the world and going, I'm going to need to get a tonsure. I'm going to need to call up the fucking boys, and we're all going to go and get tonsures,
Starting point is 01:23:37 and we're going to tell that mischievous Turk: no fades, we want tonsures. Google what a tonsure is and give me that. Tonsuring your ass on Pius Twitter. That's right. With all that being said, I think it's probably time to cut it out for the week and hang up our podcasting tonsures.
Starting point is 01:23:57 Cut it out. Yeah, that's right. We're all taking off the cassocks that we wear to podcast. One might be legitimately angry. I don't religiously object to much, but I religiously object to the big computer autocorrect telling you what to do. That's right.
Starting point is 01:24:15 And to remind everyone of what we said at the beginning about the shirts. Yeah, there are shirts; you can buy them. Now, if you're in Perth. If you're in Perth, Australia, on the 25th of March, please come to the show that I am doing for your benefit, not for mine, because people said, hey, come to Perth. And I said, I will come to Perth,
Starting point is 01:24:34 and I spent 1200 Australian dollars on diverting via Perth, and, oh boy, do we need to sell those tickets. Also, 9th of March, earlier than that: Berlin, stand-up. Come along. Tickets now selling somewhat better than they were previously. Some of you have listened. Some of you have listened, but not enough of you.
Starting point is 01:24:53 A lot of you are very excited that Lydia Tár was the guest. You were like, yeah. Yeah, let's do it. She'll be conducting that. And one, and two, and Fritzl. Yeah, that will be... God, please never be a conductor. And also, our theme song is Here We Go by Jinsang.
Starting point is 01:25:12 It is. You can find it on Spotify. There also is a Patreon. There's also a Twitch stream. Every week. Every week. There's a Twitch stream that we do now. If you're listening to this on the Patreon
Starting point is 01:25:22 and maybe it's released on Monday, tonight, Milo, myself, Devin and Alice will be watching the Russian Harry Potter movie Children vs. Wizards. It's like an anti-Harry Potter movie, I believe, from like a right-wing Orthodox church standpoint. I'm so excited for this. Twitch.tv.
Starting point is 01:25:40 It's Mondays and Thursdays from 9 p.m. British time. We're finally remembering after... I hate wizards, they're gay. We are finally remembering to advertise the spinoff on the thing it's a spinoff from. Someone bought us a URL, too. If you go to slop.delivery, which is the best website anyone's ever had,
Starting point is 01:26:00 that will also take you to the Twitch stream. Why are all wizards wear dress? Why? The film will hopefully explain. Yeah, perfect. All right, all right. That's enough rock and rollin' for the end of the show. Thank you for listening.
Starting point is 01:26:19 Don't forget to subscribe to the Patreon. And we'll see you in a couple of days on the bonus episode. It's actually called the 8-tree-on. The We Are 8-tree-on. And soon in Berlin. Bye, everyone. Bye. See you next time.
