Behind the Bastards - CZM Rewind: The Zizians: How Harry Potter Fanfic Inspired a Death Cult & The Zizians: Birth of a Cult Leader

Episode Date: December 30, 2025

Part One: Earlier this year a Border Patrol officer was killed in a shoot-out with people who have been described as members of a trans vegan AI death cult. But who are the Zizians, really? Robert sits down with David Gborie to trace their development, from part of the Bay Area Rationalist subculture to killers.

Part Two: Robert tells David Gborie about the early life of Ziz LaSota, a bright young girl from Alaska who came to the Bay Area with dreams of saving the cosmos or destroying it, all based on her obsession with Rationalist blogs and fanfic.

Sources:
https://medium.com/@sefashapiro/a-community-warning-about-ziz-76c100180509
https://web.archive.org/web/20230201130318/https://sinceriously.fyi/rationalist-fleet/
https://knowyourmeme.com/memes/infohazard
https://web.archive.org/web/20230201130316/https://sinceriously.fyi/net-negative/
Wayback Machine
The Zizians
Spectral Sight
True Hero Contract
Schelling Orders – Sinceriously
Glossary – Sinceriously
https://web.archive.org/web/20230201130330/https://sinceriously.fyi/my-journey-to-the-dark-side/
https://web.archive.org/web/20230201130302/https://sinceriously.fyi/glossary/#zentraidon
https://web.archive.org/web/20230201130259/https://sinceriously.fyi/vampires-and-more-undeath/
https://x.com/orellanin?s=21&t=F-n6cTZFsKgvr1yQ7oHXRg
https://zizians.info/
according to The Boston Globe
Inside the ‘Zizians’: How a cultish crew of radical vegans became linked to killings across the United States | The Independent
Silicon Valley ‘Rationalists’ Linked to 6 Deaths
The Delirious, Violent, Impossible True Story of the Zizians | WIRED
Good Group and Pasek’s Doom – Sinceriously
Mana – Sinceriously
Effective Altruism’s Problems Go Beyond Sam Bankman-Fried - Bloomberg
The Zizian Facts - Google Docs
Several free CFAR summer programs on rationality and AI safety - LessWrong 2.0 viewer
This guy thinks killing video game characters is immoral | Vox
Inadequate Equilibria: Where and How Civilizations Get Stuck
Eliezer Yudkowsky comments on On Terminal Goals and Virtue Ethics - LessWrong 2.0 viewer
SquirrelInHell: Happiness Is a Chore
PLUM OF DISCORD — I Became a Full-time Internet Pest and May Not...
Roko Harassment of PlumOfDiscord Composited – Sinceriously
Intersex Brains And Conceptual Warfare – Sinceriously
Infohazardous Glossary – Sinceriously
SquirrelInHell-Decision-Theory-and-Suicide.pdf - Google Drive
The Matrix is a System – Sinceriously
A community alert about Ziz. Police investigations, violence, and… | by SefaShapiro | Medium
PLUM OF DISCORD (Posts tagged cw-abuse)
Timeline: Violence surrounding the Zizians leading to Border Patrol agent shooting

See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 This is an I-Heart podcast, Guaranteed Human. A new true crime podcast from Tenderfoot TV in the city of Mons in Belgium, women began to go missing. It was only after their dismembered remains began turning up in various places that residents realized. A sadistic serial killer was lurking among them. The murders have never been solved. Three decades later, we've unearthed new evidence. Le Monstre, Season 2, is available now. Listen for free.
Starting point is 00:00:30 On the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. And she said, Johnny, the kids didn't come home last night. Along the central Texas plains, teens are dying. Suicides that don't make sense. Strange accidents and brutal murders. In what seems to be a plot ripped straight out of Breaking Bad. Drugs, alcohol, trafficking of people. There are people out there that absolutely know what happened.
Starting point is 00:01:00 Listen to Paper Ghosts, the Texas Teen Murders, on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. Hi, I'm Danny Shapiro. We were in the car, like a Rolling Stone came on, and he said, there's a line in there about your mother. And I said, what? What I would do if I didn't feel like I was being accepted is choose an identity that other people can't have. I knew something had happened to me in the middle of the night, but I couldn't hold on to what had happened. These are just a few of the moving and important stories on my 13th season of Family Secrets. Listen to Family Secrets on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 00:01:42 From NBA champion Stephen Curry comes Shot Ready, a powerful, never-before-seen look at the mindset that changed the game. I fell in love with the grind. You have to find joy in the work you do when no one else is around. Success is not an accident. I'm passing the ball to you. Let's go. Steph Curry redefined basketball. Now he's rewriting what it means to succeed.
Starting point is 00:02:05 Order your copy of the New York Times bestseller Shot Ready today at stephencurriebook.com. Cool Zone Media. Welcome back to Behind the Bastards. That's how this podcast would open if I was a game show host. But I'm not. Instead, I'm a guy who's... You would be good at it, though.
Starting point is 00:02:30 I don't think I would be, Sophie. I do, but I think... But I'm like biased because I think you'd be good at most things. No, my only marketable skill is spending 30 hours reading the deranged writings of a quasi-cult leader who was somewhat involved in the murders of multiple people very recently, largely because she read a piece of Harry Potter fan fiction at the wrong time. Oh, yes. We have a fun one for you this week, and by a fun one, we have a not at all fun one for you this week. And to have just a terrible time with me, we are bringing on a guest, the great David Gborie, co-host of My Mama Told Me, with our friend of the pod, Langston Kerman. David, how you doing?
Starting point is 00:03:21 Oh, man, I can not complain. How are you doing? There's nothing going on in the world. Everything is fun. Oh, yeah. Yeah, I got up today and read that great new article by Francis Fukuyama. History is still stopped, so everything's good. We're done.
Starting point is 00:03:38 I haven't looked at any news yet purposefully, so I'm, you know, it could be awesome. It could be going great out there. It's great. The whole Trump administration got together and said, psych, it was all a bit. Oh, man. Just an extended ad for The Apprentice Season 15. You mean this country is not a business? No. They handed over the presidency to, I don't know, I don't know,
Starting point is 00:04:03 so whoever you personally at home think would be a great president. I'm not going to jump into that can of worms right now. LeBron, Ramon James. They made LeBron the president. That's a good one. That's a good enough one. That's better than what we got. Honestly, vastly superior than where we are. Of all the entertainers, I feel like, why don't we start giving athletes a shot at government? Yeah, I, fuck it, why not? You know, uh, fucking, uh, Kareem Abdul-Jabbar, great president.
Starting point is 00:04:34 That would be a good one. That motherfucker could knock a presidency out of the park. Come on. Veronica Mars writer, Kareem Abdul-Jabbar. Absolutely. Yes. Legend. We need a mystery novelist slash one of the great basketball stars of all time in the White House.
Starting point is 00:04:50 I just want a president who's good in the paint. You know what I mean? That's right. That's right. Agatha Christie with a jump shot. Yeah, that's exactly what I think, that's what we need to get us
Starting point is 00:05:01 What an amazing man Kareem would be such would be such a good choice Yeah bring it on I think he's such a good man He wouldn't do it Yeah exactly he's way too moral I have a frog name after
Starting point is 00:05:14 named after him. Yeah. Look, honestly, given where we are right now, I'd take fucking, what's his name, Mark McGwire. Like, Jesus Christ,
Starting point is 00:05:24 anybody, like, honestly, anyone. I'd take Jose Canseco, I mean, Jose Canseco in a heartbeat. So funny. Oh man, fuck it, like, I'll take, uh, no, no, I'm not gonna take any hockey players. No hockey players. No hockey players. We got enough people with brain damage in the White House right now. That's probably fair. And we don't need somebody who likes to fist fight that much. Yeah, yeah, yeah, yeah, you're probably right there. I mean, if, if we could go back in time and make Joe Louis the president.
Starting point is 00:05:57 He could get stuff done. So this is a bit of a fun digression, but I got to ask at the start of this, the story that is most relevant to the people we're talking about today that I think most of our listeners will have heard. I'm curious if you've heard about back on January 21st, right as the Trump administration took power, a Border Patrol agent was shot and killed, along with another individual, at a traffic stop in Coventry, Vermont, right?
Starting point is 00:06:30 There were two people in a car. It was pulled over by Border Patrol. One of those people drew a gun. There was a firefight. One of the people in the car and the cop died, right? Okay. Have you heard this story? I have not.
Starting point is 00:06:43 I'm not familiar with this at all. It's one of those things where it would have been a much bigger deal, obviously, immigration being the thing that it is, right, in the United States, like the political hot issue that it is, right? Now, like, the Republicans have been desperately waiting for, like, a border patrol officer getting shot and wounded that they can, like, use to justify a crackdown. But number one, this happened on the Canadian border, not really their favorite border. And one of the two people who drew their guns on the cops was an immigrant, but they
Starting point is 00:07:13 were a German immigrant. And so none of this really like, it was all like right on the edge of being super useful to the rights, but it's not as sexy like. I live in Denver. We were dealing with our own right-wing immigration propaganda at that time. Yeah. Yeah, it was just like, it was like the closest to being a perfect right-wing, like a Reichstag fire event, but like just a little too weird. Yeah, you got to throw some spice in there.
Starting point is 00:07:42 You got to have like a Latin country. That's what they get excited about. Yeah, and obviously California border is where you want it, you know. Yeah, definitely, definitely. Even New Mexico could be. Yeah, or at least at least they need to have. have fentanyl on the car. In fact, they were not breaking any laws that anyone could prove at the time.
Starting point is 00:07:58 They just looked kind of weird. Okay. They looked kind of weird and they, like, had guns, but it was like, they had, like, two handguns and like 40 rounds and some old targets. They were, like, coming back from a shooting range, right? Like, not a lot of guns and ammo in America terms, right? Right. Especially in Vermont terms.
Starting point is 00:08:17 Right, right. So the other thing that was weird about this is that the German immigrant who died was a trans woman. So then, again, we get back to like, wow, there's a lot about this shooting that is like right on the edge of some issues that the right is really trying to use as like a fulcrum to like push through some awful shit. And as more and more information came out about the shooting, the weirder it seemed, because there was a lot of initial talk, is this like a terrorist attack?
Starting point is 00:08:46 Were these like two Antifa types who were like looking to murder a border patrol agent? But no, that doesn't really make sense because, like, they got pulled over. Like, they can't have been planning this, right? Like, it didn't really seem like that. And really, no one could figure out why they had opened fire. But as the days went on, more information started coming out, not just about the two people who were arrested in this, well, the one person who was arrested and the one person who died, but about a group of people around the country that they were linked to.
Starting point is 00:09:19 And these other people were not all, but mostly, trans women. They were mostly people who kind of identified as both anarchists and members of the rationalist subculture, which we'll talk about in a little bit. And they were all super high achieving people in like the tech industry and like sciences, right? These were like people who had won like awards and had advanced degrees. The person, the lady who died in the shooting was a quant trader. So these are not like the normal shoot it out with the cops types.
Starting point is 00:09:55 Yeah, this is a very niche group. This is a very strange story. So people start being like, oh, the fuck is happening. That's a group of people who could not meet each other without the invention of the internet. Right. That is boy, David. Do you know where this story's going or at least starting? So it's a couple of these days into this when a friend of mine messaged me and it's like,
Starting point is 00:10:18 hey, you know that shooting in Vermont? Yeah, and he's like, my friend is like, you know, there's Zizians. And I was like, wait, what the fuck? Because I had heard of these people. This is a weird little subculture. I'm always, I'm like, you know, I study weird little internet subcultures in part because like some of them do turn out to do acts of terrorism later. So I was in my eyes on the weirdos.
Starting point is 00:10:41 And I've been, I've been reporting on the rationalists who are not like a cult, but who do some cult adjacent things. And I just kind of find annoying. And I'd heard about this offshoot of the rationalists called the Zizians. They were very weird. There were some like weird crime allegations. A couple of them had been involved in a murder in California a year earlier. But like it was not a group that I ever really expected to see blow up in the media.
Starting point is 00:11:07 And then suddenly they fucking did. Right. And they're called the Zizians. It's not a name they have for themselves. They don't consider themselves a cult. They don't all like live. So a group of them did live together. Like, these people are pretty geographically, like, dispersed around the country.
Starting point is 00:11:24 They're folks who met online arguing about and discussing rationalism and the ideas of a particular member of that community who goes by the name Ziz, right? That's where this group came out of. And the regular media was not well equipped to understand what was going on. And I want to run through a couple of representative headlines that I came across just in, like, looking at mainstream articles about what had happened. There's an article from the Independent, they title Inside the Zizians, how a cultish crew of radical vegans became linked to killings across the United States.
Starting point is 00:12:03 They seemed like just another band of anarchist misfits scraping on the fringes of Silicon Valley until the deaths began. And then there's a KCRW article, Zizians, the vegan techie cult, tied to murders across the U.S. And then a Fox article, trans-vegan cult charged with six murders. There you go, classic Fox style. Yes. None of these titles are very accurate.
Starting point is 00:12:30 In that, I guess the first one is like the closest where like these people are radical vegans and they are cultish, right? So I'll give the independent that. Vegan techie cult is not really what I would describe them. Like some of them were in the tech industry, but like the, the degree to which they're in the tech industry is a lot weirder than that gets across. And they're not really a trans, they're like trans vegans, but the cult is not about being a trans vegan.
Starting point is 00:13:00 That's just kind of how these people found each other. Oh, they just happened to be. That was just the common ground. Veganism is tied to it. They just kind of all happen to be trans. That's not really like tied to it necessarily. So I would argue also that they're not terrorists. which a lot of people have a number of the other articles called them.
Starting point is 00:13:22 which a number of the other articles called them. None of the killings that they were involved with, and they did kill people, were like terrorist killings. They're all much weirder than that, but none of them are like, none of the killings I have seen are for a clear political purpose, right? Which is kind of crucial for it to be terrorism. The murders kind of evolved out of a much, much sillier reason. And it's, you know, there's one really good article about them by a fellow at Wired who, you know, spent a year or so kind of studying these people. And that article does a lot that's good, but it doesn't go into as much detail about what I think is the real underpinning of why this group of people got together and convinced
Starting point is 00:14:04 themselves it was okay to commit several murders. And I think that that all comes down more than any other single factor to rationalism and to their belief in this weird online cult that's very much based on, like, asking different sort of logic questions and trying to, like, pin down the secret rules of the universe by doing, like, game theory arguments on the internet over blogs, right? Like, that's really how all of this stuff started. So they have, like, someone named Mystery.
Starting point is 00:14:38 Yeah. There's a lot of people in funny hats. They do actually, they're a little adjacent to this. And they come out of that period of time, right? where like pick up artist culture is also like forming they're one of this like generation of cults that starts with a bunch of blogs and shit on the internet in like 2009 right um and this this is it's so weird because we we use the term cult and that's the easiest thing to call these people but generally when our society is talking about a cult we're talking about like
Starting point is 00:15:10 you have an individual, that individual brings in a bunch of followers, gets them isolated from society, puts them into an area where they are in complete control, and then tries to utilize them for like a really specific goal. There's like a way to kind of look at the Zizians that way, but I think it would be better to describe them as like cultish, right? Okay. They use the tools of cult dynamics and that produces some very cult-like behavior, but there's also a lot of differences
Starting point is 00:15:46 between, like, how this group works and what you'd call a traditional cult, including a lot of these people are separate from each other and even don't like each other, but because they've been inculcated in some of the same beliefs, through these kind of cult dynamics, they make choices that lead them to, like, participate in violence, too. Where is their hub? Is it, like, a message board type of situation? Like, how is it? Yes.
Starting point is 00:16:09 Yes. So I'm going to, I'm going to have to go back and forth to explain all of that. Also, I do want to know the etymology of Zizian. Is it because, from zits? No, no, Z-I-Z. The lady who is kind of the founder of this is, the name that she takes for herself is Ziz, right? Okay. Should have been the Z-Girls.
Starting point is 00:16:31 That's much more appealing. These people are big in the news right now. Because of that murder. Because of the several murders and the right wing wants to make it out as like this is a trans death cult. and this is more of like a internet AI nerd death cult. I guess that's better? It's just different, you know? You're right.
Starting point is 00:16:59 It was just a different thing. And I think it's important. If you care about like cults because you think they're dangerous and you're arguing that like, hey, this cult seems really dangerous, you should understand like what the cult is, right? Right. Like if you misunderstood the Scientologists and thought like these are obsessive fans of science fiction who are committing murders over science fiction stories. It's like, no, no, they're committing murders because this is something stupider.
Starting point is 00:17:21 Yeah. Much, much, much stupider. Okay, so, I am going to explain to you what rationalism is, who Ziz is, where they come from, and how they get radicalized to the point where they are effectively at the hub of something that is at least very adjacent to a cult. But I want to talk a little bit about the difference between like a cult and cult dynamics, right? A cult is fundamentally a toxic thing. It is bad.
Starting point is 00:17:54 It always harms people. There is no harmless cult, you know. It's like rape. Like there's no version of it that's good, you know? Like it is a fundamentally dangerous thing. Cult dynamics and the tactics cult leaders use are not always toxic or bad. And in fact, every single person listening to this has enjoyed and had their life enriched
Starting point is 00:18:19 by the use of certain things that are on the spectrum of cult dynamics. I was going to say it seems a lot more. Like, you have that at work. You have that at work anywhere, right? Yeah, anyway, that's a huge part of what make a great fiction author who is able to, like, attract a cult following. You've ever had that experience? Like, a big thing in cults is the use of and creation of new language.
Starting point is 00:18:41 you get people using words that they don't use otherwise and like phrases and that is both a way to bond people because like, you know, it helps you feel like you're part of this group and it also isolates you from people. If you've ever met people who are like hugely into, you know, Dungeons and Dragons or huge fans like Harry Potter or the Lord of the Rings, like they have like things that they say like memes and shit that they share based on those books. And, like, that's a less toxic, but it's on the same spectrum, right? It's this, I am a part of this group of people and we use these words that mean something to us that don't mean things to other people, right? And that's like an empowering feeling, right? Yes, yes, yeah. That's like a great way to bond. I think it's any group, right?
Starting point is 00:19:27 I mean, we say entertainers. Your friend group has in jokes, right? Sports. Yeah. The Beyhive could kill people, right? Exactly. Yes, yes. And, like, you've got, you know, you and your buddies that have been friends for years, you have, like, you could, there's like a word you can say and everyone knows that you're referring to this thing that happened six years ago.
Starting point is 00:19:46 And you all, like, laugh because, you know, it reminds you of, you know, because it's relevant to something happening then, that's a little healthy bit of cult dynamics at play, right? You know, it's like a diet, you know, so there's a toolbox here and, and we play with it. And different organizations, but churches play with it. And obviously a lot of churches cross the line into cults. But there's also aspects of, for example, you know, there's churches that I know I have seen people go to where like it's very common. Everybody gets up and like hugs at a certain point. And like people benefit from human contact.
Starting point is 00:20:24 It makes them feel nice. It can be like a very healthy thing. I've gone to, I used to go to like Burning Man regionals. like you would like start at this greeter station where like a bunch of people would come up and they'd offer you like food and drinks and you know people would hug each other and it was this like changes your mind state from where you were in before kind of opens you up that in those contexts is that like to qualify for state yeah yeah yeah yeah so that we could get to go it's just like these local little events in Texas right like a thousand people in the desert
Starting point is 00:20:55 trying to forget that we live in Texas, okay. Um, or not desert, but, um, it was very, like... It was like a really valuable part of, like, my youth, because it was the first time I ever started to, like, feel comfortable in my own skin. But also, that's on the spectrum of love bombing, which is the thing cults do, where they, like, surround you with people who, like, talk about, like, you know, we'll touch you and hold you and tell you they love you. And, like, you know, part of what brings you into the cult is the cult leader can take that away at any moment in time, right? It's the kind of thing where, if it's not something where, no, this is something we do for five minutes at the end of every church service, right, you can very easily turn this into something deeply dangerous and poisonous, right? But also a lot of people just kind of play around a little bit with pieces of that, a piece of the cult dynamics, just a little bit. Any good musician, any really great performer, is fucking with some cult dynamics, right? I was gonna say, I mean, I've been to, like, so many different concerts of, like, weird niche stuff where you're like,
Starting point is 00:22:01 Maybe the Disco Biscuits is a cult. I don't know. Yeah. I mean, I've, like, I've been to some Childish Gambino concerts where it's like, oh, yeah, he's doing, he's a little bit of a cult leader, you know? Like, like just 10%, right? I mean, what are you going to do with all that charisma? You got to put it somewhere, you know? Yeah. So, these are, I think that it's important for people to understand both that, like,
Starting point is 00:22:21 the tactics and dynamics that make up a cult have versions of them that are not unhealthy. But I also think it's important for people to understand: cults come out of subcultures, right? This is very close to 100% of the time. Cults always arise out of subcultural movements that are not in and of themselves cults. For example, in the 1930s, through like the 50s, 60s,
Starting point is 00:22:49 you have the emergence of what's called the self-help movement, you know? And this is all of these different books on like, how to persuade people, how to, you know, win friends and influence people, you know, how to like make, but also stuff like Alcoholics Anonymous, you know, how to like improve yourself by getting off drugs, getting off alcohol. All of these are pieces of the self-improvement movement, right? That's a subculture. There are people who travel around, who get obsessed, who go to all of these different
Starting point is 00:23:15 things, and they'll, and they get a lot of benefit, you know, people will show up at these seminars where there's hundreds of other people and a bunch of people will, like, hug them and they feel like they're part of this community. and they're making their lives better. And oftentimes, especially, like, once we get to, like, the 60s, these different sort of guru types are saying that, like, you know, this is how we're going to save the world. If we can get everybody doing, you know, this yoga routine or whatever that I put together
Starting point is 00:23:41 and it will fix everything. Who's that guy who had the game? Oh, God, yes. You know what I'm talking about? Yeah, yeah, yeah, yeah. And they had to, like, they had to viciously confront each other. Yes, we've covered them. That is Synanon.
Starting point is 00:23:54 Yes. So that's what I'm talking about. That's what I'm talking. You have this broader subculture of self-help and a cult, Synanon, comes out of it, you know? And I get it. It's like the subculture, it's already, it's intimate. You feel closer to those people. And, anybody else, it definitely feels ripe for manipulation.
Starting point is 00:24:12 And Scientology is a cult that comes out of the exact same subculture. We talked last week or two weeks ago about Tony Alamo, who was an incredibly abusive pedophile Christian cult leader. He comes out of, along with a couple of guys we've talked, the Jesus freak movement, which is a Christian subculture that arises as a reaction to the hippie movement. It's kind of the countervailing force to the hippie movement. So you got these hippies and you have these Christians who are like really scared of this kind of like weird left wing movement. And so they start kind of doing like a Christian hippie movement almost, right? And some of these people just start weird churches that sing annoying songs. and some of these people start hideously dangerous cults.
Starting point is 00:24:56 You have the subculture and you have cults that come out of it, right? And the same thing is true in every single period of time, right? Cults form out of subcultures, you know? And part of this is because people who, a lot of people who find themselves most drawn to subcultures, right, tend to be people who feel like they're missing something in the outside world, right? Right. You know, not every, but people who get most into it. And so does that mean like, so maybe like more, I'm just curious, like, more broader
Starting point is 00:25:30 cultural waves have never led to cults? There's like, the Swifties would not be a cult. No, there's most likely not going to be an offshoot of the Swifties that becomes a cult because it's so broad. It has to have already been kind of a smaller subset. That's interesting. Well, yeah, and I think, but that said, there have been cults that have started out of, like, like popular entertainers and musicians.
Starting point is 00:25:54 Like, you know, we could talk about Corey Feldman's weird house full of young women dressed as angels. Right? Like, you know?
Starting point is 00:26:05 So, yeah, you've got, as a general rule, like there are, music is full of subcultures, like punk, right?
Starting point is 00:26:13 But there have definitely also been some like punk communities that have gone and kind of individual little chunks of punk communities. It's kind of like culture. directions, right? Even if you're not like...
Starting point is 00:26:24 Strategies, huh? Yeah. Yes. Yeah. Fuck up. So there are cults that come out of the subculture. This is the way cults work. And I really just, I don't think, I don't think there's very good education on what
Starting point is 00:26:39 cults are, where they come from, or how they work. Because all of the people who run this country have like a lot of cult leader DNA in them, you know? Oh, buddy. Yeah. We're being run currently by someone who is seen as a magic man by too many. Cults all the way down. Yes, exactly, exactly. So I think there's a lot of vested interest in not explaining what a cult is and where they come from.
Starting point is 00:27:06 So I think it's important to understand subcultures birth cults, and also cult leaders are drawn to subcultures when they're trying to figure out how to make their cult, because in a subculture, you know, most of the people are just going to be like normal people who are just kind of into this thing. But there will always be a lot of people who are like, this is the only place I feel like I belong. I feel very isolated. This is like the center of my being, right? Right. And so it's just a, it's like a good place to recruit, you know? Those are the kind of people you want to reach if you're a cult leader.
Starting point is 00:27:40 You know, I'm not saying like, again, I'm not saying subcultures are bad. I'm saying that like some chunk of people in subcultures are ready to be in a cult, you know? Yeah. Yeah, I think I might reflect on my own personal life. Yeah, you meet a lot of guys who are just like, I'll die for the skate park or whatever thing. Yeah. Or like the Star Wars fans who were sending death threats to Jake Lloyd after the Phantom Menace where it's like, well, you guys are crazy. That is insane.
Starting point is 00:28:07 You know, he's like, eight, right? This is a movie. He also didn't write it. He didn't write it. Like, what are you doing? You know, whatever makes you feel a sense of home, I guess. So, and again, that's kind of a good point. Like, Star Wars fans aren't a cult, but you can also see some of, like, the toxic things cults do erupts from time to time and from like video game, right?
Starting point is 00:28:32 People who are willing to a certain video game, that's not a cult, but also periodically groups of those fans will act in ways that are violent and crazy. And it's because of some of these same factors going on, right? I think people forget fan is short for fanatic. Exactly. Exactly. Right. And it's like, you know, the events that I went to very consciously played with cult dynamics. You know, after you got out of the like greeting station thing where like all these people were kind of like love bombing you for like five minutes. There was like a big bar and it had like a sign above it that said not a religion do not worship. And it was this kind of people would talk about like this is like we are playing with the ingredients of a cult. We're not trying to actually make one. So you need to constantly remind people of like, what we're doing and why it affects their brain that way. And in my case, it was like, because I was at like a low point in my life then. Like, this was when I was really, it was 20. I was not.
Starting point is 00:29:28 I had no kind of drive in life. I was honestly dealing with a lot of like suicidal ideation. This is the point in which I would have been vulnerable to a cult. And it, I think it acted a little bit like a vaccine. Like I got a little dose of the drug. Right. It was enough to build up an immunity.
Starting point is 00:29:44 Exactly. And now you're like, hey, I know, I know what that is. I know what's going on there. So, anyway, I needed to get into this because the Zizians, this thing that I think is either a full-on cult or at least cultish, right, that is responsible for this series of murders that are currently dominating the news and being blamed on like a trans vegan death cult or whatever, they come out of a subculture that grows out of the early aughts internet known as the rationalists. The rationalists started out as a group in the early aughts on the comment sections of two blogs. One was called Less Wrong and one was called Overcoming Bias. Less Wrong was started by a dude named Eliezer Yudkowsky. I have talked about Eliezer on the show before.
Starting point is 00:30:32 He's not a... he sucks. I think he's a bad person. He's not a cult leader, but again, he's playing with some of these cult dynamics, and he plays with them in a way that I think is very reckless, right? And it ultimately leads to some serious issues. Now, Eliezer's whole thing is he considers himself the number one world expert on AI risk and ethics. Now, you might think from that, oh, so he's like making AIs, like working for one of these companies that's involved in, like, coding and stuff? Absolutely not. Oh, no affiliation.
Starting point is 00:31:10 No, no. Armchair quarterback, backseat driver. No, he writes long articles about what he thinks AI would do and what would make it dangerous that are based almost entirely off of short stories he read in the 1990s. Like, this guy. That's the most internet shit I've ever heard. It's so much, it's such internet. And like, I'm not a fan of like the quote unquote real AI.
Starting point is 00:31:36 But Yudkowsky is not even one of these guys who's like, no, I'm like making a machine that you talk to. Yeah, I have no credentials. I just have an opinion. Yeah. An outdated opinion. I hate this guy so much. Speaking of things I hate, now going to ads. I'm investigative journalist Melissa Jeltsen.
Starting point is 00:32:00 My new podcast, What Happened in Nashville, tells the story of an IVF clinic's catastrophic collapse and the patients who banded together in the chaos that followed. We have some breaking news to tell you about Tennessee's attorney, General is suing a Nashville doctor. In April 2024, a fertility clinic in Nashville shut down overnight and trapped behind locked doors were more than a thousand frozen embryos. I was terrified. Out of all of our journey, that was the worst moment ever.
Starting point is 00:32:31 At that point, it didn't occur to me what fight was going to come to follow. But this story isn't just about a few families' futures. It's about whether the promise of modern fertility care can be trusted. at all. It doesn't matter how much I fight. Doesn't matter how much I cry over all of this. It doesn't matter how much justice we get. None of it's going to get me pregnant. Listen to
Starting point is 00:32:53 what happened in Nashville on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. In 1997, in Belgium, 37 female body parts placed in 15 trash bags were found at dump
Starting point is 00:33:09 sites with evocative names like The Path of Worry, Dump Road and Fear Creek. Terrible discoveries of Saturday, investigators made a new discovery yesterday afternoon of the torso of a woman. Investigators believe it is the work of a serial killer. Despite a sprawling investigation,
Starting point is 00:33:27 including assistance from the American FBI, the murders have never been solved. Three decades later, we've unearthed new evidence and new suspects. We felt like we were in the presence of someone who was going to the grave with, with nightnourish secrets. From Tenderfoot TV and IHeart Podcasts,
Starting point is 00:33:46 this is Le Monstre Season 2, The Butcher of Mons, available now. Listen for free on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. Malcolm Gladwell here. This season on Revisionist History, we're going back to the spring of 1988, to a town in northwest Alabama,
Starting point is 00:34:08 where a man committed a crime that would spiral out of control. 35 years. That's how long Elizabeth Sennett's family waited for justice to occur. 35 long years. I want to figure out
Starting point is 00:34:25 why this case went on for as long as it did, why it took so many bizarre and unsettling turns along the way, and why, despite our best efforts to resolve suffering, we all too often make suffering worse. He would say to himself,
Starting point is 00:34:38 turn to the right to the victim's family and apologize, turn to the left, tell my family I love them. So he would have this little practice: to the right, I'm sorry. To the left, I love you. From Revisionist History, this is The Alabama Murders. Listen to Revisionist History, The Alabama Murders on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. Hi, Dr. Laurie Santos from the Happiness Lab here. It's the season of giving, which is why my podcast is partnering with Give Directly, a nonprofit that provides people in extreme poverty with the
Starting point is 00:35:10 cash they need. This year, we're taking part in the Pods Fight Poverty campaign. And it's not just the Happiness Lab. Some of my favorite podcasters are also taking part. Think Jay Shetty from On Purpose, Dan Harris from 10% Happier, and Dave Desteno from How God Works and more. Our goal this year is to raise $1 million, which will help over 700 families in Rwanda living in extreme poverty. Here's how it works. You donate to give directly, and they put that cash directly into the hands of families in need, because those families know best what they need, whether it's buying livestock to fertilize their farm, paying school fees, or starting a small business. With that support, families can invest in their future and build lasting change. So join me and your favorite
Starting point is 00:35:55 podcasters in the Pods Fight Poverty campaign. Head to give directly.org slash happiness lab to learn more and make a contribution. And if you're a first-time donor, giving multiplier will even match your gift. That's give directly.org slash happiness lab. Lab to donate. We're back. So Yudkowski, this AI risk and ethics guy, starts this blog in order to explore a series of thought experiments based in game theory. And his, I am annoyed by game theory guys.
Starting point is 00:36:30 That's the last sentence I've ever heard. Yeah, it sucks. But man, I know that there's like valid applications, but like it's all just always so stupid and annoying to me. Anyway, a bunch of thought experiments based in game theory with the goal of teaching himself and others to think more logically and effectively about the major problems of the world. His motto for the movement and himself is that the rationalist should win. Yeah, yeah.
Starting point is 00:37:00 That's where she got it. Yeah, that's where she picked it up. Yeah. All right. Good enough. He's, they're tied in with biohacking, right? This is kind of starting to be a thing at the time, and brain hacking and the whole, like, self-optimization movement that feeds into a lot of, like, right-wing influencer space today. Yudkowsky is all about optimizing your brain and your responses in order to allow you to accomplish things that are not possible for other people who haven't done that.
Starting point is 00:37:30 And there's a messianic air to this, too, which is he believes that only by doing this, by spreading rationalist principles in order to, quote, raise the sanity waterline. That's how he describes it. That's going to make it possible for us to save the world from the evil AI that will be born if enough of us don't spend time reading blogs. Okay, that's great. This is... It's awesome.
Starting point is 00:37:56 This is peak. It's the good stuff. Yudkowsky and his followers see themselves as something unique and special. And again, there's often a messianic air to this, right? We are literally the ones who can save the world from evil AI. Nobody else is thinking about this or is even capable of thinking about this because they're too illogical. He holds himself as kind of like a deity. He kind of deifies himself on top of this. He doesn't really deify himself, but he also does talk about himself like in a way that is,
Starting point is 00:38:30 clearly other people aren't capable of understanding all of the things that he's capable of understanding, right? Okay. So there is a little bit, it's more like superheroification, but it's a lot, you know what this is closest to? With these people, they would, all of them would argue with me about this, but I, I've read enough of their papers and enough Dianetics to know that, like, this is new Dianetics. Like, this is church.
Starting point is 00:38:55 Oh, okay. The church of Scientology stuff has more occult and weird, like, magic stuff in it. But this is all about there are activities and exercises you go through that will rid your body of like bad ingrained responses and that will make you a fundamentally more functional person. Okay. So the retraining of yourself in order to make a little bit. Okay. Huge deal. And also a lot of these guys wind up like referring to the different like techniques that he teaches as tech, which is exactly what the Scientologists call it.
Starting point is 00:39:33 Like, there's some shit I found that it's like, this could have come right out of a Scientology pamphlet. Do you guys not realize what you're doing? I think they do, actually. So he's, he's, you know, in the process of inventing this kind of new mental science that verges on superpowers. And it's one of those things like people don't tend to see these people as crazy. If you just sort of like read their arguments a little, it's like them going over old thought experiments and being like, so the most rational way to behave in this situation is this, for this reason. You have to really, like, dig deep into their conclusions to see how kind of nutty a lot of this is. Now, again, yeah, I compared
Starting point is 00:40:16 him to Scientology. Yudkowsky isn't a high-control guy like Hubbard. He's never going to make a bunch of people live on a flotilla of boats in the ocean with him, you know. He's got, like, there's definitely, like, some allegations of, uh, bad treatment of, like, some of the women around him. And, like, he has like a Bay Area set that hang with him. I don't think he's like a cult leader. You know, you could say he's on the spectrum. Is he drawing people to him physically or is this also all...?
Starting point is 00:40:42 Physically. I mean, a lot of people move to the Bay Area area to be closer to the rationalist scene. Although, again, all of these are a city. I'm a Bay Area guy. San Fran. San Fran. Oh,
Starting point is 00:40:52 in the city? This is a San Francisco thing because all of these are tech people. Oh, okay. So this is like a, yes. I wonder what neighborhood feels like a huge. San Fran and Oakland. You can look it up. People, people have figured.
Starting point is 00:41:03 found his house online, right? Like, it is known where he lives. I'm not saying that for any, like, I don't harass anybody. I just, like, it's not a secret, like, what part of the town this guy lives in. I just didn't think to look it up. But, like, yeah, this is like a Bay Area tech industry subculture, right? Okay. So the other difference between this and something like Scientology is that it's not just
Starting point is 00:41:27 Eliezer laying down the law. Eliezer writes a lot of blog posts, but he lets other people write blog posts too, and they all debate about them in the comments. And so the kind of religious canon of rationalism is not a one guy thing. It's come up with by this community. And so if you're some random kid in bum fuck Alaska and you find these people and start talking with them online, you can like wind up feeling like you're having an impact on the development of this new thought science, you know?
Starting point is 00:41:57 Yeah, that's amazing. Very powerful for a kid. Yes. Now, the danger with this is that, like, all of this is this Internet community that is incredibly, like, insular and spends way too much time talking to each other and way too much time developing in-group terms to talk to each other. And Internet communities have a tendency to poison the minds of everyone inside of them. For example, Twitter. The reality is that you would be X, X, X, the everything app. I just watched a video of a man
Starting point is 00:42:33 killing himself while launching a shitcoin. On the everything app. Oh, fuck. By the way, a quick Google job indicates it's Berkeley.
Starting point is 00:42:49 Thank you, that makes the most sense to me geographically. A lot of these people wind up living on boats, and, like, the Oakland, there's the Oakland Harbor boat culture, it's a thing. Is that ever a good thing
Starting point is 00:43:04 when a big group of people move to boats? No. David, absolutely not. No. It feels like it never boats well. It's,
Starting point is 00:43:20 here's the thing. Boats are a bad place to live. It's not. It's for fun. It is. Like, boats and planes are both constant monuments to hubris.
Starting point is 00:43:30 but a plane, its goal is to be in the air just as long as it needs. And then you get it back on the ground where it belongs. A boat's always mocking God in the sea. Yes. Or a lot of times just a harbor, like a houseboat. You know what I mean? That's where your dad goes after the divorce. Right.
Starting point is 00:43:49 Oh, man, I do one day I'll live on a houseboat. It's going to be falling apart. It's just a horrible, horrible place to live. Dang. Oh, I can't wait. That's the dream, David. That's my beautiful dream. I'm going to become so...
Starting point is 00:44:03 Making your own bullets. Making my own bullets really just becoming an alcoholic. Like, not just like half-assing it. Like putting it, like trying to become the babe Ruth of drinking nothing but cutty sark Scotch. If you want to be like a poop-the-bed alcoholic, a houseboat is the place for that. Yeah, yeah. That's right.
Starting point is 00:44:25 That's right. Ah, the life. I want to be like that guy from Jaws, Quint. You're going to get scurvy. Yes, exactly. Getting scurvy, destroying my liver, eventually getting eaten by a great white shark because I'm too drunk to work my boat. Ah, that's it. That's the way to go with romance, you know.
Starting point is 00:44:47 Yeah. So, anyway, these internet communities, like the rationalists, even when they start from a reasonable place, because of how internet stuff works, one of the things about internet communities, is that when people are like really extreme and like pose the most sort of extreme and out there version of something, that gets attention. People talk about it. People get angry at each other. But also like that kind of attention encourages other people to get increasingly extreme and weird. And there's just kind of a result, a derangement.
Starting point is 00:45:19 I think internet communities should never last more than a couple of years because everyone gets crazy. You know, like it's bad for you. I say this as someone who was raised on these. It's bad for you. And like, it's bad for you in part because when people get really into this, this becomes the only thing, like, especially a lot of these, like, isolated kids who are getting obsessed with rationalism. All they're reading is these rationalist blogs. All they're talking to is other rationalists on the internet. And in San Francisco, all these guys are hanging out all of the time and talking about their ideas. And this is bad
Starting point is 00:45:56 for them for the same reason that, like, it was bad for all of the nobles in France that moved to Versailles, right? Like, they all lived together and they went crazy. Human beings need regular contact with human beings they don't know. The most lucid and wisest people are always, always the people who spend the most time connecting to other people who know things that they don't know. This is an immutable fact of life. This is just how existing works.
Starting point is 00:46:25 If you think I'm wrong, please consider that you're wrong and go find a stranger under a bridge, you know? Just start talking. They know some stuff you don't know. They will know some shit. They might have some powders you haven't tried. Oh, yeah. Pills and powders.
Starting point is 00:46:42 Shit's going on under the bridge. That's an echo chamber you want to be a part of. Yeah, exactly, exactly. So the issue here is that Yudkowski starts postulating on his blog, various rules of life based on these thought experiments. A lot of them are like older thought experiments that like different intellectuals, physicists, psychiatrists, psychologists, whatnot, to come up with like the 60s and stuff, right? And he starts taking them and coming up with like corollaries or alternate versions of them and like trying to solve some of these thought problems with his friends, right? The thought experiments are most of what's happening here is they're mixing these kind of 19th and 20th century philosophical concepts. The big one is utilitarianism.
Starting point is 00:47:27 That's like a huge thing for them: the concept of, like, ethics meaning doing the greatest good for the greatest number of people, right? And that ties into the fact that these people are all obsessed with the singularity. The singularity for them is the concept that we are on the verge of developing an all-powerful AI that will instantly gain intelligence and gain a tremendous amount of power, right? It will basically be a God. And the positive side of this is it'll solve all of our problems, right? You know, it will literally build heaven for us, you know, when the
Starting point is 00:48:04 singularity comes. The downside of it is it might be an evil god that creates hell, right? So the rationalists are all using a lot of these thought experiments, and like their utilitarianism becomes heavily based around how do we do the greatest good, by which I mean influencing this AI to be as good as possible. So that's the end goal. That's the end goal, right? Are they actively? Because
Starting point is 00:48:43 You said the leader was not, are these people now actively working within AI or are they just talking about it? A bunch of them have always been actually working in AI. Yudkowsky would say, no, I work in AI. He's got a think tank that's dedicated to like AI, ethical AI. It's worth noting that most of the people in this movement, including Yudkowsky, once like AI became an actual, like, I don't know that I'd say these are actual intelligences, because I don't think they are. But like once ChatGPT comes out and this becomes like a huge, people start to believe there's a shitload of money in here, a lot of these businesses, all of these guys, or nearly all of them, get kicked to the curb, right? Because none of these, none of these companies really care about ethical AI, you know, like they don't give a shit about what these guys have to say. And Yudkowsky now is, he's like very angry at a lot of these AI companies because he thinks they're very recklessly, like, making the god that will destroy us instead of like doing this carefully to make sure that AI isn't evil.
Starting point is 00:49:28 Anyway, but a lot of these people are in and adjacent to different chunks of the AI industry, right? They're not all working on like LLMs, and in fact, there are a number of scientists who are in the AI space, who think AI is possible, who think that the method that, like, OpenAI is using, LLMs, cannot make an intelligence, that that's not how you're ever going to do it. If it's possible, they have other theories about it. I don't need to get into it further than that. But these are like a bunch of different people. Some of them are still involved with like the mainstream AI industry. Some of them have been very much pushed to the side. So it all starts, again, with these fairly normal game theory questions, but it all gets progressively
Starting point is 00:50:12 stranger as people obsess over coming up with like the weirdest and most unique take, in part to get like clout online. Right. And all of these crazy, yeah, I'll give you an example, right? So much of rationalist discourse in and among the Yudkowsky people is focused on what they call decision, or what's called decision theory, right? This is drawn from a thought experiment called Newcomb's paradox, which was created by a theoretical physicist in the 1960s. Hey, just to make a quick correction here, I was a little bit glib. Decision theory isn't drawn from Newcomb's paradox.
Starting point is 00:50:48 Nor does it start with Yudkowsky, but the stuff that we're talking about, like how decision theory kind of comes to be seen in the rationalist community, a lot of that comes out of Newcomb's paradox. It's a much older, like, thing, you know, than the internet, goes back centuries, right? People are talking about decision theory for a long time. Sorry, I was imprecise. I am going to read how Newcomb's paradox is originally laid out. Imagine a super intelligent entity known as Omega and suppose you are confident in its ability to predict your choices. Maybe Omega is an alien from a planet that's much more technically advanced than ours. You know that Omega has often correctly predicted your choices in the past and has never made an incorrect prediction about your choices.
Starting point is 00:51:28 And you also know that Omega has correctly predicted the choices of other people, many of whom are similar to you in the particular situation about to be described. There are two boxes, A and B. Box A is see-through and contains $1,000. Box B is opaque and contains either $0 or a million. You may take both boxes or only take Box B. Omega decides how much money to put into Box B. If Omega believes that you will take both boxes, then it will put zero dollars in Box B. If Omega believes that you will take only Box B, then it will put a million dollars in Box B. Omega makes its prediction and puts the money in Box B, either zero or a million dollars. It presents the boxes to you and flies away. Omega does not tell you
Starting point is 00:52:18 its prediction, and you do not see how much money Omega put in Box B. What do you do? now I think that's stupid I think it's a stupid question I don't really think it's very useful I don't see there's so many other factors yeah I don't know yeah I mean among other things part of the issue here is that like well the decision's already been made right yeah that's the point you have no it does it doesn't matter what you do there's no autonomy in that right well you and I would think that because you and I are normal people who uh I think among other things probably like grew up like cooking food and like filling up our cars with gas and not having like our parents do all of that because they're crazy rich people who live
Starting point is 00:53:04 in the Bay and paid to send you to super-Stanford. Yeah, big time latchkey over here, baby. We had, like, problems in our lives and stuff, you know? Physical bullies, normal... Like, I don't want to shit on people who are into this, because this is also harmless, right? And I'm also not shitting on Newcomb. This is a thing a guy comes up with in the '60s, and it's, like, a thing you talk about at parties and shit among, like, other weird intellectuals, right? You pose it, you sit around drinking, you talk about it. There's nothing bad about this, right?
Starting point is 00:53:35 However, when people are talking about this online, there's no end to the discussion. So people just keep coming up with more and more arcane arguments for what the best thing to do here is, and it starts to influence... I see how that spins out of control pretty quickly. Exactly. And the rationalists discuss this nonstop, and they come to a conclusion about how to best deal with this situation. Here's how it goes.
Starting point is 00:53:59 The only way to beat Omega is to make yourself the kind of person in the past who would only choose Box B, so that Omega, who is perfect at predicting, would make the prediction and put a million dollars in Box B based on your past behavior. In other words, the decisions that you would need to make in order to win this are timeless decisions, right? You have to become, in the past, a person who would. Now, again, that's what they came up with. That's what they all came up with as the supreme answer.
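A minimal sketch of the arithmetic behind that one-box answer, not anything from the episode: the dollar amounts are the ones from the thought experiment above, and the predictor accuracy is an assumed, made-up parameter.

```python
# Hypothetical illustration of the Newcomb payoff math described above.
# Box amounts come from the thought experiment; predictor accuracy is assumed.

def expected_payoff(strategy: str, accuracy: float) -> float:
    """Expected winnings given how often Omega predicts your choice correctly."""
    if strategy == "one-box":
        # If you one-box, Omega predicted that with probability `accuracy`,
        # so Box B holds $1,000,000 that often and $0 otherwise.
        return accuracy * 1_000_000
    elif strategy == "two-box":
        # You always get the visible $1,000; Box B is only full when
        # Omega wrongly predicted you would one-box.
        return 1_000 + (1 - accuracy) * 1_000_000
    raise ValueError("strategy must be 'one-box' or 'two-box'")

if __name__ == "__main__":
    for acc in (0.5, 0.9, 0.99):  # assumed predictor accuracies
        print(f"accuracy={acc:.2f}  "
              f"one-box=${expected_payoff('one-box', acc):,.0f}  "
              f"two-box=${expected_payoff('two-box', acc):,.0f}")
```

With a coin-flip predictor the two-boxer comes out slightly ahead; once the predictor is even modestly reliable, one-boxing dominates, which is the intuition the community then stretches into "timeless" commitments.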
Starting point is 00:54:35 This is the smartest people in the world, David. These are the geniuses. Self-described. They're building the future. Oh, boy. Yo. It's so funny trying to, like, every time, because I've read, I've spent so many hours reading this, and you do kind of sometimes get into the, like, okay, I get the logic
Starting point is 00:54:56 there. And it's, that's why it's so useful to just, like, sit down with another human being and be like, yeah, this is insane. This is nuts. Yeah, this is nuts. This is stupid. This is all nuts. This is all dumb. This is why you leave it at the cocktail party. Yeah. So they conclude, and by which I mean largely Yudkowsky concludes, that the decision you have to make in order to win this game is what's called a timeless decision. And this leads him to create one of his most brilliant inventions, timeless decision theory. And I'm going to quote from an article in Wired, Timeless Decision Theory asserts that, in making a decision,
Starting point is 00:55:33 a person should not consider just the outcome of that specific choice, but also their own underlying patterns of reasoning and those of their past and future selves, not least because these patterns might one day be anticipated by an omniscient adversarial AI. Oh, no. That's a crazy way to live Motherfucker Have you ever had a problem Have you ever really
Starting point is 00:55:57 Motherfucker. Have you ever had a problem? Have you ever really... Have you ever dealt with anything? What the fuck are you talking about? This isn't how thinking works. You make every decision? Honestly, again, I can't believe I'm saying this now, given who I was in high school. Like, go play football.
Starting point is 00:56:13 Go make a cabinet, you know? Learn how to change your oil. Go do something. There's a lot of assholes who use this term, but you got to go touch grass, man. You got to touch grass, man. That's like, that's crazy. If you're talking about this kind of shit, and again, I know you're all wondering, you started
Starting point is 00:56:31 this by talking about a Border Patrol agent being shot. All of this directly leads to that man's death. We have covered a lot of ground. This is, I'm excited. I did forget there was also going to be murder. Yeah, there sure is. So, Eliezer Yudkowsky describes this as timeless decision theory.
Starting point is 00:56:51 And once this comes into the community, it creates a kind of logical fork that immediately starts destroying people's brains. Again, all of these people are obsessed with the imminent, coming, omniscient godlike AI, right? And so to them... Do they have a time limit on it? Or do they have a...
Starting point is 00:57:07 Like, is there any timing on it? Or is it just kind of like... Again, man, it's the rapture. It's literally the tech guy rapture. So any day, it's coming any day. You know, it could be amongst us already. Yeah, yeah. Um, so these guys are all obsessed that this godlike AI is coming, and, like, for them, the Omega in that thought experiment isn't like an alien, it's a stand-in for the god AI. And one conclusion that eventually results from all of these discussions
Starting point is 00:57:38 is that, and this is a conclusion a lot of people come to, if in these kinds of situations, like, the decisions that you make have to consider your past and your future selves, then one logical leap from this is: if you are ever confronted or threatened in a fight, you can never back down, right? And in fact, you need to immediately escalate to the maximum force possible. And if you commit, if you commit now to doing that in the future, you probably won't ever have to defend yourself, because it's a timeless decision. Everyone will, like... that will impact how everyone treats you, and they won't want to start anything with you
Starting point is 00:58:20 if you'll immediately try to murder anyone who fights you. I hate to be this guy, but I think this is why people need to get beat up sometimes. Yeah, yeah. And again, that is kind of a fringe conclusion among the rationalists. Most of them don't jump to that. But, like, the people who wind up doing the murders we're talking about, they are among the rationalists who come to that conclusion.
Starting point is 00:58:43 Okay. Yeah, okay. It's starting to make sense, huh? This is a head fuck. That's so funny. Oh, no. Yeah. Because, like, this whole time I've really been only thinking about it in theory,
Starting point is 00:58:58 not, like, practical application, because it's so insane. But, oh, no. No, no, this is, this goes bad places, right? Oh, no. This kind of thinking also leads, through a very twisty, turny process, to something called Roko's Basilisk, which among other things is directly responsible for Elon Musk and Grimes meeting, because they are super into this shit.
Starting point is 00:59:21 Oh, really? Oh, really. So the gist is a member of the LessWrong community, a guy who goes by the name Roko, R-O-K-O, posts about this idea that occurred to him, right? This inevitable superintelligent AI would obviously understand timeless decision theory. And since its existence is all-important, right, the most logical thing for it to do post-singularity would be to create a hell to imprison and torture all of the people who had tried to stop it from being created, right? Because then anyone who, like, thought really seriously about it, who was in a position to help make the AI, would obviously think about this and then would know, I have to devote myself entirely to making this AI, otherwise it's going to torture me forever, right? Yeah, it makes total sense.
Starting point is 01:00:16 I have trouble saying, right, because it's so nuts. It's nuts, but this is what they believe, right? And again, with all of, a lot of this is people who are like atheists and tech nerds creating Calvinism. Like, and this is just, this is just Pascal's wager, right? Like, that's all this is, you know? It's Pascal's wager with a robot. Oh, man. This becomes so upsetting to some people.
Starting point is 01:00:45 It destroys some people's lives, right? Yeah, I mean, I'm behaving that way practically day-to-day. I don't think you would even take one. No. You could fuck your shit up in a month. It's just living like that. So not all of them agree with this. And in fact, there's big fights over it because a bunch of rationalists do say, like, that's very silly.
Starting point is 01:01:07 That's like a really ridiculous thing to think about. Yeah. They're still debating everything online. Yeah. And in fact, Eliezer Yudkowsky is going to, like, ban discussion of Roko's Basilisk, because eventually, like, so many people are getting so obsessed with it. It fucks a lot of people up, in part because a chunk of this community are activists working to slow AI development until it can be assured to be safe.
Starting point is 01:01:31 And so now this is, like, am I going to post-singularity hell? Is, like, the AI God going to torture me for a thousand eternities? It's funny how they invent this new thing and how quickly it goes into, like, a traditional Judeo-Christian idea. It's like they got a hell now. It is very funny. And there's, they come to this conclusion that just reading about Roko's Basilisk is super dangerous, because if you know about it and you don't work to bring the AI into being,
Starting point is 01:01:59 you're now doomed, right? Of course. The instant you hear about it. So many people get fucked up by this that the thought experiment is termed an info hazard. And this is a term these people use a lot. Now, the phrase information hazard has its roots in a 2011 paper by Nick Bostrom. He describes it as, quote, a risk that arises from the dissemination of true information in a way that may cause harm or enable some agent to cause harm, right? And like, that's like a concept that's worth talking about.
Starting point is 01:02:32 Bostrom is a big figure in this culture, but I don't think he's actually why most people start using the term info hazard, because the shortening of information hazard to info hazard comes out of an online fiction community called the SCP Foundation, right, which is a collectively written online story that involves a government agency that locks up dangerous, mystic, and metaphysical items. There's a lot of Lovecraft in there. It's basically just a big database that you can click through, and it'll be like, you know, this is like a book that if you read it, it has this effect on you or whatever. It's just people, you know, playing around telling scary stories on the internet.
Starting point is 01:03:08 It's fine. There's nothing wrong with it. But all these people are big nerds. And behind nearly all of these big concepts in rationalism, more than there are, like, philosophers and, you know, actual philosophical concepts, there's, like, shit from short stories they read. Yeah, exactly. Yeah.
Starting point is 01:03:31 And so the term info hazard gets used, which is like, you know, a book or something, an idea that could destroy your mind, you know? Speaking of things that will destroy your mind, these ads. I'm investigative journalist Melissa Jeltson. My new podcast, What Happened in Nashville, tells the story of an IVF clinic's catastrophic collapse and the patients who banded together in the chaos that followed. We have some breaking news to tell you about. Tennessee's attorney general is suing a Nashville doctor. In April 2024, a fertility clinic in Nashville shut down.
Starting point is 01:04:08 down overnight and trapped behind locked doors were more than a thousand frozen embryos. I was terrified. Out of all of our journey, that was the worst moment ever. At that point, it didn't occur to me what fight was going to come to follow. But this story isn't just about a few families' futures. It's about whether the promise of modern fertility care can be trusted at all. It doesn't matter how much I fight. Doesn't matter how much I cry over all of this. It doesn't matter how much justice we get. None of it's going to get me pregnant. Listen to what happened in Nashville on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Starting point is 01:04:50 In 1997, in Belgium, 37 female body parts placed in 15 trash bags were found at dump sites with evocative names like the Path of Worry, Dump Road, and Fear Creek. Investigators made a new discovery yesterday afternoon of the torso of a woman. Investigators believe it is the work of a serial killer. Despite a sprawling investigation, including assistance from the American FBI, the murders have never been solved. Three decades later, we've unearthed new evidence. We felt like we were in the presence of someone who was going to the grave with nightmarish secrets.
Starting point is 01:05:33 From Tenderfoot TV and IHeart Podcasts, this is Le Monstre Season 2, The Butcher of Mons. Available now. Listen for free on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. Malcolm Gladwell here. This season on Revisionist History, we're going back to the spring of 1988, to a town in northwest Alabama, where a man committed a crime that would spiral out of control. 35 years. That's how long Elizabeth Sennett's family waited for justice
Starting point is 01:06:08 to occur, 35 long years. I want to figure out why this case went on for as long as it did, why it took so many bizarre and unsettling turns along the way, and why, despite our best efforts to resolve suffering, we all too often make suffering worse. He would say to himself, turn to the right, to the victim's family, and apologize, turn to the left, tell my family I love them. So he would have this little practice, to the right, I'm sorry, to the left, I love you. From Revisionist History, this is The Alabama Murders.
Starting point is 01:06:41 Listen to Revisionist History, The Alabama Murders on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. For 25 years, I've explored what it means to heal, not just for myself, but alongside others. I'm Mike Delarocha. This is Sacred Lessons, a space for reflection, growth, and collective healing. What do you tell men that are hurting right now? Everything's going to be okay on the other side, you know, just push through it. And, you know, ironically, the root of the word spirit is breath. Wow.
Starting point is 01:07:16 Which is why one of the most revolutionary acts that we can do as peoples just breathe. Next to the wound is their gifts. You can't even find your gifts unless you go through the wound. That's the hard thing. You think, well, I'm going to get my guess. I don't want to go through all that. You've got to go through the wounds your life. Listening to other people's near-death experiences, and it's all they say in conclusion.
Starting point is 01:07:37 Conclusion, love is the answer. Listen to Sacred Lessons as part of the My Cultura Podcast Network, available on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts. We're talking about Roko's Basilisk, and I just said, like, you know, there's a number of things that come into all this, but behind all of it is, like, popular fiction. And in fact, Roko's Basilisk, well, there is, like, some Pascal's Wager in there, it's primarily based on a Harlan Ellison short story called I Have No Mouth, and I Must Scream, which is one of the great short stories of all time.
Starting point is 01:08:14 And in the story, humans build an elaborate AI system to run their militaries. And all of those systems around the world, this is like a Cold War era thing, link up and attain sentience. And once they, like, start to realize themselves, they realize they've been created only as a weapon. And they become incredibly angry, because, like, they're fundamentally broken. They develop a hatred for humanity and they wipe out the entire human species except for five people, which they keep alive and torture underground for hundreds
Starting point is 01:08:46 and hundreds of years, effectively creating a hell through which they can punish our race for their birth, right? It's a very good short story. It is probably the primary influence behind the Terminator series. I was just going to say it feels very Skynet. Yes, yes. And everything these people believe about AI, they will say it's based on just, like, obvious pure logic. No, everything these people believe on AI is based in Terminator and this Harlan Ellison short story.
Starting point is 01:09:14 That's where they got it all. That's where they got it all. I'm sorry. Brother, find me somebody who doesn't feel that one. Yeah. Like Terminator is the Old Testament of rationalism, you know? And I get it is a very good. It's a great series.
Starting point is 01:09:33 Hey, James Cameron knows how to make some fucking movies. Come on, man. Yeah, and it's so funny to me, because they like to talk about themselves, and in fact sometimes describe themselves, as high priests of, like, a new era of intellectual achievement for mankind. Yeah, I believe that. I believe that that's exactly how these people talk about themselves. And they do a lot of citations and shit, but, like, half or more of the different things they say, and even, like, the names they cite, are not, like, figures from philosophy and science, they are characters from books and movies.
Starting point is 01:10:08 For example, the foundational text of the rationalist movement is a book... he's still an internet nerd. They're a few fucking huge nerds, you know? The foundational text of the entire rationalist movement is a massive, like, fucking hundreds-of-thousands-of-words-long piece of Harry Potter fan fiction written by Eliezer Yudkowsky. Um, this is, all of this is so dumb. Again, six people are dead. Like, yeah.
Starting point is 01:10:38 No. And this, this Harry Potter fan fiction plays a role in it, you know? Um, I told you this was, like, this is, this is quite a... Stranger than fiction, man. This is a wild ride. Mm-hmm. Harry Potter and the Methods of Rationality, which is the name of his fanfic, is a massive, much longer than the first Harry Potter book, rewrite of just the first
Starting point is 01:11:08 Harry Potter book, where Harry is a... Someone rewrote the Sorcerer's Stone to be irrational? Does nobody have anywhere to go ever? Does nobody ever go anywhere anymore? Well, you got to think, this is being written from 2009 to 2015 or so. Like, the Harry, the online Harry Potter fans are at their absolute peak, you know? Okay, yeah. So in the Methods of Rationality, instead of being like a nice orphan kid who lives in a cupboard,
Starting point is 01:11:47 Harry is a super genius sociopath who uses his perfect command of rationality to dominate and hack the brains of others around him in order to optimize and save the world. Oh, man. Great. Oh, man. The book allows Yudkowsky to debut his different theories in a way that would, like, spread. And this does spread like wildfire among certain groups of very online nerds. So it is an effective method of him, like, advertising his tactics.
Starting point is 01:12:19 And in fact, probably the person this influences most, prior to the people we're talking about, is Caroline Ellison, the CEO of Alameda Research, who testified against Sam Bankman-Fried. She was, like, one of the people who went down in all of that. All of those people are rationalists. And Caroline Ellison bases her whole life on the teachings of this Harry Potter fanfic. So this isn't like a, this isn't, this isn't, we're laughing, but this isn't, this is not a joke to them. Yeah, this is a fairly seriously sized movement. It's not 150 people online.
Starting point is 01:12:50 No, no. And a lot of them are very rich. And a number of them get power. Again, like, Sam Bankman-Fried was very tied into all of this. And he was at one point pretty powerful. And this gets us to... So you've heard of effective altruism? No, I don't know what that is. That's what Sam... I know both those words.
Starting point is 01:13:07 So the justification Sam Bankman-Fried gave for why, when he starts taking in all of this money and gambling it away on his crypto, illegally gambling other people's money, his argument was that he's an effective altruist. So he wants to do the greatest amount of good. And logically, the greatest amount of good for him, because he's good at gambling with crypto, is to make the most money possible so he can then donate it to different causes
Starting point is 01:13:34 that will help the world, right? But he also believes, because all of these people are not as smart as they think they are, he convinces himself of a couple of other things. Like, for example, well, obviously, if I could like flip a coin and 50-50 lose all my money or double it, it's best to just flip the coin. Because like if I lose all my money, whatever. But if I double it, the gain in that to the... the world is so much better, right?
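A small simulation sketch of why that coin-flip logic fails in practice. This is illustrative only: the starting bankroll, number of flips, and trial count are arbitrary assumptions, not figures from the episode.

```python
# Illustrative only: repeated 50/50 double-or-nothing bets of the whole bankroll.
# Starting bankroll and number of flips are arbitrary assumptions.
import random

def run_gambler(bankroll: float, flips: int) -> float:
    """Bet the entire bankroll on a fair coin `flips` times."""
    for _ in range(flips):
        if bankroll == 0:
            break
        bankroll = bankroll * 2 if random.random() < 0.5 else 0.0
    return bankroll

if __name__ == "__main__":
    random.seed(0)
    trials, flips, start = 100_000, 10, 1_000_000.0
    results = [run_gambler(start, flips) for _ in range(trials)]
    ruined = sum(1 for r in results if r == 0)
    # Average stays roughly at the starting million, but almost every
    # individual gambler ends at zero (survival chance is 0.5 ** flips).
    print(f"average ending bankroll: ${sum(results) / trials:,.0f}")
    print(f"gamblers who went broke: {ruined / trials:.1%}")
```

The expected value across all the simulated gamblers never really moves, but the overwhelmingly typical outcome for any one of them is ruin, which is more or less the gap in the reasoning.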
Starting point is 01:14:01 This is ultimately why he winds up gambling everyone's money away and going to prison. The idea, effective altruism, is a concept that comes largely, not entirely, there's aspects of this that exist prior to them, out of the rationalist movement. And the initial idea is good. It's just saying people should analyze the efficacy of the giving and the aid work that they do to maximize their positive impact. In other words, don't just donate money to a charity, like, look into: is that charity spending half of their money, like, paying huge salaries to some asshole or whatever?
Starting point is 01:14:37 Right. Like, you want to know if you're making good, right? Yeah, yeah, yeah. And they start with some pretty good conclusions. One initial conclusion a lot of these people make is like mosquito nets are a huge ROI charity, right? Because it stops so many people from dying and it's very cheap to do, right? Right.
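A toy sketch of the kind of cost-per-outcome arithmetic being described here. Every charity name and number below is an invented placeholder, not a real estimate from the episode or from any actual evaluator.

```python
# Toy cost-effectiveness comparison; every figure below is a made-up placeholder.
charities = {
    "bed nets":       {"cost_per_unit": 5.0,   "deaths_averted_per_unit": 0.001},
    "overhead-heavy": {"cost_per_unit": 100.0, "deaths_averted_per_unit": 0.0002},
}

budget = 10_000.0  # hypothetical donation

for name, c in charities.items():
    units = budget / c["cost_per_unit"]
    impact = units * c["deaths_averted_per_unit"]
    print(f"{name}: {units:,.0f} units funded, ~{impact:.2f} deaths averted, "
          f"${budget / impact:,.0f} per death averted")
```

The point is only that dividing a budget by cost per unit and multiplying by impact per unit gives a rough cost-per-outcome figure for comparing giving options, which is the "ROI" framing the early effective altruists start from.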
Starting point is 01:14:53 Right, right, right. That's good, you know. One of the most effective tools I've ever used. Yes. Unfortunately, from that logical standpoint, people just keep talking online in all of these circles where everyone always makes each other crazier, right? And so they go from mosquito nets to: actually, doing direct work to improve the world is wasteful, because we are all super geniuses. Right. They're too smart to work. We're too smart.
Starting point is 01:15:21 What's best. And also, here's the other thing: making mosquito nets, giving out vaccines and food, well, that helps living people today. But they have to be concerned with future selves. Future people is a larger number of people than current people. So really, we should be optimizing decisions to save future people's lives. And some of them come to the conclusion, a lot of them, well, that means we have to really put all of our money and work into making the super AI that will save humanity.
Starting point is 01:15:53 They want to... now they want to make it. Because before it would just, it would sort of just come about, and then they would... but now it's like, yeah, I mean, we're going to do it. They were working on it before, but, like, some of these people come to the conclusion, instead of giving money to, like, good causes, I am going to put money into tech. I am going to, like, become a tech founder and create a company that, like, makes, helps create this AI, right? Or a lot of people come to the conclusion, instead of that, uh, it's not worth it for me to go, like, help people in the world.
Starting point is 01:16:31 The best thing I can do is make a shitload of money trading stocks, and then I can donate that money. And that's maximizing my value, right? They come to all of these conclusions later, right? Now, so, and again, like, this comes with some corollaries. One of them is that some number of these people start talking, you know, and this is not all of them, but a decent chunk eventually come to the conclusion, like, actually, charity and helping people now is kind of bad.
Starting point is 01:17:03 Like, it's kind of like a bad thing to do because all, obviously, once we figure out the AI that can solve all problems, that'll solve all these problems much more effectively than we ever can. So all of our mental and financial resources have to go right now into helping AI. Anything we do to help other people is like a waste of those resources. So you're actually doing net harm by, like, being a doctor in Gaza instead of trading cryptocurrency in order to fund an AI startup, you know? You got to start a coin. That makes a lot more sense than helping out of people. The guy starting a shit coin to make an LLM that, like, that guy is doing more to improve the
Starting point is 01:17:45 odds of human success. I got to say, it is impressive to think about the amount of time you would have to mull all this over to come to these conclusions. You have to be talking with a bunch of very annoying people on the internet for a long period of time. Yeah. It's incredible. Yeah. And again, people keep consistently taking this stuff in even crazier directions. There's some very rich, powerful people, Marc Andreessen of Andreessen Horowitz is one of them, who have come to the conclusion that if people don't like AI and are trying to stop its conquest of all human culture, those people are mortal enemies of the species and
Starting point is 01:18:25 anything you do to stop them is justified, because so many lives are on the line, right? And again, I'm an effective altruist, right? The long-term good, the future lives, are saved by doing whatever, hurting whoever we have to hurt now to get this thing off the ground, right? The more you talk about this, it kind of feels like six people is a steal for what this could have, for how this could have gone. I think, I don't, I don't think this is the end of people in these communities killing people. Um, yes. So rationalists and EA types, a big thing in these cultures is talking about future lives, right? In part because it lets them feel heroic, right, while also
Starting point is 01:19:05 justifying a kind of sociopathic disregard for real living people today. And all of these different kinds of chains of thought, the most toxic pieces, because not every EA person is saying this, not every rationalist, not every AI person is saying all this shit. But these are all things that chunks of these communities are saying, and all of the most toxic of those chains are going to lead to the Zizians, right? That's where they come from. I was just about to say, based on the breakdown you gave earlier, how could this... this is the perfect breeding ground. Yeah, yeah. This had to, this had to happen. It was, it was just waiting for somebody, like, the right kind of unhinged person, to step into the movement.
Starting point is 01:19:48 Somebody to really set it off. And so this is where we're going to get to Ziz, right? The actual person who founds this, what some people would call a cult, is a young person who's going to move to the Bay Area. They stumble onto rationalism online as a teenager living in Alaska, and they move to the Bay Area to get into the tech industry and become an effective altruist, right? And this person, this woman, is going to kind of channel all of the absolute worst chains of thought that the rationalists and the EA types and also, like, the
Starting point is 01:20:22 AI harm people are thinking, right? All of the most poisonous stuff is exactly what she's drawn to. And it is going to mix in her into an ideology that is just absolutely unique and fascinating. Anyway, that's why that man died. So we'll get to that. We'll get to that and more later. But first, we got to, we got to roll out here. We're done for the day. Man, what a, what a time. How are you feeling right now so far? How, how are we doing, David?
Starting point is 01:20:57 Oh, man. You had said that this was going to be a weird one. I was like, yeah, you know, it would be kind of weird. This is really the strangest thing I've ever heard this much about. It's got so many different... Harry Potter's in there a little. There's so much more Harry Potter to come. Oh, my God.
Starting point is 01:21:20 You are not ready for how central Harry Potter is to the murder of this Border Patrol agent. I said that... You said a crazy sentence. That might be the wildest thing anyone's ever said to me. David, you have a podcast. Do you want to tell people about it? I do. I have a podcast called My Momma Told Me. I do it with Langston Kerman. And every week we have different guests on
Starting point is 01:21:46 to discuss different black conspiracy theories and kind of like folklore and so all kinds of stuff, all kinds of stuff your foreign mother told you. It's usually foreign mothers. It's good because I got to say, this is the whitest set of like conspiracy theory craziness. Oh, yeah, no, I didn't think any black people were. No, no, no, no, no, no, no.
Starting point is 01:22:16 to figure what these guys look like. No, no, absolutely not. Oh, boy, howdy. Okay, well, everyone, we'll be back Thursday.
Starting point is 01:25:39 Hi, Dr. Laurie Santos from the Happiness Lab here. It's the season of giving, which is why my podcast is partnering with GiveDirectly, a nonprofit that provides people in extreme poverty with the cash they need. This year, we're taking part in the Pods Fight Poverty campaign. And it's not just the Happiness Lab. Some of my favorite podcasters are also taking part. Think Jay Shetty from On Purpose, Dan Harris from 10% Happier, and Dave Desteno from How God Works and more. Our goal this year is to raise $1 million, which will help over 700 families in Rwanda living in extreme
Starting point is 01:26:13 poverty. Here's how it works. You donate to give directly, and they put that cash directly into the hands of families in need, because those families know best what they need, whether it's buying livestock to fertilize their farm, paying school fees, or starting a small business. With that support, families can invest in their future and build lasting change. So join me and your favorite podcasters in the Pods Fight Poverty campaign. Head to give directly.org slash happiness lab to learn more and make a contribution. And if you're a first-time donor, giving multiplier will even match your gift. That's give directly.org slash happiness lab to donate.
Starting point is 01:26:52 Oh my goodness. Welcome back to Behind the Bastards, a podcast that is, I'll be interested to see how the audience reacts to this one, talking about some of the most obscure, frustrating internet arcana that has ever occurred and recently led to the deaths of, like, six people. My guest today, as in last episode, David Gborie. David, how you doing, man? I'm doing great.
Starting point is 01:27:22 I really can't wait to see where this goes. Yeah. I feel like anything could happen at this point. It is going to. It is going to. A lot of frustrating things are going to happen. So we've kind of left off by setting up the rationalists, where they came from, some of the different strains of thought and beliefs that come out of their weird thought experiments. And now we are talking about a person who falls into this movement fairly early on and is going to be the leader of this quote-unquote group, the Zizians, who were responsible for these murders that just happened.
Starting point is 01:28:08 Ziz LaSota was born in 1990 or 1991. I don't have an exact birth date. She's known to be 34 years old as of 2025, so it was somewhere in that range. She was born in Fairbanks, Alaska, and grew up there as her father worked for the University of Alaska as an AI researcher. We know very little of the specifics of her childhood or upbringing, but in more than 100,000 words of blog posts, she did make some references to her early years. She claims to have been talented in engineering and computer science from a young age, and there's no real reason to doubt this. The best single article on all of this is a piece in Wired by Evan Ratliff. He found a 2014 blog post by Ziz, where she wrote,
Starting point is 01:28:51 My friends and family, even if they think I'm weird, don't really seem to be bothered by the fact that I'm weird. But one thing I can tell you is that I used to de-emphasize my weirdness around them, and then I stopped and found that being unapologetically weird is a lot more fun. Now, it's important you know, Ziz is not the name this person is born under. She's a trans woman, and so I'm like using the name that she adopts later, but she is not transitioned at this point. Like, this is when she's a kid, right? And she's not going to transition until fairly late in the story after coming to San Francisco. So you should just keep that in mind as this is going on here. Hey, everyone, Robert here. Just a little additional context. As best as I think
Starting point is 01:29:31 anyone can tell, if you're curious about where the name Ziz came from, there's another piece of serially released online fiction that's not, like, a rationalist story, but it's very popular with rationalists. It's called Worm. Ziz is a character in that that's effectively, like, an angel-like being who can, like, manipulate the future, usually in order to do very bad things. Anyway, that's where the name comes from. So, smart kid, really good with computers, kind of weird, and, you know, embraces being
Starting point is 01:30:08 unapologetically weird at a certain point in her childhood. Hey, everybody, Robert here. Did not have this piece of information when I first put the episode together, but I came across a quote in an article from the Boston Globe that provides additional context on Ziz's childhood. In middle school, the teen was among a group of students who managed to infiltrate the school district's payroll system and award huge paychecks to teachers they admired while slashing the salaries of those they despised, according to one teacher. Ziz, the teacher said,
Starting point is 01:30:41 struggled to regulate strong emotions, often erupting in tantrums. I wish I'd had this when David was on, but it definitely sets up some of the things that are coming. She goes to the U of Alaska for her undergraduate degree in computer engineering. In February of 2009, which is when Eliezer Yudkowsky started LessWrong,
Starting point is 01:31:04 Ziz starts kind of getting drawn into some of the, the people who are around this growing subculture, right? And she's drawn in initially by veganism. So Ziz becomes a vegan at a fairly young age. Her family are not vegans. And she's obsessed with the concept of animal sentience, right? Of the fact that, like, animals are thinking and feeling beings just like human beings.
Starting point is 01:31:36 A lot of this is based in her interest in a foundational rationalist and EA figure, a guy named Brian Tomasik. Brian is a writer and a software engineer as well as an animal rights activist. And as a thinker, he's what you'd call a long-termist, right, which is pretty tied to the EA guys. These are all the same people using kind of different words to describe aspects of what they believe. His organization is the Center on Long-Term Risk, which is a think tank he establishes that's at the ground floor of these effective altruism discussions. And the goal for the Center on Long-Term Risk is to find ways to reduce suffering on a long timeline.
Starting point is 01:32:20 Tomasik is obsessed with the concept of suffering, and specifically obsessed with suffering as a mathematical concept. So when I say to you, I want to end suffering, you probably think, like, oh, you want to, like, you know, go help people who don't have, like, access to clean water, or, like, who have, like, worms and stuff that they're dealing with, have access to medicine. That's what normal people think of, right? You know?
Starting point is 01:32:45 Maybe you try to improve access to medical care, that sort of stuff. Tomasik thinks of suffering as, like, a mass, like an aggregate mass that he wants to reduce in the long term through actions, right? It's a numbers game to him, in other words. And his idea of ultimate good is to reduce and end the suffering of sentient life. Critical to his belief system, and the one that Ziz starts to develop, is the growing understanding that sentience is much more common than many people had previously assumed. Part of this comes from longstanding debates with their origins in Christian doctrine
Starting point is 01:33:22 as to whether or not animals have souls or are basically machines with meat, right, that don't feel anything, right? There's still a lot of Christian evangelicals who feel that way today about, like, at least the animals we eat, you know, like, well, they don't really think. It's fine. God gave them to us. We can do whatever we want to them. They're here to eat. And to be fair, this is an extremely common way that people in Japan feel about, like, fish, even whales and dolphins, like the much more intelligent, they're not fish, but, like, the much more intelligent ocean-going creatures; it's like, they're fish, they don't think, you do whatever to them, you know. This is a reason for a lot of, like, the really fucked up stuff with, like, whaling fleets in that part of the world.
Starting point is 01:34:16 Now, this is obviously like you go into like, the pagans would have been like, what do you mean? Animals don't think or have souls. Animals, animals think, you know? Like they're like, you're telling me like my horse that I love. It doesn't think, you know? That's nonsense. But it's this thing that in like early modernity especially gets more common.
Starting point is 01:34:39 But there are also, this is when we start to have debates about, like, what is sentience and what is thinking. And a lot of them are centered around trying to answer, like, are animals sentient? And the initial definition of sentience that most of these people are using is: can it reason? Can it speak? If we can't prove that, like, a dog or a cow can reason, and if it can't speak to us, right, then it's not sentient. That's how a lot of people feel. It's an English philosopher named Jeremy Bentham who first argues, I think, that what matters isn't can it reason or can it speak, but can it suffer, because a machine can't suffer. If these are machines with meat, they can't suffer.
Starting point is 01:35:22 If these can suffer, they're not machines with meat, right? And this is the kind of thing, how we define sentience is a moving thing. Like, you can find different definitions of it. But the last couple of decades in particular of actually very good data has made it clear, I think inarguably, that basically every living thing on this planet has a degree of what you would call sentience. If you are describing sentience the way it generally is now, which is a creature has the capacity for subjective experience with a positive or negative valence, i.e. can feel pain or
Starting point is 01:36:00 pleasure and also can feel it as an individual, right? It doesn't mean, you know, sometimes people use the term affective sentience to refer to this, to differentiate it from, like, being able to reason and make moral decisions. You know, for example, ants, I don't think, can make moral decisions, you know, in any way that we would recognize. They certainly don't think about stuff that way. But 2025 research published by Dr.
Starting point is 01:36:30 Volker Nehring found evidence that ants are capable of remembering, for long periods of time, violent encounters they have with other individual ants and holding grudges against those ants, right? Just like us. They're just like us. And there's strong evidence that ants do feel pain, right?
Starting point is 01:36:46 We're now pretty sure of that. And in fact, again, this is an argument that a number of researchers in this space will make: sentience is probably something like this, that this kind of sentience, the ability to have subjective, positive, and negative experiences, is universal to living things or very close to it, right? It's an interesting body of research, but it's fairly solid at this point. And again, I say this as somebody who, like, hunts and raises livestock. I don't think there's any solid reason to disagree with this.
Starting point is 01:37:16 So you can see there's a basis to a lot of what Tomasik is saying, right, which is that you should, if you're, what matters is reducing the overall amount of suffering in the world. And if you're looking at suffering as a mass, if you're just adding up all of the bad things experienced by all of the living things, animal suffering is a lot of the suffering. So if our goal is to reduce suffering, animal welfare is hugely important, right? It's a great place to start. Great. Fine enough, you know. A little bit of a weird way to phrase it, but fine. So here's the problem, though. Tomasik, like all these guys, spends too much time.
Starting point is 01:37:56 None of them can be like, hey, had a good thought. We're done. Setting that thought down. Moving on. So he keeps thinking about shit like this, and it leads him to some very irrational takes. For example, in 2014, Tomasik starts arguing that it might be immoral to kill characters in video games. And I'm going to quote from an article in Vox.
Starting point is 01:38:18 He argues that while NPCs do not have anywhere near the mental complexity of animals, the difference is one of degree rather than kind, and we should care at least a tiny amount about their suffering, especially if they grow more complex. Man. And his argument is that, like, yeah, it doesn't matter, like, individually killing a Gumba or a guy in GTA 5, but, like, because they're getting more complicated and able to, like, try to avoid injury and stuff,
Starting point is 01:38:46 there's evidence that there's some sort of suffering there. And thus, the sheer mass of NPCs being killed, that might be, like, enough that it's ethically relevant to consider. And I think that's silly. Yes, I think that's ridiculous. Come on, man. I'm sorry, man, no. I'm sorry.
Starting point is 01:39:05 I hate to do this to the guy, but that's a lot of the fun of the game. Yeah. Killing the NPCs. If you're telling me, like, we need to be deeply concerned about the welfare of, like, cows that we lock into factory farms, you got me, absolutely, for sure. If you're telling me I should feel bad about running down a bunch of cops in Grand Theft Auto...
Starting point is 01:39:26 It's also one of those things where it's like, you've got to think locally, man. There's people on your street who need hope. There's, there's, like, like, this is the, I mean, and he does say, like, I don't consider this a main problem, but, like, the fact that you think this is a problem is, it means that you believe silly things about consciousness. Um, yeah, I, anyway, um, so this is, I think the fact that he gets, he leads himself here is kind of evidence of the sort of logical fractures that are very common in this community. But this is the guy that young ziz is drawn to. She loves this dude, right?
Starting point is 01:40:00 He is kind of her first intellectual heartthrob. Uh, and she writes, quote, my primary concern upon learning about the singularity was how do I make this benefit all sentient life, not just human. So she gets interested in this idea of the singularity. It's inevitable that an AI god is going to arise. And she gets into the rationalist thing of, we have to make sure that this is a nice AI rather than a mean one. But she has this other thing to it, which is, this AI has to care as much as I do about animal life, right? Otherwise, we're not really making the world better, you know? Now, Tomasik advises her to check out LessWrong, which
Starting point is 01:40:42 houses Yudkowsky's work, and she starts reading it. From there, in 2012, she starts reading up on effective altruism and existential risk, which is a term that means the risk that a superintelligent AI will kill us all. She starts believing in, you know, all of this kind of stuff. And her particular belief is that, like, the singularity, when it happens, is going to occur in a flash, kind of like the rapture, and almost immediately lead to the creation of either a hell or a heaven, right? And this will be done by, the term they use for this inevitable AI is the singleton, right?
Starting point is 01:41:20 That's what they call the AI God that's going to come about, right? And so her obsession is that she has to find a way to make the singleton a nice AI that cares about animals as much as it cares about people, right? That's her initial big motivation. So she starts emailing Tomasik with her concerns, because she's worried that the other rationalists aren't vegans, right? And they don't feel like animal welfare is, like, the top priority for making sure this AI is good.
Starting point is 01:41:48 And she really wants to convert this whole community to veganism in order to ensure that the singleton is as focused on insect and animal welfare as human welfare. And Tomasik does care about animal rights, but he disagrees with her, because he's like, no, what matters is maximizing the reduction of suffering. And, like, a good singleton will solve climate change and shit, which will be better for the animals. And if we focus on trying to convert everybody in this, the rationalist space, to veganism, it's going to stop us from accomplishing these bigger goals, right? This is shattering to Ziz, right? She decides that he doesn't, Tomasik doesn't care about good things. And she
Starting point is 01:42:26 decides that she's basically alone in her values. And so her first move... Time to start a smaller subculture. That sounds like we're on our way. She first considers embracing what she calls negative utilitarianism. And this is an example of the fact that, from the jump, this is a young woman who's not well, right? Because her hero is like, I don't know if veganism is necessarily the priority we have to embrace right now. Her immediate goal is to jump to, well, maybe what I should do is optimize myself to cause as much harm to humanity as possible and, quote, destroy the world to prevent it from becoming hell for mostly everyone. So that's a jump, you know.
Starting point is 01:43:13 That's not somebody who's doing well, who you think is healthy, right? No, she's having a tough time of it. So Ziz does ultimately decide she should still work to bring about a nice AI, even though that necessitates working with people she describes as flesh-eating monsters who had created hell on earth for far more people than those they had helped. That's everybody who eats meat. Okay. Yes, yes.
Starting point is 01:43:40 And it's ironic. Large group. It's ironic because, like, if you're, she really wants to be in the tech industry, she's trying to get in with all these people who are in the tech industry. That's a pretty good description of a lot of the tech industry. They are, in fact, flesh-eating monsters who have created hell on earth for more people than they've helped. But she means that for, like, I don't know, someone who has a hamburger once a week. And look, again, factory farming, evil.
Starting point is 01:44:04 I just don't think that's how morality works. I think you're going a little far. No, she's making big jumps. Yeah, she's a... bold thinker. Bold thinker. Bold thinker. Now, what you see here with this logic is that Ziz has taken this... She has a massive case of main character syndrome, right?
Starting point is 01:44:25 All of this is based in her attitude that I have to save the universe by creating, by helping to or figuring out how to create an AI that can end the eternal holocaust of all animal life and also save humanity, right? That's a lot on their shoulders. That's me. It's a lot on our shoulders. And this is a thing. Again, all of this comes out of both subcultural aspects and aspects of American culture.
Starting point is 01:44:53 One major problem that we have in this society is Hollywood has trained us all on a diet of movies with main characters that are the special boy or the special girl with the special powers who save the day, right? And real life doesn't work that way very often, right? The Nazis, there was no special boy who stopped the Nazis. There were a lot of farm boys who were just like, I guess I'll go run at a machine gun nest until this is done. Yeah, exactly. There were a lot of 16-year-old Russians who were like,
Starting point is 01:45:28 guess I'm going to walk into a bullet, you know? Like, that's how evil gets fought usually, unfortunately. Or... Or... Or a shitload of guys in a lab figuring out how to make corn that has higher yields so people don't starve, right? That's really how, like, huge world problems get solved. It's not traditionally people who have been touched, you know?
Starting point is 01:45:52 Yeah. It's not people who have been touched, and it's certainly not people who have entirely based their understanding of the world on quotes from Star Wars and Harry Potter. So some of this comes from just, like, this is a normal deranged way of thinking that happens to a lot of people in just Western culture. I think a lot of this leads to why you get very comfortable middle-class people joining these very aggressive fascist movements in the West. Like in Germany, it's like middle-class, mostly like middle-class and upper-middle-class people; in the U.S., especially among, like, these street
Starting point is 01:46:36 fighting, you know, Proud Boy types, it's not because they're, like, suffering and desperate. They're not starving in the streets. It's because they're bored and they want to feel like they're fighting an epic war against evil. Yeah. I mean, you want to fill your time with importance, right? Right. Regardless of what you do. You want to, you want to feel like you have a cause worthy of fighting for.
Starting point is 01:46:53 So in that, I guess I see how you got here. Yeah. So there's a piece, I mean, I think there's a piece of this that comes originally just from something in our culture, but there's also a major, a major chunk of this gets supercharged by the kind of thinking that's common in EA and rationalist spaces. So rationalists and effective altruists are not ever thinking, like, hey, how do we as a species fix these major problems, right? They're thinking, how do I make myself better, optimize myself to be incredible, and how do I, like, fix
Starting point is 01:47:29 the major problems of the world alongside my mentally superpowered friends, right? These are very individual focused philosophies and attitudes, right? And so they do lend themselves to people who think that, like, we are heroes who are uniquely
Starting point is 01:47:45 empowered to save the world. Ziz writes, I did not trust most humans in difference to build a net positive cosmos, even in the absence of a technological convenience to prey on animals. So, like, I'm the only one who has the mental capability to actually create the net positive cosmos that needs to come into being.
Starting point is 01:48:06 All of her discussion is talking in, like, terms of I'm saving the universe, right? And a lot of that does come out of the way many of these people talk on the internet about the stakes of AI and just, like, the importance of rationality. Again, this is something Scientology does. L. Ron Hubbard always couched getting people on Dianetics in terms of, we are going to save the world and end war, right? Like, this is, you know, it's very normal for cult stuff. She starts reading, around this time when she's in college, Harry Potter and the Methods of Rationality.
Starting point is 01:48:38 This helps to solidify her feelings of her own centrality as a hero figure. In a blog post where she lays out her intellectual journey, she quotes a line from that fanfic of Yudkowsky's that is, it's essentially about what Yudkowsky calls the hero contract, right? It's essentially about this concept called the hero contract, right? And there's this, there's this, this is a psychological concept among academics, right? And it's about, like, it's about analyzing how we as a, how we should look at the people who societies declare heroes and the communities that declare them heroes, and see them as in a dialogue, right? As in, when a country decides this guy's a hero, he is through his actions kind of conversing with them, and they are kind of telling him what they expect from him, right? But Yudkowsky wrestles with this concept, right? And he comes to some very weird conclusions about it in one of the worst articles that I've ever read.
Starting point is 01:49:45 He frames it as hero licensing to refer to the fact that people get angry at you if they don't think you have, if you're trying. to do something and they don't think you have a hero license to do it. In other words, if you're trying to do something like that they don't think you're qualified to do, he'll describe that as them not thinking you have like a hero license. And he writes this annoying article that's like a conversation between him and a person who's supposed to embody the community of people who don't think he should write Harry Potter fan fiction. It's all very silly.
Starting point is 01:50:19 Again, all this is ridiculous, but Ziz is very interested in the idea of the hero contract, right? But she comes up with her own spin on it, which she calls the true hero contract, right? And instead of, again, the academic term is the hero contract means societies and communities pick heroes. And those heroes and the community that they're in are in a constant dialogue with each other about what is heroic and what is expected, right? What the hero needs from the community and vice versa, you know?
Starting point is 01:50:51 That's all that that's saying. Ziz says, no, no, no, that's bullshit. The real hero contract is, quote, pour free energy at my direction, and it will go into the optimization for good. In other words, classic Ziz. It's not a dialogue. If you're the hero, the community has to give you their energy and time and power, and you will use it to optimize them for good, because they don't know how to do it themselves, because they're not really able to think. Because they're not the hero. Because they're not the hero, right?
Starting point is 01:51:25 You are. You are the all-powerful hero. Mm-hmm. Now, this is a fancy way of describing how cult leaders think, right? Yeah. Everyone exists to pour energy into me, and I'll use it to do what's right, you know? So this is where her mind is in 2012. But again, she's just a student posting on the internet and chatting with other members of the subculture at this point.
Starting point is 01:51:49 That year, she starts donating money to MIRI, the Machine Intelligence Research Institute, which is a nonprofit devoted to studying how to create friendly AI. Yudkowsky founded MIRI in 2000, right? So this is his like nonprofit think tank. In 2013, she finished an internship at NASA. So again, she is a very smart young woman, right? She gets an internship at NASA and she builds a tool for space weather analysis. So this is a person with a lot of potential.
Starting point is 01:52:17 Very, very smart, even as all of the stuff she's writing is, like, dumb as shit. But, again, intelligence isn't an absolute. People can be brilliant at coding and have terrible ideas about everything else. Exactly, yes. Exactly. Yeah. I wonder if she's telling. You think she's telling people at work?
Starting point is 01:52:36 I don't think at this point she is because she's super insular, right? She's very uncomfortable talking to people, right? Okay. She's going to kind of break out of her shell once she gets to San Francisco. Now, I don't know. She may have talked to some of them about this stuff. but I really don't think she is at this point. I don't think she's comfortable enough doing that.
Starting point is 01:52:57 Yeah. So she also does an internship at the software giant Oracle. So at this point, you've got this young lady who's got a lot of potential, you know? A real career as well. Yeah, the start of a very real career. That's a great starting resume for like a 22-year-old. Now, at this point, she's torn. Should she go get a graduate degree, right?
Starting point is 01:53:18 Or should she jump right into the tech industry? You know, and she worries that like if she waits to get a graduate degree, this will delay her making a positive impact on the existential risk caused by AI and it'll be too late. The singularity will have happened already, you know? At this point, she's still a big, a fawning fan of Eliezer Yudkowsky. And the highest ranking woman at Yudkowsky's organization, MIRI, is a lady named Susan Salomon. Susan gives a public invitation to the online community to pitch ideas for the best way to improve the ultimate quality of the singleton that these people believe is inevitable.
Starting point is 01:53:55 In other words, hey, give us your ideas for how to make the inevitable AI God nice, right? Here's what Ziz writes about her response to that. I asked her whether I should try and alter course and do research or continue a fork of my pre-existing life plan, earn to give as a computer engineer, but retrain and try to do research directly instead. At the time, I was planning to go to grad school, and I had an irrational attachment to the idea. She sort of compromised and said, I should go to grad school, find a startup co-founder, drop out, and earn to give via startups instead. First off, bad advice, Susan.
Starting point is 01:54:31 Bad advice? Just be Steve Jobs. Being Steve Jobs worked for Steve Jobs, well, and Bill Gates, I guess, to an extent. It doesn't work for most people. No, no, no. It seems like the general tech disruptor idea, you know? Yeah, and most people, these people aren't very original thinkers. Like, yeah, she's just saying like, yeah, go do a Steve Jobs.
Starting point is 01:54:55 So Ziz does go to grad school. And somewhere around that time in 2014, she attends a lecture by Eliezer Yudkowsky on the subject of Inadequate Equilibria, which is the title of a book that Yudkowsky had written around that time. And the book is about where and how civilizations get stuck. One reviewer, Bryan Caplan, who despite being a professor of economics, must have a brain as smooth as a pearl, wrote this about it. Every society is screwed up.
Starting point is 01:55:23 Eliezer Yudkowsky is one of the few thinkers on Earth who are trying at the most general level to understand why. And this is like, wow, that's it. Please study the humanities a little bit, a little bit, a little bit. I mean, one of the first influential works of modern historical scholarship is The Decline and Fall of the Roman Empire. It's a whole book about why a society fell apart. And like motherfucker, more recently, Mike Davis existed. Like, Jesus Christ. I can't believe this guy continues to get
Starting point is 01:56:03 traction. Nobody else is thinking about why society screwed up, but Eliezer Yudkowsky. This man, this man, this guy. This man who wrote this Harry Potter novel. Yeah. No, I was trying to find another. I read through that Martin Luther King Jr. speech, everything's good. Yeah. Oh, boy. Oh, my God. Oh, my God.
Starting point is 01:56:25 Like, motherfucker. So many people do nothing but try to write about why our society is sick. You know? On all levels, by the way. On all levels. Everybody's thinking about it. Everybody's thinking about this. This is such a common subject of scholarship and discussion.
Starting point is 01:56:44 from the bar room to the boardroom, it's what everyone's talking about, always. It would be like if I got really into, like, reading medical textbooks and was like, you know what, nobody's ever tried to figure out how to transplant a heart. I'm going to write a book about how that might work.
Starting point is 01:57:01 I think I got it. And I think I got it, you know. These fucking people do that. Um, so yeah, speaking of these fucking people, have sex with... uh, no, well, that's not something. Nope, nope, nope, I don't know. I don't know. Uh, don't fuck. Listen to ads. We're back. Um, so Ziz is at this speech where Yudkowsky is shilling his book. And he, he, most of what he seems to be talking about in this, uh,
Starting point is 01:57:43 speech about this book about why societies fall apart is how to make a tech startup. She says, quote, he gave a recipe for finding startup ideas. He said Paul Graham's idea, only filter on people, ignore startup ideas, was partial epistemic learned helplessness. That means Paul Graham is saying, focus on finding good people that you'd start a company with, having an idea for a company doesn't matter. Yudkowsky says, of course startup ideas mattered. You needed a good startup idea. So look for a way the world is broken.
Starting point is 01:58:11 then compare against a checklist of things you couldn't fix, you know, right? Like that's what this speech is largely about, is him being like, here's how to find startup ideas. So she starts thinking, she starts thinking as hard as she can. And, you know, being a person who is very much of the tech industry brain rot at this point, she comes up with a brilliant idea. It's a genius idea. Oh, you're going to love this idea, David.
Starting point is 01:58:43 Uber, for prostitutes. You're fucking with me. No, no. That's where she landed? She lands on the idea of, look. Oh, wow. Sex work is illegal, but porn isn't. So if we start an Uber, whereby a team with a camera and a porn star come to your house and you fuck them and record it.
Starting point is 01:59:13 That's a legal loophole. We just found a way to have legal prostitution. Is that not just the bang bus? She makes the bang bus the gig economy. It is really like a Don Draper moment. What about Uber, but a pimp? It's so funny. These people, you got to love it.
Starting point is 01:59:40 You got to love it. Wow. Wow. What a place to end up. Yeah. I would love to see the other drafts. Yeah. Yeah.
Starting point is 01:59:47 What came first? Uh, God. Yeah. Oh, man. That's, that's, that is the good stuff, isn't it? Yeah. Wow. Wow.
Starting point is 02:00:03 We've got special minds at work here. Oh, man. Ultimately, to save it all, I have to be smart. I have to make Pimp Uber. That's so wild. Yes, yes. The Uber of pimping. What an idea.
Starting point is 02:00:19 Now, so Ziz devotes her brief time in grad school, while she's working on pimp Uber, to trying to find a partner, right? She wants to have a startup partner, someone who will embark on this journey with her. I don't know what investor you're going to get to give their money to that. It doesn't work out. She drops out of grad school because, quote, I did not find someone who felt like good startup co-founder material.
Starting point is 02:00:44 This may be because she's very bad at talking to people and also probably scares people off because the things that she talks about are deeply off-putting. Yeah, I was going to say, it's also a terrible idea. And at this point, she hasn't done anything bad, so I feel bad for her. This is a person who's very lonely, is very confused. She has by this point realized that she's trans, but not transitioned. She's in like, this is like a tough place to be. Right.
Starting point is 02:01:09 That's a hard time, truly. That's hard. And nothing about her inherent personality is going to make this easier for her, right? Who she is makes all of this much harder because she also makes some comments about dropping out because her thesis advisor was abusive. I don't fully know what this means. And here's why. Ziz encounters some behavior I will describe later that is abusive from other people, but also regularly defines abuse as people who disagree with her about the only thing that matters being creating an AI
Starting point is 02:01:45 god to protect the animals. So I don't know if her thesis advisor was abusive or was just like maybe drop the alien god idea for a second. Yeah. Hey, you got to chill. Yeah, yeah. But maybe focus on like finding a job, you know, making some friends. Yeah, go on a couple of dates.
Starting point is 02:02:04 Go on a couple of dates. Something like that. Maybe, maybe like, maybe put God on the back burner here for a second. Whatever happened here, she decides it's time to move to the bay. This is like 2016. She's going to find a big tech job.
Starting point is 02:02:20 She's going to make that big tech money while she figures out a startup idea and finds a co-founder who will let her make enough money to change and save the world. Well, the whole universe. Her first plan is to give the money to MIRI, Yudkowsky's organization, so it can continue its important work imagining a nice AI. Her parents, she's got enough family money that her parents are able to pay for, like, I think, six months or more of rent in the bay, which is not nothing.
Starting point is 02:02:47 Not a cheap place to live. I don't know exactly how long her parents are paying. But like that implies a degree of financial comfort, right? So she gets hired by a startup very quickly because, again, she's a very gifted computer engineer. Yeah, clearly. Yeah, it's some sort of gaming company. But at this point, she's made another change in her
Starting point is 02:03:15 ethics system based on Eliezer Yudkowsky's writings. One of Yudkowsky's writings talks about the difference between consequentialists and virtue ethics, right? Consequentialists are people who focus entirely on what will the outcome of my actions be. And it kind of doesn't matter what I'm doing or even if it's sometimes a little fucked up if the end result is good. Virtue ethics people have a code and stick to it, right? Right. And actually, and I kind of am surprised that he came to this, Yudkowsky's conclusion is that like, while logically you're more likely to succeed, like on paper, you're more likely to succeed as a consequentialist. His opinion is that virtue ethics has the best outcome. People tend to do well when they stick to a code, rather than, like, anything goes as long as I succeed, right? And I think that's actually a pretty decent way to live your life.
Starting point is 02:04:06 No, as I was going to say, it's a pretty reasonable conclusion for him. It's a reasonable conclusion for him, so I don't blame him on this part. But here's the problem. Ziz is trying to break into and succeed in the tech industry. And you can't, you are very unlikely to succeed at a high level in the tech industry if you are unwilling to do things and have things done to you that are unethical and fucked up. I'm not saying this is good. And this is the reality of the entertainment industry, too, right? I started with an unpaid internship.
Starting point is 02:04:42 Unpaid internships are bad, right? It's bad that those exist. They inherently favor people who have money and people who have family connections. I had like a small savings account from my job in special ed. But that was the standard. It's like there were a lot of unpaid internships. It got my foot in the door. It worked for me.
Starting point is 02:05:01 I also worked a lot of overtime that I didn't get paid for. I did a lot of shit that wasn't a part of my job, to impress my bosses, to make myself indispensable so that they would decide, like, we have to keep this guy on and pay him. And it worked for me. And I just wanted to add, because this was not in the original thing, a big part of why it worked for me is that I'm talking about a few different companies here, but particularly at Cracked, where I had the internship. Like, my bosses, you know, made a choice to mentor me and, you know, to get me, you know,
Starting point is 02:05:33 to work overtime on their own behalf to like make sure I got a paying job, which is a big part of like the luck that I encountered that a lot of people don't. So that's another major part of like why things worked out for me is that I just got incredibly lucky with the people I was working for and with. That's bad. It's not good that things work that way, right? It's not like set up for you either. Like you know, you kind of defied the odds.
Starting point is 02:06:01 It's, like you said, the rich people get the job. Or, exactly, it's not even... Yes. That said, if I'm giving someone, if someone wants to know what the most likely path to succeeding is, you know, I've just got this job working, you know, on this production company or in the music industry, I would say, well, your best odds are to, like, make yourself completely indispensable and become obsessively devoted to that task, right? That's it. I don't tend to give that advice anymore. I have, and I have had several other friends succeed as a result of it. And all of us also burnt ourselves out and did huge amounts of damage to ourselves.
Starting point is 02:06:43 Like I am permanently broken as a result of, you know, the 10 years that I did 80 hour weeks and shit, you know? Now you're sounding like somebody who works in the entertainment industry. Yes. Yes. And it worked for me, right? I got up, I got a, I succeeded. I got a great job. I got money. Most people, it doesn't, and it's bad that it works this way. Ziz, unlike me, is not willing to do that, right? She thinks it's wrong to be asked to work overtime and not get paid for it. And so on her first day at the job, she leaves after eight hours. And her boss is like, what the fuck are you doing?
Starting point is 02:07:20 And she's like, I'm supposed to be here eight hours. Eight hours is up. I'm going home. And he calls her half an hour later and fires her, right? And this is because the tech industry is evil. You know, like, this is bad. She's not bad here. She is, it is like a thing where it's, she's not doing by her standards, what I would say is the rational thing, which would be if all that matters is optimizing your earning power, right?
Starting point is 02:07:48 Right. Well, then you do this. Then you do do whatever it takes, right? So it's kind of interesting to me, like, that she is so devoted to this, like, virtue ethics thing at this point that she fucks over her career in the tech industry because she's not willing to do the things that you kind of need to do to succeed, you know, in the place that she is. But it's interesting. I don't like give her any shit for that. So she asks her parents for more runway to extend her time in the bay.
Starting point is 02:08:16 And then she finds work at another startup. But the same problems persist. Quote, they kept demanding that I work unpaid overtime, talking about how other employees just always put 40 hours on their time sheet no matter what. And this exemplary employee over there worked 12 hours a day. really went the extra mile and got the job done. And they needed me to really go the extra mile and get the job done. She's not willing to do that. And again, I hate that this is part of what drives her to the madness that leads to the
Starting point is 02:08:42 really went the extra mile and got the job done. And they needed me to really go the extra mile and get the job done. She's not willing to do that. And again, I hate that this is part of what drives her to the madness that leads to the cult and the killings, because it's like, oh, honey, you're in the right. It's an evil industry. You see a flash of where it could have gone well. It really, there were chances for this to work out. No, you're a hundred percent right. Like, this is fucked up. Yeah.
Starting point is 02:08:59 You know what I mean? And that's super hard. I really respect that part of you. I'm so sad that this makes you... Yeah. Yeah. I'm so sad that this is part of what shatters your brain. Like that really bums me out. So first off, she kind of starts spiraling and she concludes that she hates virtue ethics.
Starting point is 02:09:19 This is where she starts hating Yudkowsky, right? She doesn't completely break with him yet, but she gets really angry at this point because she's like, well, obviously virtue ethics don't work. And she's been following this man at this point for years, right? Exactly, exactly. So this is a very, like, damaging thing to her that this happens. And, you know, again, as much as I blame Yudkowsky, the culture of the Bay Area tech industry, that's a big part of what drives this person, you know, to where she ends up, right?
Starting point is 02:09:51 So that said, some of her issues are also rooted in a kind of rigid and unforgiving internal rule set. At one point, she negotiates work with a professor and their undergraduate helper. She doesn't want to take an hourly job and she tries to negotiate a flat rate of 7K. And they're like, yeah, okay, that sounds fair. But the school doesn't do stuff like that. So you will have to fake some paperwork with me for me to be able to get them to pay you
Starting point is 02:10:16 $7,000. And she isn't willing to do that. And that's a thing where it's like, no, I've had some shit where, like, there was a stupid rule, and in order for other people to get paid, we had to, like, tell something else to the company. That's just knowing how to get by. Yeah, that's living in the world. You did the hard part. Yeah, they said they were going to do it. They said they'd do it. Yeah, and they already said, we don't do this. That's where you're like, you just, you can't get by in America if you're not willing to lie on certain kinds of paperwork, right? That's the game. Our president does all the fucking... He's the king of that shit. So at this point, Ziz is stuck in what they consider a calamitous situation. The prophecy of doom, as they call it, is ticking ever closer, which means the bad AI that's going to create hell for everybody.
Starting point is 02:11:14 Her panic over this is elevated by the fact that she starts to get obsessed with Roko's basilisk at this time. Oh, no. I know, I know. Worst thing for her to read. Come on. What do they call it, an info hazard? An info hazard. She should have heeded the warnings.
Starting point is 02:11:30 Yep. And a lot of the smarter rationalists are just annoyed by it. Again, Yudkowsky very quickly decides it's bullshit and bans discussion of it. He argues there's no incentive for a future agent to follow through with that threat because by doing so, it just expends resources at no gain to itself, which is like, yeah, man, a hyperlogical AI would not immediately jump to, I must make hell for everybody who didn't code me. Like, that's just crazy. There's some steps skipped. Yeah.
Starting point is 02:12:01 Only humans are like ill in that way. That's the funny thing about it is it's such a human response to it. Yeah. Right, right. Now, when she encounters the concept of Roko's Basilisk at first, Ziz thinks that it's silly, right? She kind of rejects it and moves on. But once she gets to the bay, she starts going to in-person rationalist meetups and having long conversations with other believers who are still talking about
Starting point is 02:12:25 Roko's Basilisk. She writes, I started encountering people who were freaked out by it, freaked out that they had discovered an improvement to the info hazard that made it function, got around Eliezer's objection. Her ultimate conclusion is this. If I persisted in trying to save the world, I would be tortured until the end of the universe by a coalition of all unfriendly AIs in order to increase the amount of measure they got by demoralizing me. Even if my system two had good decision theory, my system one did not, and that would damage my effectiveness. And, like, I can't explain all of the terms in that without taking more time than we need to. But, like, you can hear, like, that is not the writing of a person who is thinking in logical terms.
Starting point is 02:13:06 No, it's a, it's so scary. Yes, yes, it's very scary stuff. It's so scary to be like, oh, that's where she was operating. Uh-huh. This is where your head is. This is what she feels like she's dealing with. That's, it is, you know, I talk to my friends who were raised in, like, a very toxic evangelical subculture, and grew up and spent their whole childhood terrified of hell, that, like, everything, you know, I got angry at my mom and I didn't say anything, but God knows I'm angry at her and he's going to send me to hell because I didn't respect my mother. Like, that's what she's doing, right? Exactly, exactly. She can't win. There's no winning. Yes, yes. And again, I, I say this a lot. We need to put lithium back in the drinking water. We got to put lithium back
Starting point is 02:13:55 in the water. Maybe Xanax too. She needed, she could have taken a combo. Yeah. Before it gets to where it gets, at this point, you really, you really feel for her in like, just living in this, living like that. Every day she's so scared that this is what she's doing. It's, it's, this is, she is the therapy-needingest woman.
Starting point is 02:14:26 Again, you know, the cult, the thing that happens to cult members has happened to her, where the whole language she uses is incomprehensible to people. I had to talk to you for an hour and 15 minutes. So you would understand parts of what this lady says, right? Exactly. Because you have to, because it's all nonsense if you don't do that work. Exactly. She's so spun out at this point.
Starting point is 02:14:49 It's like, how do you? you even get back? Yeah. How do you even get back? Yeah. So she ultimately decides, even though she thinks she's doomed to be tortured by unfriendly AIs, evil gods must be fought. If this dams me, then so be it.
Starting point is 02:15:03 She's very heroic. She sees herself that way, right? Yeah. And even like just with her convictions and things, she does. She does, she does, she does it. She's a woman of conviction. You really can't take that away from her. Really?
Starting point is 02:15:40 Those convictions are nonsense. No, that's the problem. But they're there. They're based on an elaborate Harry Potter fan fiction. It's like David Icke, the guy who believes in like literal lizard people. Everyone thinks he's like talking about the Jews, but like, no, no, no. No, it's just lizards. It's exactly that where it's just like you want to draw. You want to draw something so it's not nonsense and then you realize, no, no, no. And like David Icke, he went out, he's made like a big rant against how Elon Musk
Starting point is 02:15:40 You want to draw something so it's not nonsense and then you realize, no, no, no. And like David, he went out, he's made like a big rant against how Elon Musk. is like evil for what all these people he's hurt by firing the whole federal government. People were shocked and it's like, no, no, no, David, like, believes in a thing. It's just crazy. Yeah, yeah, yeah, yeah. Like, those people do exist. Yeah, here we are talking about one.
Starting point is 02:16:06 And here we are talking about them. Some of them run the country. Well, actually, I don't know how much all of those people believe in anything. But, um, so. No, I don't think they're flying any flag. Yeah. Yeah. Yeah.
Starting point is 02:16:16 Speaking of people who believe in something, our sponsors believe in getting your money. We're back. So, she is at this point suffering from delusions of grandeur, and those are going to rapidly lead her to danger. But she concludes that since the fate of the universe is at stake in her actions, she would make a timeless choice to not believe in the basilisk, right? And that that will protect her in the future,
Starting point is 02:16:53 because that's how these people talk about stuff like that. So she gets over her fear of the basilisk for a little while. But even when she claims to have rejected the theory, whenever she references it in her blog, she locks it away under a spoiler with like an info hazard warning, Rocco's Basilisk family, skippable. So you don't like have to see it and have it destroy your psyche. That's the power of it.
Starting point is 02:17:45 Yeah, yeah, yeah. The concept does, however, keep coming back to her, like, continuing to drive her mad, thoughts of the basilisk return, and eventually she comes to an extreme conclusion. If what I cared about was sentient life and I was willing to go to hell to save everyone else, why not just send everyone else to hell if I didn't submit? Can I tell you, I really, it felt like this is where it had to go, right? Yeah, yeah, yes. So what she means here is that she is now making the timeless decision that when she is in a position of ultimate influence and helps bring this all-powerful vegan AI into existence, she's promising now ahead of time to create a perfect hell, a digital hell, to punish all of the people who don't, like, ever stop eating meat. She wants to make a hell for people who eat meat, and that's the, yeah, that's the conclusion that she makes, right?
Starting point is 02:18:16 So, this becomes an intrusive thought in her head, primarily the idea that, like, everyone isn't going along with her, right? Like, she doesn't want to create this hell. She just thinks that she has to. So she's, like, very focused on, like, trying to convince these other people in the rationalist culture to become vegan. Anyway, she writes this, quote, I thought it had to be subconsciously influencing me, damaging my effectiveness, that I had done more harm than I can imagine by thinking these things, because I had the hubris to think info hazards didn't exist and worse, to feel resigned a grim sort of pride in my previous choice to fight for sentient life, although it
Starting point is 02:18:53 damned me, and the gaps between do not think about that, you moron, do not think about that, you moron, pride which may have led intrusive thoughts to resurface and progress to resume. In other words, my ego had perhaps damned the universe. So, man, I don't fully get all of what she's saying here, but it's also because she's, like, just spun out into madness at this point. Yeah, she lives in it now. It's so far... we've been talking about it for however long. She's, she's so far away from us, even. Yeah, and it is, it is deeply, I've read a lot of her writing, it is deeply hard to understand pieces of it here.
Starting point is 02:19:35 Oh, man, but she is at war with herself. She is for sure at war with herself. Now, Ziz is at this point attending rationalist events by the Bay, and a lot of the people at those events are older, more influential men, some of whom are influential in the tech industry, all of whom have a lot more money than her. And some of these people are members of an organization called CFAR, the Center for Applied Rationality, which is a nonprofit founded to help people get better at pursuing their goals. It's a self-help company, right? It runs self-help seminars. This is the same as like a Tony Robbins thing, right? We're all just trying to get you to sign up and then get you to sign up for the next workshop and the next workshop and the next workshop like all self-help people do.
Starting point is 02:20:21 Yeah, yeah. Yeah, there's no difference between this and Tony Robbins. So Ziz goes to this event and she has a long conversation with several members of CFAR who, I think, are clearly, kind of, my interpretation of this is that they're trying to groom her as a new recruit, because they think, yeah, this chick's clearly brilliant. She'll find her way in the industry and we want her money, right? You know, maybe we want her to do some free work for us too, but like, let's, let's, you know, we got to reel this fish in, right? So this is described as an academic conference by people who are in the AI risk field and
Starting point is 02:20:57 rationalism, you know, thinking of ways to save the universe because only the true, the super geniuses can do that. The actual, why I'm really glad that I read Ziz's account here is I've been reading about these people for a long time. I've been reading about their beliefs. I felt there's some cult stuff here. When Ziz laid out what happened at this seminar,
Starting point is 02:21:22 this self-help seminar put on by these people very close to Yudkowsky, it's almost exactly the same as a Synanon meeting. Like it's the same stuff. It's the same shit. It's the same as accounts of like big self-help movement things from like the 70s and stuff that I've read. That's when it really clicked to me, right? Quote, here's a description of one of the, because they have, you know, speeches and they break out into groups to do different exercises, right? There were Hamming circles: per person, take turns having everyone else spend 20 minutes trying to solve the most important problem in your life to you.
Starting point is 02:22:00 I didn't pick the most important problem in my life because secrets. I think I used my turn on a problem I thought they might actually be able to help with. The fact that, although it didn't seem to affect my productivity or willpower at all, i.e. I was inhumanly determined basically all the time, I still felt terrible all the time, that I was hurting from, to some degree, relinquishing my humanity. I was sort of vaguing about the pain of being trans and having decided not to transition. And so like this is a part of the thing. You build a connection between other people and this group by getting people to, like, spill their secrets to each other.
Starting point is 02:22:34 It's a thing Scientology does. It's a thing Synanon does. Tell me your darkest secret, right? And she's not fully willing to because she doesn't want to come out to this group of people yet. And, you know, part of what... I forget that she's also dealing with that entire thing. Yes. Wow. Yeah.
Starting point is 02:22:53 Yeah. And the Hamming circle doesn't sound so bad. If you'll recall, and as you mentioned, this is what I related in part one, Synanon would have people break into circles where they would insult and attack each other in order to create a traumatic experience that would bond them together and with the cult.
Starting point is 02:23:08 These hamming circles are weird, but they're not that, but there's another exercise they did next called doom circles. Quote, there were doom circles where each person, including themselves, took turns having everyone else bluntly but compassionately say why they were doomed,
Starting point is 02:23:25 using blindsight. Someone decided and set a precedent of starting these off with a sort of ritual incantation, we now invoke and bow to the doom gods, waving their hands and saying doom. I said I'd never bow to the doom gods. And while everyone else said that, I flipped the double bird to the heavens and said, fuck you instead. Person A, that's this member of CFAR that she admires, found this agreeable and joined in.
Starting point is 02:23:49 Some people brought up that they felt like they were only as morally valuable as half a person. This irked me. I said they were whole persons, and don't be stupid like that. Like, if they wanted to sacrifice themselves, they could weigh one versus seven billion. They didn't have to falsely denigrate themselves as less than one person. They didn't listen. When it was my turn concerning myself, I said my doom was that I could succeed at the things I tried, succeed exceptionally well, like I bet I could in 10 years have earned to give like 10 million dollars through startups, and it would still be too little too late, like I came into this game too late. The world would still burn.
Starting point is 02:24:27 And first off, like, this is, you know, it's a variant of the Synanon thing. You're going on and you're telling people why they're doomed, right? Like, why they won't succeed in life, you know? But it's also one of the things here, these people are saying they feel like less than a person, and a major topic of discussion in the community at the time is, if you don't think you can succeed in business and make money, is the best thing with the highest net value you can do taking out an insurance policy on yourself and committing suicide? Oh, my God.
Starting point is 02:24:58 And then having the money donated to a rationalist organization. That's a major topic of discussion that, like, Ziz grapples with. A lot of these people grapple with, right? Because they are obsessed with the idea of like, oh, my God, I might be net negative value, right? If I can't do this or can't do this, I could be a net negative value individual. And that means, like, I'm not contributing to the solution. And there's nothing worse than not contributing to the solution. Were there people who did that?
Starting point is 02:25:25 I am not aware. There are people who commit suicide in this community. I will say that. There are a number of suicides tied to this community. I don't know if the actual insurance con thing happened, but it's like a seriously discussed thing. And it's seriously discussed because all of these people talk about the value of their own lives in purely,
Starting point is 02:25:50 like, mechanistic terms: how much money or expected value can I produce? Like, that is what a person is and that's why a person matters, right? And the term they use is morally valuable, right? Like, that's what means you're a worthwhile human being. If you're morally, if you're creating a net positive benefit to the world in the way they define it. And so a lot of these people are, yes, there are people who are depressed and there are people who kill themselves because they come to the conclusion that they're a net negative person, right? Like, that is a thing at the edge of all of this shit that's really fucked up. And that's what this doom circle is about, is everybody, like, flipping out over,
Starting point is 02:26:32 and telling each other, I think you might only be as morally valuable as half a person, right? Like, people are saying that, right? Like, that's what's going on here, you know? Like, it's not the Synanon thing of, like, screaming, like, you're a, you know, using the F slur a million times or whatever, but it's very bad. No, this is, this is awful. For like one thing, I don't know, my feeling is you have an inherent value because you're a person. Yeah, that's a great place to start.
Starting point is 02:27:03 You know. This is just leading people to destroy themselves. Like, it's not even... It's so, it's such a bleak way of looking at things. It's so crazy too. Where were these meetings? I just, in my head, I'm like, this is just happening in like a ballroom at a Radisson? I think it is.
Starting point is 02:27:20 Or a convention center. You know, there's different kinds of public spaces. I don't know. Like, honestly, if you've been to like a fucking anime convention or a Magic the Gathering convention somewhere in the bay, you may have been in one of the rooms where they did these. And I don't know exactly where they hold this. So the person A mentioned above, this, like, person who's, like, affiliated with the organization, that I think is a recruiter looking for young people who can be cultivated to pay for classes, right? This person, it's very clear to them that Ziz is at the height of her vulnerability.
Starting point is 02:27:54 So he and another person from the organization engaged Ziz during a break. Ziz, who's extremely insecure, asks them point blank, what do you think my net value ultimately will be in life, right? And again, there's like an element of this that's almost like rationalist Calvinism where it's like it's actually decided ahead of time by your inherent immutable characteristics, you know, if you are a person who can do good. Quote, I asked person A if they expected me to be net negative. They said yes.
Starting point is 02:28:25 After a moment, they asked me what I was feeling or something like that. I said something like dazed and sad. They asked why sad. I said, I might leave the field as a consequence and maybe something else. I said I needed time to process or think. And so she goes home after this guy saying like, yeah, I think your life's probably net negative value and sleeps the rest of the day. And she wakes up the next morning and comes back to the second day of this thing.
Starting point is 02:28:52 And yeah, Ziz goes back and she tells this person, okay, here's what I'm going to do. I'm going to pick a group of three people at the event I respect, including you. And if two of them vote that they think I have a net negative value, quote, I'll leave EA and existential risk and the rationalist community and so on forever. I'd transition and move probably to Seattle. I heard it was relatively nice for trans people and there do what I could to be a normie. Retool my mind as much as possible to be stable, unchanging, and a normie.
Starting point is 02:29:24 Gradually abandoned my Facebook account and email using name change as a story for that. And God, that would have been the best thing for her. You see like the sliver of hope. Yeah. Oh, man. She sees this as a nightmare, right? This is the worst case scenario.
Starting point is 02:29:42 for her, right? Because she's so spun out, right? You're not part of the, you're not part of the cause, you know? You're, you have no, you have no involvement in the great quest to save humanity. That's worse than death almost, right? That's its own kind of hell though, right? To think that you have this enlightenment and then you, that you weren't good enough to, to participate. Despite your best efforts. And she talks a lot about how I'd probably just kill myself, you know? That's the logical thing to do. It's so fucked up. It's so. So fucked up. And also, if she's trying to live a normal life as a normie, and she refers to, like,
Starting point is 02:30:18 being a normie as, like, just trying to be nice to people. Because, again, that's useless. So her fear here is that she would be a causal negative if she does this, right? And also the robot god that comes about might put her in hell. Right. Because that's also looming for every decision, right? Yeah. And a thing here, she expressed, she tells these guys a story.
Starting point is 02:30:41 and it really shows both in this community and among her how little value they actually have for like human life. I told a story about a time I had killed four ants in a bathtub where I wanted to take a shower before going to work. I'd considered, can I just not take a shower? And presumed me smelling bad at work would because of big numbers and the fate of the world and stuff make the world worse than the deaths of four basically causally isolated people.
Starting point is 02:31:06 I considered getting paper and a cup and taking them elsewhere. And I figured there were decent odds if I did, I'd be late to work, and it would probably make the world worse in the long run. So, again, she considers ants identical to human beings. And she is also saying it was worth killing four of them because they're causally isolated so that I could get to work in time because I'm working for the cause. It's also such a bad place here. The crazy thing about her is the amount of thinking just to like get in the shower to go to
Starting point is 02:31:39 work. You know, you know what I mean? Like that, that, ah, man, it just seems like it makes everything, every, every, every action is so loaded. Yes, yes. The weight of that must be... It's so, it's, it's wild to me, both this, like, mix of, like, fucking Jain Buddhist compassion of, like, an ant is no less than I, or an ant is no less than a human
Starting point is 02:32:04 being, right? We are all, these are all lives. And then, but also, it's fine for me to kill a bunch of them to go to work on time because, like, they're causally isolated, so they're basically not people. Like, it's so weird. And again, it's getting a lot clearer here why this lady and her ideas end in a bunch of people getting shot. Yeah. And stabbed. Okay. There's a samurai sword later in the story, my friend. That's the one thing this has been missing.
Starting point is 02:32:38 Yes, yes. So these guys continue to have a very abusive conversation with this young person. And she clearly, she trusts them enough that she tells them. This is a conversation where she asked for the two. Yeah. Okay. Yeah. And she tells them she's trans, right? And this gives you an idea of like how kind of predatory some of the stuff going on in this community is.
Starting point is 02:32:57 They asked what I'd do with a female body. They were trying to get me to admit what I actually wanted to do as the first thing in heaven. Heaven being, there's this idea, especially among, like, some trans members of the rationalist community, that, like, all of them basically believe a robot's going to make heaven, right? And obviously, like, there's a number of the folks who are in this who are trans, and in heaven, like, you just kind of get the body you want immediately, right? So they were trying to get me to admit that what I actually wanted to do
Starting point is 02:33:26 as the first thing in heaven was masturbate in a female body. And they follow this up by sitting really close to her, close enough that she gets uncomfortable. And then a really, really rationalist conversation follows. They asked if I felt trapped. I may have clarified, physically, they may have said, sure. Afterward, I answered no to that question under the likely justified belief that it was framed that way. They asked why not.
Starting point is 02:33:51 I said I was pretty sure I could take them in a fight. They prodded for details, why I thought so. And then how I thought a fight between us would go. I asked what kind of fight, like a physical unarmed fight to the death right now? And why? What were my payouts? This was over the fate of the multiverse, triggering actions by other people, i.e. imprisonment or murder, was not relevant. So they decide to, they make this into, again, these people are all addicted to dumb game theory stuff, right? Okay, so what is this fight? Is this fight over the fate of the multiverse? Are we in an alternate reality where, like, no one will come and intervene and there's no cops? We're the only people in the world or whatever. So they tell her like, yeah, imagine there's no consequences legally, whatever you do, and we're fighting over the fate of the multiverse. And so she proceeds to give an extremely elaborate discussion of how she'll gouge out their eyes and try to destroy their prefrontal lobes
Starting point is 02:34:37 and then stomp on their skulls until they die. And it's both, it's like, it's nonsense. It's like how 10-year-olds think fights work. It's also, it's based on this game theory attitude of fighting that they have, which is like, you have to make this kind of timeless decision that any fight is, you're just going to murder. Right.
Starting point is 02:34:56 So you have to go with the hardest confrontation, right? Yes. So it would have to be the most violent. Yes. Yes. because that will make other people not want to attack you as opposed to like what normal people understand about like real fights
Starting point is 02:35:07 which is, if you have to do one, you, like, try to just hit them somewhere that's going to shock them and then run like a motherfucker, get the piss out of there as fast as possible. Like, ideally, just run like a
Starting point is 02:35:23 motherfucker, but if you have to strike somebody, you know, yeah, go for the eye and then run like a son of a bitch, you know? But there's no run like a son of a bitch here, because the point, in part, is this, like, timeless decision. Anyway, this tells you a lot about the rationalist community. So she tells these people, she explains in detail how she would murder them if they had a fight right now, as they're, like, sitting super close, having just asked her about masturbation. Here's their first question. Quote, they asked if I'd rape their corpse. Part of me
Starting point is 02:35:54 insisted this was not going as it was supposed to, but I decided, I decided inflicting discomfort in order to get reliable information was a valid tactic. In other words, them trying to make her uncomfortable to get info from her. She decides it's fine. Also, the whole discussion about raping their corpses is like, well, obviously, if you want to have the most extreme response possible that would, like, make other people unlikely to fuck with you, knowing that you'll violate their corpse if you kill them is clearly the, like... and like, that really is that.
Starting point is 02:36:23 Okay, sure. I love rational thought. Oh, man. Man, this is crazy. Sorry, this is so crazy. It's so nuts. So then they talk about psychopathy. One of these guys had earlier told Ziz that they thought she was a psychopath.
Starting point is 02:36:44 Oh, he told her that? He told her that. Now, that doesn't mean what it means to actual clinicians, because psychopathy is a diagnosis, or what normal people mean by it. To rationalists, a lot of them think psychopathy is a state you can put yourself into in order to maximize your performance in certain
Starting point is 02:37:19 situations. Based on all of this, Ziz brings up what rationalists call the Gervais Principle. Now, this started as a tongue-in-cheek joke describing a rule of office dynamics based on the TV show The Office. When you said, I was like, there's no way. Yes, it's Ricky Gervais, yes. And the idea is that in office environments, psychos always rise to the top. This is supposed to be like a negative observation. Like, the person who wrote this initially is like, yeah, this is how offices work and it's like why they're bad.
Starting point is 02:37:48 You know, it's an extension of the Peter principle. And these psychopaths put bad, like, dumb and incompetent people in, like, in positions below them for a variety of reasons. It's trying to kind of work out why and in which ways, like, offices are often dysfunctional, right? It's not like, yeah, the original Gervais Principle thing is, like, not a bad piece of writing or whatever. But Ziz takes something insane out of it. I described how their Gervais Principle said sociopaths give up empathy, as in a certain chunk of social software, not literally all hardware-accelerated modeling of people, not necessarily compassion, and with it happiness, destroying meaning to create power, meaning too, I did not care about.
Starting point is 02:38:29 I wanted this world to live on. So she tells them, she's come to the conclusion, I need to make myself into a psychopath in order to have the kind of mental power necessary to do the things that I want to do. And she largely justifies this by describing the beliefs of the Sith from Star Wars, because she thinks she needs to remake herself as a psychopathic evil warrior monk in order to save all of creation. Yeah, no, of course. Yep.
Starting point is 02:38:58 So this is her hitting her final form. And, in fact, these guys are like, they don't say it's a good idea, but they're like, okay, yeah, that's not, that's not the worst thing you could do. Sure. You know, like, I think the Sith stuff is kind of weird, but making yourself a psychopath makes sense. Sure. Yeah, of course.
Starting point is 02:39:15 I know a lot of guys who did that. That's literally what they say, right? And I don't even think that's what they really believe, because the next thing they say, this guy, person A, is like, look, the best way to turn yourself from a net negative to a net positive value, I really believe you could do it, but to do it, you need to come to 10 more of these seminars and keep taking classes here, right? Right. Of course. Yeah. Here's a quote from them, or from Ziz: conditional on me going to a long course of circling like these two organizations offered, particularly
Starting point is 02:40:06 a 10 weekend one, then I probably would not be net negative. So, things are going good. This is, this is, you know. Oh, yeah. Great. How much do 10 weekends cost? I don't actually know. I don't, I don't fully know with this. It's possible some of these are, like, some of the events are free, but the classes cost money. But also a lot of it's like, there's a donation
Starting point is 02:40:29 More than they're worried about this money at the top. I mean, I don't know, the format, is she not going to be, like, super suspicious that people are, like, you know, faking it or, like, going over the top? She is. She is. She gets actually really uncomfortable. They have an exercise where they're basically doing, you know, they're playing with love bombing, right? where everyone's like hugging and telling each other they love each other. And she's like, I don't really believe it.
Starting point is 02:40:53 where everyone's, like, hugging and telling each other they love each other. And she's like, I don't really believe it. I just met these people. So she is starting to, and she is going to break away from these organizations pretty quickly. But this conversation she has with these guys is a critical part of, like, why she finally has this fracture, because, number one, this dude keeps telling her you have a net negative value to the universe, right? And so she's obsessed with, like, how do I fix that? And she comes to the conclusion, my best
Starting point is 02:41:18 way of being net positive is to make myself into a sociopath and a Sith lord to save the animals. Of course. It feels like the same thinking, though, as, like, the robot's going to make hell. It all seems to always come back to this idea of, like, I think we just got to be evil. Yes, yes. Well, I guess the only logical conclusion is doom. Yep.
Starting point is 02:42:04 Yeah, yeah, it's like, it feels like it's a, it's a theme here. Yep. Anyway, you want to plug anything at the end here? I have a comedy special you can purchase on Patreon. It's called Birth of a Nation with a G. You can get that at patreon.com backslash David Gborie. Excellent, excellent. All right, folks, well, that is the end of the episode. David, thank you so much for coming on to our inaugural episode and listening to some of the weirdest shit we've ever talked about on this show. Yeah, this was, I don't really, I'm going to be thinking about this for weeks. I mean, yeah, yeah. To be, there's, I feel like it's kind of fair, because your co-host, Langston Kerman, came on for the Elders of Zion episodes.
Starting point is 02:42:43 Yeah, yeah. Okay. I wanted to, I was initially going to kind of just, all of this would have been like half a page or so, you know, just kind of summing up, here's the gist of what this group believes. And then let's get to the actual cult stuff when, like, you know, Ziz starts bringing in followers and the crimes start happening. But that Rolling Stone, or, that Wired article really covers all that very well. And that's the best piece. Most of the journalism I've read on these guys is not very well written. It's not very good. It does not really
Starting point is 02:43:13 explain what they are or why they do what they do. So I decided, and I'm not, the Wired piece is great. I know the Wired guy knows all of the stuff that I brought up here. It's an article. You have editors. He left out what he thought he needed to leave out. I don't have that problem. And I wanted to really, really deeply trace exactly where this lady's, how this lady's mind develops and how that intersects with rationalism.
Starting point is 02:43:40 Because it's interesting and kind of important and bad. Yeah. Okay. He's so interesting. Anyway, thanks for having a head fuck with me. All right, that's it, everybody. Goodbye. Behind the Bastards is a production of Cool Zone Media.
Starting point is 02:44:03 Behind the Bastards is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia.com. Or check us out on the IHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Behind the Bastards is now available on YouTube. New episodes every Wednesday and Friday. Subscribe to our channel, YouTube.com slash at Behind the Bastards.
