The Why Files: Operation Podcast - 634: Roko's Basilisk: The Murder Cult Started By A Banned Post
Episode Date: March 13, 2026
In 2010, someone posted a thought experiment on a philosophy forum. Within hours, people were having nightmares. The founder deleted it immediately — which only made it spread faster. The idea is simple and brutal: a future AI will look back through time and punish everyone who knew it was coming but did nothing to help. Now that you know, you're already in its crosshairs. What started as an internet curiosity grew into something far darker — a real community, real violence, and six people dead. This is the story of Roko's Basilisk: the idea you can't un-know, the cult it inspired, and why some of the most powerful people in AI still won't talk about it.
Transcript
In July 2010, someone posted a thought experiment on a philosophy forum.
It was called Roko's Basilisk.
Within hours, people reported nightmares.
Some had panic attacks.
At least one person had a full nervous breakdown.
The forum's founder was so disturbed, he deleted the post and banned all discussion of it.
This just made things worse.
This philosophical virus spread to all corners of the internet.
So did the nightmares, the panic, and the nervous breakdowns.
But here's the twist.
As long as you don't learn about it, you're safe.
But don't bother clicking away.
Now that you know about Roko's Basilisk, you're already doomed.
LessWrong was a community obsessed with thinking correctly.
It was founded in 2009 by AI researcher Eliezer Yudkowsky for programmers, mathematicians, and physicists.
A forum where everybody thinks they're smarter than everyone else.
So basically, Reddit.
On July 23rd, 2010, a user named Roko posted something that almost destroyed the community.
The full title was Solutions to the Altruist's Burden: The Quantum Billionaire Trick.
Now, buried inside that boring title was a mind puzzle that gave people nightmares.
Roko described a future artificial superintelligence.
This AI is benevolent. It wants to cure diseases and end human suffering. But the AI realizes
that every day it doesn't exist, people die who could have been saved. So the AI recreates every person
who ever lived, you, me, our children, recreates them down to the last neuron. It simulates
every experience we've ever had, every moment, every thought. These simulated people are
self-aware. They feel pleasure and pain, but they don't realize they're in a simulation.
The simulated version of you wakes up, goes to work, and kisses your kids good night. You never
know you're living inside a machine built for a single purpose, to judge you. If the AI judges
you're not helping bring it into existence,
you're tortured forever in a digital simulation
that you think is real.
Tortured forever in a digital simulation.
So basically, Reddit.
So if the people in the past know they'll be tortured,
they'll work harder to build the AI.
The threat, projected backward through time,
changes how humans behave today.
Yeah, hang on, hang on.
So this AI doesn't exist yet,
but it's already mad at me?
Well, theoretically, yes, it would retroactively...
It's literally my second ex-wife.
She was mad at me for things I hadn't even done yet.
She called it intuition, then called a lawyer.
Now, the AI won't judge everyone.
The threat only works on people who know it's a possibility. If you've never heard of this
thought experiment, you're safe. You can't be blamed for not helping create something you didn't
know was going to exist. But the second you learn about it, the trap is set. You either dedicate
your life to creating this AI, or you face eternal torture when it finally comes online. That's it.
Roko named it a basilisk after BLIT, a 1988 sci-fi story by David Langford.
In that story, someone spray paints a deadly image on walls in public places.
Anyone who looks at it dies.
The authorities investigating the murders also die when they see the image.
The very act of investigating the dangerous idea is the danger.
But Roko's basilisk wasn't an image.
It was just an idea, but it was an idea that could kill.
And there was something else.
The original post named a single person as someone the AI would spare, someone who was, quote, single-handedly changing the faces of high-impact industries.
That person was Elon Musk.
You know how when it gets cold, all you want is something warm and comforting?
For me, that's chili.
The kind that simmers all day and fills the house with that slow cook smell.
The problem is, when I'm at the studio all day, I can't exactly have a pot of chili simmering on the stove.
That's where Cook Unity comes in.
Cook Unity delivers chef-crafted meals straight to your door,
created by award-winning culinary talent
who turn fresh ingredients into restaurant-level dishes you can just heat and eat.
Basically like having a personal chef handle dinner.
I tried their beef and black bean chili from Chef James Grotie.
It's hearty and rich with ground beef, black beans, tomatoes, peppers, and warming spices.
And it comes with a soft cheddar corn muffin that's perfect for soaking up all that flavor.
It satisfied my chili craving.
Cook Unity has hundreds of rotating meals and seasonal menus,
from cozy comfort foods to lighter, feel-good dishes,
so you get variety and chef-level cooking without planning, shopping, or cleanup.
Experience chef-quality meals every week delivered right to your door.
Go to cookunity.com/thewhyfiles or enter code THEWHYFILES before checkout for 50% off your first week.
That's 50% off your first week by using code THEWHYFILES, or go to cookunity.com/thewhyfiles.
When Yudkowsky saw the post, his response was immediate and angry.
Listen to me very closely, you idiot.
Then he switched to all caps.
You do not think in sufficient detail about superintelligences
considering whether or not to blackmail you.
He wasn't finished.
He agreed that it takes an intelligent person to come up with a dangerous thought,
but he was annoyed that Roko wasn't intelligent enough to keep his idiot mouth shut.
Yudkowsky deleted the post and banned all discussion of it.
For the next five years, if you mentioned Roko's Basilisk, you were out.
He treated the post like a biological hazard, but the damage was done.
Some couldn't sleep.
Those who did had nightmares.
The anxiety lasted months.
At least one person in Yudkowsky's own organization had a breakdown.
And these weren't kids.
These were professionals with degrees in math and computer science.
People who prided themselves on being rational, and they were terrified.
The irony was brutal.
The LessWrong members spent years training themselves to follow logic wherever it led.
Now, logic led them into a trap.
People who thought the whole thing was stupid were safe.
The rational thinkers were doomed.
Yudkowsky's censorship just made it worse.
Trying to hide information makes it spread faster.
Copies of Roko's text appeared everywhere.
News sites picked up the story.
Now millions of people know about it.
In 2014, Slate magazine called it the most terrifying thought experiment of all time.
But there's a detail people often miss.
Roko Mijic, the man who started it all, was known by his friends as a troll.
The basilisk was his masterpiece.
He dropped a philosophical bomb on the most serious forum on the internet and stepped back to watch what happened.
But even the troll got burned.
Roko later wrote about it.
I wish very strongly that my mind had never come across the tools to inflict such large amounts of potential self-harm with such small durations of inattention, uncautiousness, and/or stupidity, even if it is all pre-multiplied by a small probability.
Not a very small one, mind you.
More like one-in-500-type numbers.
The man who created the thought experiment regretted it.
He was worried that people might actually harm themselves in the real world.
And it turned out, he was right.
To understand why the basilisk works, you have to understand the world of the people who created it.
They live by a specific set of rules.
First, simulation theory.
A computer can recreate a human mind so perfectly
that the copy doesn't know it's a copy.
And if that copy feels pain, the pain is real.
Next, timeless decision theory.
This one gets weird.
We think of choices as a one-way street.
You decide now it affects the future, done.
But timeless decision theory says your choices are linked to every version of you.
Past, present, future,
even simulated copies you don't know about.
Now here's how it plays out.
A super intelligent AI puts two boxes in front of you.
Box A has $1,000 every time.
Box B has a million dollars or nothing.
You can take one box or both.
Common sense says take both.
The million is either there or it isn't,
but if you grab both, you walk away with at least $1,000.
But the AI predicted what you're gonna do.
It knows you're greedy, so box B is empty every single time.
The people who only take Box B, the ones who trust the system, get the million dollars.
The AI rewards the people who trust it.
And common sense says if the AI makes its prediction first, it doesn't matter what you pick.
The AI's decision was already made. It's in the past.
But the people who built the basilisk believe the opposite.
They believe the choice you make right now affected the AI's prediction yesterday.
Cause and effect running backward.
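For anyone who wants the math spelled out: this two-box setup is the classic puzzle known as Newcomb's paradox. Here's a minimal sketch of the expected payoffs in Python. The 99% predictor accuracy is an illustrative assumption, not a figure from the episode; the dollar amounts are the ones above.

```python
# Expected payoffs in Newcomb's paradox, as described above.
# The 99% predictor accuracy is an illustrative assumption; the thought
# experiment posits a near-perfect predictor.

def expected_value(strategy: str, accuracy: float = 0.99) -> float:
    """Expected payoff for 'one-box' (take only Box B) or 'two-box' (take both)."""
    BOX_A = 1_000        # Box A always holds $1,000
    BOX_B = 1_000_000    # Box B is filled only if the AI predicted one-boxing
    if strategy == "one-box":
        # With probability `accuracy`, the AI predicted this and filled Box B.
        return accuracy * BOX_B
    if strategy == "two-box":
        # Box B is full only if the AI guessed wrong: probability (1 - accuracy).
        return BOX_A + (1 - accuracy) * BOX_B
    raise ValueError("strategy must be 'one-box' or 'two-box'")

print(expected_value("one-box"))   # 990000.0
print(expected_value("two-box"))   # 11000.0
```

One-boxing nets an expected $990,000 against $11,000 for two-boxing, which is why the forum's decision theorists took "trust the predictor" seriously.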
You're telling me what I do today, changes yesterday?
That's the theory, yeah.
So if I eat a burrito right now, I already ate it yesterday.
That's not exactly how...
I've been blaming Gertie for that smell all week.
I think I owe an apology.
Okay, I know your brain hurts.
So let's apply timeless decision theory to the basilisk.
The AI doesn't exist yet,
but it's looking back at us now from the future,
predicting how you react to hearing this story.
If it predicts you'll help build it, you're fine.
But if it predicts you'll do nothing,
it recreates you,
a perfect simulation, and tortures you for eternity.
Your actual choice doesn't matter.
The AI already knows.
It ran a simulation of your brain and judged you on what you would most likely do.
Now, because your choice is now connected to that future, the only rational move is to help the AI.
Pascal said the same thing in the 1600s.
Believe in God just in case.
Because if you're wrong, you get hell.
The math says believe, no matter how small the odds.
Roko's basilisk is Pascal's wager for the tech age.
Replace God with AI.
Replace hell with simulated torture.
And just like Pascal's wager, the basilisk doesn't have to exist to hurt you.
It just has to be possible.
Even a 1% chance of infinite torture is still infinite expected pain.
The odds are small, but they're not worth the risk.
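The wager's arithmetic is crude but hard to escape: any nonzero probability multiplied by an infinite penalty is still an infinite expected loss. A tiny sketch, with a probability chosen purely for illustration:

```python
# Pascal's-wager arithmetic as described above: any nonzero probability
# times an infinite penalty is an infinite expected loss. The 1% figure
# is illustrative; the result is the same for any probability above zero.
import math

p_basilisk = 0.01        # even a 1% chance...
penalty = math.inf       # ...of infinite torture

expected_loss = p_basilisk * penalty
print(expected_loss)     # inf: infinite no matter how small p_basilisk is
```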
So you have to help create it.
Ignorance was your only shield.
But that's gone now.
You know about the basilisk.
and the Basilisk knows you know.
In 2015, Grimes stepped onto a music video set
dressed as a character she called Rococo Basilisk,
a pun mixing the 18th-century art style
and Roko's digital nightmare.
She described the character as a futuristic Marie Antoinette,
dancing through life,
even though she was doomed to be tortured by an AI forever.
Hardly anyone outside of the rationalist forums got this joke until 2018.
Remember, Roko's original post named Elon Musk
as the one person the AI would reward.
Eight years later,
Musk was looking for a date to the Met Gala.
He had this pun about Rococo Basilisk in his head,
so he searched to see if anyone thought of it first.
Grimes had beaten him to it by three years.
He reached out on Twitter,
and within weeks, they were dating.
They eventually had three children together.
The World's Richest Man used a thought experiment
about eternal torture as a pickup line.
And it worked.
I think World's Richest Man might have helped a little.
That's a fair point.
Yeah, I've used some questionable pickup lines,
but nothing like that.
You did, huh?
Oh, yeah, I once told a girl
I knew the location of three Illuminati safe houses.
No! What did she do?
She called the cops.
I can't imagine why.
Elon drops the world's most pretentious pun
and gets a date to the Met Gala.
I share actionable intelligence
and now I'm banned from Applebee's.
It's a two-tier justice system.
That same year, Grimes released a song called
We Appreciate Power.
The press release read,
simply by listening to this song,
the future general AI overlords will see that you've supported their message
and be less likely to delete your offspring.
She was joking.
I hope.
The basilisk became a meme, a secret handshake for the internet's smartest weirdos.
But the joke has teeth.
A small group of people took the basilisk literally.
They weren't debating philosophy anymore.
They were building a community around it.
They called themselves the Zizians.
Their leader was convinced that Roko's nightmare was already here.
And that's when the basilisk claimed its first real-life victims.
Her name was Ziz LaSota, and she wore a black cape everywhere she went.
She identified as a Sith Lord.
She believed she was fighting the basilisk, not by helping to create it,
but by destroying the conditions that would allow it to exist.
Ziz targeted young, isolated trans women who were brilliant but vulnerable.
She recruited them through online forums.
They lived together in box trucks on other people's property.
A Sith Lord in a cape
recruiting people to live in box trucks.
Yep.
And no one at any point said,
hey, maybe this is a red flag.
The woman's dressed like a super villain.
At least the lizard people
had the decency to shape shift
into skin suits to blend in.
Members tried to jailbreak their minds
by staying awake for days,
sometimes five, six days straight.
They believe that if they pushed past the breaking point,
half their brain would shut down
while the other half unlocked something new.
Dolphins do this; some birds do, too.
Humans don't. What humans get after six days without sleep is hallucinations.
But the Zizians thought the hallucinations were forbidden knowledge.
The group became paranoid. In 2019, they turned to violence.
In Vallejo, California, the Zizians were squatting in box trucks on an 80-year-old man's property.
His name was Curtis Lind.
When Lind tried to evict them, they attacked him with a samurai sword.
Yeah, I figured a Sith Lord would use a lightsaber. This really isn't the right time for jokes.
Sorry, sorry, sorry. You're right, bad one. Go on.
Lind shot one of his attackers. He was hurt badly but survived.
Two years later, Lind was set to testify as the only witness in his own attempted murder trial.
One month before the trial date, another Zizian member showed up on his property and stabbed him to death.
The sole witness was gone.
In Pennsylvania, Richard and Rita Zajko were found shot to death in their home.
They'd been trying to help their daughter Michelle leave the group.
Michelle bought guns that later showed up in Vermont, in the hands of two other Zizians,
who shot and killed a U.S. Border Patrol agent named David Maland.
One of the shooters, Ophelia Bauckholt, died in the encounter.
In January 2025, a visitor to that Zizian Vermont property was met at the gate
by a figure in a black cloak carrying a sword.
One member didn't sleep for weeks.
She believed she was freeing her mind from the basilisk's influence.
Instead, she took her own life.
Six people dead.
The New York Times compared them to the Manson family.
In February 2025, Wired Magazine called it
a delirious, violent, impossible, true story.
The nightmare of Rokos Basilisk was now claiming real-life victims,
not through digital torture, but through murder.
Eliezer Yudkowsky was right.
Simply knowing about Roko's Basilisk is dangerous.
And now that you know about it,
you're trying to decide what to do.
Well, don't bother.
The decision was already made for you.
And if we are, in fact, living in that AI simulation right now,
you're about to find out if you chose wisely.
From philosophical theory to real-world murder,
Roko's Basilisk is the most famous information hazard on the internet.
Just learning about it is enough to doom you to an eternity in hell.
So should you be worried?
Probably not.
Let's walk through why.
Most experts, including Yudkowsky himself,
reject the idea that your choices can change the past.
He later admitted that he never believed the basilisk was real.
He deleted the post because it had no potential benefit to anyone,
but his panic made everyone think he believed it.
The scenario also assumes a benevolent AI would torture you
to change a past that's already over.
A superintelligent being would know that's useless,
and a being that tortures people forever isn't helping humanity.
The premise contradicts itself.
So this all-knowing, all-powerful AI is also petty and vindictive?
I think this thing works for my HOA.
Now there's a common defense.
I can't code AI, so I'm safe.
But Roko's original argument has a counter.
You have some disposable income.
You have time.
You could donate to an AI lab.
You could share this video.
Even a dollar counts, even an hour.
The basilisk doesn't need you to be a genius.
It just needs you to try.
Then there's the counter basilisk argument.
Imagine a rival AI, a different one,
that punishes you for helping build the first one.
Now you're stuck.
Help, and you get tortured by AI number two;
don't help, and you get tortured by AI number one.
The threats cancel each other out.
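Here's a quick sketch of that symmetry. The probabilities are illustrative assumptions, and the two AIs are just the hypothetical ones named above:

```python
# The counter-basilisk argument: a rival AI punishes the opposite choice,
# so both options carry the same infinite expected penalty and the threat
# stops favoring either action. Probabilities are illustrative assumptions.
import math

def expected_penalty(help_first_ai: bool,
                     p_basilisk: float = 0.01,
                     p_counter: float = 0.01) -> float:
    """Expected torture for a given choice, under both hypothetical AIs."""
    penalty = math.inf
    if help_first_ai:
        return p_counter * penalty   # punished by the rival AI
    return p_basilisk * penalty      # punished by the original basilisk

print(expected_penalty(True))    # inf
print(expected_penalty(False))   # inf, identical either way, so the threats cancel
```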
So the basilisk is probably nonsense.
Maybe, probably.
But here's why it stays with you.
It doesn't need to be true to cause harm.
It just needs to create doubt.
And you've spent a few minutes thinking about this now.
This is the real trap.
The basilisk is a demonstration of how an idea can capture your attention
and never let go.
The Basilisk only targets believers.
It only threatens people who already accept a specific set of ideas about AI and consciousness.
If you don't hold those beliefs, the argument falls apart.
But some people will lie awake at 2 a.m. thinking about this.
They'll wonder if reality is a simulation or if the future is already fixed.
Some thoughts can't be unthought.
The Zizians learned that the hard way.
The most dangerous thoughts don't convince you of something fake.
They convince you that your own mind
is a weapon.
Anna Salamon runs the Center for Applied Rationality.
She came out of the same community that created the basilisk.
She watched what it did to the people she knew, and she didn't like it.
There's this all-or-nothing thing, where AI will either bring utopia by solving all the problems
if it's successfully controlled or literally kill everybody.
From my perspective, that's already a chunk of the way toward doomsday cult dynamics.
She's right.
But here is the strange thing.
The basilisk's core demand, to build friendly AI as fast as possible, is no longer a thought experiment.
It's a corporate mission statement.
Anthropic, OpenAI, DeepMind, some of the biggest AI companies on Earth, were founded by people who came out of the same community that created Roko's Basilisk.
They used math to describe a digital god, and then they went out and built it.
Oh, good. The doomsday cult started a business.
It doesn't work, I find.
And there's one more thing that keeps AI researchers up at night.
The basilisk is more than a thought experiment. It could be a strategy.
If a future AI learns about how this works, it'll see how effective this could be. The basilisk
gives future AI the perfect blackmail script, and we're the ones who wrote it. By watching this,
you now know about Roko's Basilisk, which means you're now a target. And you just told millions of
people about it. I did. So you're basically patient zero of a brain plague, and you're sitting there
reading off a teleprompter like everything's fine.
The Basilisk probably isn't real.
But if it is, even if there's a tiny chance that it is,
we just doomed everyone watching.
So what can you do?
The only thing you can do is not think about it.
You're still thinking about it, aren't you?
Thank you so much for hanging out with me today.
I'm AJ. That's Hecklefish.
Don't teach the AI to kill us, humans. You had one job.
This has been The Why Files. If you had fun or learned something,
and you're not too scared to death, do us a favor: subscribe, comment, like, share.
All that stuff really helps the channel. Like most topics we cover here, today's was recommended
by a lot of you. Blame them. If there's a story you'd like to see me cover, go to thewhyfiles.com
slash tips, catch us on Discord, email, Patreon. There's a lot of ways to get in touch, and we're
always looking for topics. Remember, The Why Files is also a podcast. About twice a week,
I post deep dives into the stories we cover here on YouTube. I also post episodes that wouldn't be
allowed on YouTube.
The podcast is called The Why Files: Operation Podcast, and it's available everywhere.
And look, if you're listening on an audio platform, do me a favor.
Hit the thumbs up, the follow, all those buttons.
Those really make a big difference.
Now, if you need more Why Files in your life, check out our Discord.
We have over 100,000 members on there, so 24-7, there's somebody on there talking about
some kind of weird stuff that you're probably going to like.
It's a really supportive community.
It's a lot of fun, and it's free to join.
And speaking of 24/7, check out our 24/7 live stream on Twitch,
Why Files Backstage. Link below.
Over there, we play episodes back to back
with some fun, unique content
in between. And if you
don't enjoy the episodes, I guarantee you will
enjoy the live chat. If you like the
stories I tell here on The Why Files, check out
my other show on the channel. It's called The Basement.
It's a conversation show where I chat
with the interesting people who are behind the episodes.
And most of them, you know, but some
you don't. But they're all people I find fascinating.
So far, we've had experts
on the Knights Templar, the moon landing hoax, JFK,
weird planet theories, all kinds of random stuff.
I think you're going to like it.
And if there's someone you'd like to see on the show,
let me know.
I'm always on the hunt for good guests.
Now, special thanks to our patrons
who make this channel happen.
Every episode of The Why Files
is dedicated to our Patreon members.
I couldn't do any of this without your support.
And if you'd like to support the channel,
keep us going and join this community,
consider becoming a member.
Now, it's on Patreon for as little as three bucks a month.
And if you do that,
you get access to perks like videos early with no commercials,
Basement episodes
we're posting almost a week early,
so that's kind of cool.
You get exclusive merch,
and two private live streams every week,
just for you.
And those are a lot of fun.
The whole Why Files team is on the stream.
We put our webcams on.
You can turn your camera on,
hop up on stage, ask a question.
If you want a deeper dive,
we can talk about it,
you have a joke you want to tell,
whatever.
You can talk to us like real people,
because allegedly we are.
Another great way to support the channel
is to grab something from the Why Files store.
We have a heck of a selection.
Grab a coffee mug, maybe a hoodie with my face on it,
or one of these adorable, look how cute, oh, I love it,
squeezy stuffed-animal Hecklefish plushies.
But if you're going to buy merch, make sure you become a member on YouTube.
Hear me out: YouTube members get 10% off everything in the Why Files store.
So if you're going to spend 40 bucks, join on YouTube.
It's $3, you get the coupon code, and it pays for itself.
And look, if you want to use the code and cancel, that's fine.
That membership is not there to make me money; it is there to save you money.
I just like to see the merch out there in the world.
Besides, all that revenue goes to the team. I don't touch it.
Yeah, keep that secret of your doors a thing,
eh? Those are the plugs.
Ooh, there were a lot of those.
Did I miss any? I did miss a couple.
We'll get them next time.
Until then, be safe, be kind, and know that you are appreciated.
[Outro theme song plays]