Cognitive Dissonance - Episode 911: The Rise and Fall of Terrorgram
Episode Date: April 24, 2026
https://www.propublica.org/article/rise-and-fall-terrorgram-inside-global-online-hate-network-frontline-telegram
https://www.youtube.com/watch?v=l8nsgzwuc_o&pp=ygUbcmlzZSBhbmQgZmFsbCBvZiB0ZXJyb3JncmFt...
Transcript
Discussion (0)
Marketing is hard.
But I'll tell you a little secret.
It doesn't have to be.
Let me point something out.
You're listening to a podcast right now, and it's great.
You love the host.
You seek it out and download it.
You listen to it while driving, working out, cooking, even going to the bathroom.
Podcasts are a pretty close companion.
And this is a podcast ad.
Did I get your attention?
You can reach great listeners like yourself with podcast advertising from Libsyn Ads.
Choose from hundreds of top podcasts offering host endorsements
or run a pre-produced ad like this one,
across thousands of shows to reach your target audience in their favorite podcasts with Libsyn Ads.
Go to LibsynAds.com. That's L-I-B-S-Y-N-A-D-S dot com, today.
This episode of Cognitive Dissonance is brought to you by our patrons.
You fucking rock.
Be advised that this show is not for children, the faint of heart, or the easily offended.
The explicit tag is there for a reason.
Recording live from Gloryhole Studios in Chicago and beyond. This is Cognitive Dissonance. Every episode we blast anyone who gets in our way.
We bring critical thinking, skepticism and irreverence to any topic that makes the news,
makes it big or makes us mad. It's skeptical. It's political. And there is no welcome mat.
Today is also Thursday the 23rd, but not when you're listening to it. You're listening to it
on Thursday the 30th, I would guess. Yeah, something like that. Right around there? Yeah,
if days of the week are still in the sevens, you are listening to this on the 30th.
And it is long-form day.
We should have like a little theme song for long-form day.
We should.
Yeah.
Today, it doesn't matter.
It's Terrorgram Day.
Yeah.
Oh, God.
We're going to be talking about.
This is called The Rise and Fall of Terrorgram: Inside a Global Online Hate Network.
This is a ProPublica article, but also ProPublica Frontline.
It is a documentary that they released on Frontline.
You can just search on YouTube for The Rise and Fall of Terrorgram,
and it will come up, and it's a full hour-and-25-minute documentary on this as well.
So you could watch the documentary.
The documentary reaches into different places that this article doesn't reach into.
But it's really about how people are using Telegram as a way to connect their terror cell, a white supremacist, Aryan terror cell that is around the world, and to promote the killing of
people that are outside of that group and to damage people.
And it's large scale.
And it's a very scary thing that goes essentially unmonitored through Telegram, at least for a
long time it did.
You know, when I was reading this article, first of all, I want to say that when I first
called up this article, my brain did that thing where it's tired and like skims.
And I was like, the rise and fall of Phantogram.
I love Phantogram.
Like, what happened
to Phantogram?
So I'm like,
I'm simultaneously glad that Phantogram has not fallen,
but very distressed about Terrorgram.
That's way worse.
That's,
I don't like that.
So,
but I was really,
like,
I read this article and I was really thinking that
one of the central issues,
I think that we have to wrestle with,
which we haven't properly addressed,
is redefining what our limits around speech should be.
And we haven't done a good job of recognizing the ways that technology influences what it means to yell fire in a crowded theater.
Right.
That's always been this sort of like ye olde standard, right?
You know, like, we can say things; we have a fairly robust right to free speech.
And I am very, very, very, very supportive of that.
But it has always been recognized that there are limits to what free speech is,
and those limits, you know, come into play when violence and danger and, you know,
infringement on other people start to have an interaction, right? And so we have not done a good job,
I don't think, of fully intellectually interrogating the ways that technology and its amplification
and connection effects have created a scenario where fire in a crowded theater needs to be a
different standard, maybe. And I don't know that I'm
qualified to say what that standard should be. But it feels like urgent to me that we really think
about it in ways that are not just like knee-jerk reactions to the way that we were raised
to believe that everything that is an imposition on free speech is like a moral bad. Yeah.
And I think that that's not the case. I think we can we can have our cake and eat it too,
but if we're not careful, we're going to eat a lot of thermite in the process. Yeah, man. Yeah. You know?
Yeah, and I think like there is something to be said, I think, about the responsibility of these organizations that own these platforms for taking some responsibility for what gets put on them.
I think there's a, you know, obviously we can make arguments about, well, is it the guns fault or is it the guy who has the gun, right?
Does a gun, you know, does it matter?
But there are many people in this country who think that maybe we should limit the amount of guns we have,
maybe we should limit the access to guns, because we think that the means reduction that you can do is a big deal.
That changes the equation in the end.
It makes it so that there's less chance that someone is going to hurt another person or themselves with a gun if there's just less guns.
We know that to be true.
So how much are we willing to impinge on people's freedoms to own guns and how much are we willing to take away those freedoms so that other people won't have the opportunity to hurt themselves or other people?
And that happens, that's a, that's a conversation that's been happening for a long time.
And people in other countries think we're crazy.
They think we're insane that we are having these conversations about guns when all the rest of the world is like,
no, man, that's just stupid.
What are you doing?
And I think that they're probably looking at us as crazy too when they look at some of these
social media platforms because they have a better handle, I think, in Europe on what social media
can do and how social media can interact and how it can interact with people and how it can
interact with young people. And I think that they've not looked at this sort of thing as
a free-speech monolith as a way in which to try to shape their rules. They just said, we need
new rules. We need new rules that have to guide this thing because it's a different animal.
I don't think that, you know, it's 100% the fault of the people who run the platform that
these things happen. But I do think you have to do a better job and be more diligent about
the stuff that's happening on your network.
I think you have a responsibility to people
to not connect to horrible people who can kill hundreds.
That is not a thing that you should be willing to do.
And it should be ethically in your rules,
in your mindset, in your mission statement.
It should be something that's there that says,
we need to be better at stopping things
that could feasibly hurt large swaths of people
or change people's minds in ways that are very negative to society.
Yeah, like all of that, man, all of that.
Yes, and all of that.
Because I think, you know, there are very bad arguments that, you know,
look back in time and try to draw these parallels.
And I always say to myself, yeah, but like a musket is not a machine gun.
Right.
Right.
Like we have to understand that there are issues.
of scale, and when those issues of scale come into play, it does nobody any good to reduce back,
like to do a backward reduction and to pretend that issues of scale don't affect, like, how we
should consider these things, ethically, morally, legally, socially, we of course need to do that.
I was thinking about, like, this is my analogy, right, and tell me if this
makes any sense to you. But like, before the years of social media, right,
if you wanted to join a hate group, you had to find a hate group. And then you had to drive to
their meeting. And you had to take your body out of the car and you had to walk into their building.
And that the size of that hate group's meeting and their ability to sort of like connect and
share information and organize and plan was always going to self-limit based on the inconveniences
of meat space. So if I wanted to join the KKK, I had to find a local KKK group and I had to
drive over there and maybe I had to worry about what my neighbors might think and I might have to do that
only on Tuesday nights when they're having their meeting and maybe I got to work the second shift
on Tuesday nights. So I didn't make it there. And then, you know, I got there and I'm, I'm here in,
you know, northwestern Illinois. And so, like, they only have a certain amount of expertise.
And so if I want to get more expertise, we have to try to find other groups and connect with
them.
And everything was slow.
And the ability to sort of network our hate was very limited.
We were siloed off in important ways.
And what we have to recognize, I think, is that we're in a world now where there's a 24-7
global access to expertise, information, and radicalization at every level of society.
And we've seen it, we've seen the violent rightward swing that has happened since that became a reality.
There has been a violent global rightward swing in many societies as that technology has enabled these groups, which previously had to organize in meat space and now don't.
The incel movement.
The rise of rape culture and misogyny, the openness of it.
the rise of the authoritarian right wing across much of Europe.
There are so many examples of it.
It's not even uniquely American.
I think it is a result of the ways that we can now scale this technology and scale our communications up.
If I wanted to, like, make a bomb and I live in northwestern Illinois, I got to find a guy local who knows a guy who knows a guy, right?
if I want to make a bomb now, it's trivially easy to look that information up.
Sure.
I could do it on my phone on the toilet.
Don't we have to recognize that, like, there's a different level of responsibility
and a different level of, like, ways we have to think about what speech means?
I just, does that analogy work for you?
Does that mean, like, does that connect?
Yeah, I think so.
I think, you know, you have a group of people right now that are,
that were disparate.
And in the past,
if you wanted to go to a KKK meeting
and then you wanted to go out
into the middle of the world,
you put a thing over your head.
You had to hide your own face.
A physical hood had to be put over your face
and someone could grab that hood off your face.
And you wanted to hide who you were.
Now that hood is built in
to the thing that you're using.
That's another great point.
It's built in, right?
I don't have to reveal who I am to anyone.
I don't ever have to reveal.
And what's so interesting about this documentary,
is they highlight a couple of different people
that are part of these sort of networks.
One of them created 8chan, right?
So all this stuff leads its way back
to some of the earliest message boards
that got really popular on the internet, 4chan,
and then 8chan came from that.
And one of the people who did this was a disabled person.
He had some sort of disease that caused problems with his limbs
and he comes out in a wheelchair
and he very specifically says
he says, yeah, I
actually was on these boards
as a person who was pushing for eugenics.
I was pushing for eugenics
and I would have been the object of that eugenics.
There's another person who they later on
reveal in these things.
And I don't know that they mentioned it in the article,
but they definitely mentioned in the documentary
that one of the people who was making these videos
is an artist, or what he calls himself an artist, from Montana,
but he was part of the gay scene there.
So these are people who in real life,
even in the system that you had before, Tom,
that you were talking about where you had to show up in person,
you wouldn't show up if you were this person who was disabled.
You wouldn't show up if you were gay,
because they would know, and then they would gay bash you,
or they would, you know, I don't know what they would do
to the person who was disabled,
but you know what I mean?
You wouldn't even show up to these things.
So it's able to not just reach the people who would have been in there before.
It's able to reach a whole new group of people who also have hateful ideologies that they want to express into the world.
And you're reaching a whole new group of people that you can now seed things with and use them.
It's not a good thing to have this exist.
I mean, I just, I don't know a solution.
So I'm not telling you I know a solution.
But I am saying that what we're seeing time and time again is that these anonymous networks that group people together don't have a lot of great stuff that happens from
them. It's not like there's a ton of good stuff that happens when a bunch of people who have a lot of
nefarious thoughts and complete anonymity get together. That doesn't seem like a recipe for positivity.
No, man. And like, it is absolutely a recipe for radicalization. Sure. So we know that. And
it's interesting to me, there are some parallels that I think are worth noting. I think
socially, we have all decided that there is a worry and a need to heavily police
radicalization when it comes from the Islamist world.
Right?
When ISIS is radicalizing young people, that is, that hits the news, the police are involved,
the FBI is going after it.
There's, you know, there is a real, oh my gosh, like, we cannot have, like, you know,
Islamist terror cells in America.
And, like, ISIS is recruiting our young people.
And here, like, that's the thing.
Some of that is very true.
That is, like, that's not, like, I want to be very clear.
That is a real thing that did happen.
That is a social problem.
That is a political problem.
That is a criminal problem.
That's still happening.
And we should address it.
It's still happening.
Still happening.
The tools to radicalize, the tools to advertise, all of these things have become
cheap as free.
Producing high quality, slick video in unbelievable
quantities. We've taken these tools that are incredibly powerful. These are the machine guns.
And without any consideration of how they differ from muskets, we've put them into the hands of every
person in the world and said, well, it makes most people's lives better and easier. And so it's a net
good. And we'll just have to live with a certain amount of radicalization. But that's not how that works.
This stuff bleeds out. This stuff ruins lives. And I think that we have a responsibility to
do better. And here's the analogy that I thought of: the banking system, which is intensely
imperfect, right? So I want to, like, throw this out there. If you deposit money in a bank, or if I want to
open an account at a bank, we've covered this on the show. They don't do a great job of this,
but there are laws that require, they're called know your customer laws. There are laws that
require that if you are a bank, you have to do some amount of due diligence to try to prevent
money laundering, impersonation, and illegal activity.
That doesn't mean that people can't deposit their money at the bank. But it does mean that the bank
also doesn't get to say, I don't know who you are and what you're going to do, I'm not responsible
for how you use your money, I'm just the bank. Yeah. There's, like, the people who say, well,
then we wouldn't have the internet as it stands if we had, you know, any regulation at all.
And that's just not being at all thoughtful and creative.
So there are thoughtful, creative ways to say, hey,
especially with AI being available now, you should have a robust series of demonstrable
protections based on these, you know, best-practice regulations.
You should be able to show that you're doing, you know, these kinds of audits of your systems
and the spaces in your systems looking for this activity.
Because, like, the other thing is that this doesn't happen everywhere.
And if it doesn't happen everywhere, it tells you that there are advantages to putting it over here.
What are the advantages?
Well, it's allowed.
Like, it's, this stuff was happening on, like, Facebook, and then Facebook changed their rules,
and now it doesn't happen as much on Facebook.
Yeah.
So, like, it's not like you can't reduce the amount of it.
And reduction, like, just because
the filter doesn't narrow to zero doesn't mean that narrowing the filter has no value. Right. Right. So I think
like so many of the arguments like are such like bullshit arguments that they're being made
because people are afraid that the way that they enjoy interacting with the world online will change.
And I guess I'm saying we have to accept that the way we interact with the
world online needs to change. Yeah. And like, this is unacceptable. Like,
the proliferation of people gathering together in chat rooms and then going out into the real world
and slaughtering people.
That, like, we got to do what we can to fix it.
Like, even if we fixed half of it, that's half of that shit that we fixed.
You're not radicalizing people.
I want to read a couple of things so people understand what's happening on these networks, right?
So here's a piece from the article.
In the span of five years, they grew Telegram, not Telegram,
they grew Terrorgram from a handful of accounts into a community with hundreds of chats and channels focused on recruiting would-be terrorists, sharing grisly videos, and trading expertise on everything from assassination techniques to the best ways to sabotage water systems and electrical transmission lines.
One of the people who was actually arrested for this, one of their accounts posted step-by-step instructions on making pipe bombs and synthesizing HMTD, a potent explosive.
ProPublica and Frontline identified 35 crimes linked to Terrorgram,
including bomb plots, stabbings, and shootings.
Each case involved an individual who posted in Terrorgram chats,
followed Terrorgram accounts,
or was a member of an organized group whose leaders participated in the Terrorgram community.
One of the crimes, in 2022,
a shooting at an LGBTQ+ bar in Bratislava, Slovakia,
left two people dead and another injured.
An earlier ProPublica and Frontline story detailed how the shooter was coached to kill over three years by members of the Terrorgram Collective, a process that started when they were just 16 years old.
And this sort of thing, like, this is the product that you get.
But what happened was that people had already been pushing the manifestos and ideas of the people who had come before.
And they call these people saints, right?
So these people who go out and murder and die, they call them saints for this white
power movement.
And this white power movement, one of its attacks happened in New Zealand, where they killed 51
You're a podcast listener.
And this is a podcast ad heard only in Canada.
Reach great Canadian listeners like yourself with podcast advertising from Libsyn Ads.
Choose from hundreds of top podcasts offering host endorsements,
or run a pre-produced ad like this one across thousands of shows to reach your target audience with Libsyn Ads.
Email bob at libsyn.com to learn more. That's B-O-B at L-I-B-S-Y-N dot com.
Muslims, and then I think they injured something like 40. It was that Christchurch shooting, which was absolutely horrific.
Horrible. Then they did it again in the United States. They wound up killing people in a Walmart;
a bunch of Black people got killed.
They wound up killing people like here,
like in Slovakia.
They killed gay people.
These are people who are targeting marginalized groups
to go after and murder them.
This is genuinely going after a group of people
and egging a group on to say,
you will be famous,
you will be a saint,
you will be someone who we will revere
like these other people
who we've already hung their plaque on the wall.
And they continue on sending this message
out to these young, impressionable people who have, they have this need and desire to want to be
involved in a community that wants them. And they're welcomed there.
Yeah, you know, I'm going to throw this out; it's hard for me to, like, read this article and not try to
think about what we should do about it. And I want to be really clear to the audience:
I don't know what to do about it.
But one of the things that I thought about, because, like,
when I read the thing about that 16-year-old, my first thought was like, why the fuck does a
16-year-old have access to Telegram? Right? Like, Australia has decided that under a certain age,
people should just not have access to social media. And the social media companies were like,
that's fucking impossible, we'll never be able to do it. There's all these problems with it.
And Australia was like, cool story. If you don't do it, there's massive penalties. And they
were like, turns out we know how to do it. And then they did it. Yeah. So, like, I,
read this and I'm like, maybe there should be some kind of age screening to have access to certain
things. Maybe there should, Cecil, be some kind of vulnerability screening. And I don't know what that
means. But like, these are not rights. Like, we don't have a right to access. I don't have a right
to go to every store that I want to walk into. Right. A store can be like, yeah, we don't serve the
public. A perfect example, Tom, would be you don't have a right to share with me child sexual abuse
material.
Yeah.
You know what I mean?
We don't have a right to have that and to share it between us.
We don't have a right to those things because it's an illegal thing that we're not
allowed to have.
Right.
And it's illegal for all the right reasons, right?
It's illegal.
It's illegal.
And we all, there's not a group of people.
Sure, I'm sure there are some crazy people out there who maybe think that you should keep
it, whatever.
But I'm saying, like, 99.99% of the people in the world are like, that is crazy shit that
no one should have access to, period.
Right, right.
And like, we have rule.
If there was a flea market. Let's say there's a flea market.
And we all understood that, look, it's a flea market.
It sells everything from Pokemon cards to wicker furniture to, you know, stuffed deer heads.
But also, we know that there is some amount of, like, drugs and child porn and,
weapons. And we don't know, we cannot, it's a massive flea market. It's 10 miles by 10 miles.
There's hundreds of thousands of vendors and we can't guarantee that this isn't happening.
And in fact, we know it is happening. So wouldn't it make sense to say, all right, if we're going
to have this flea market, then we're going to try to have some limits on who goes into this
flea market, right? And then we're going to have some limits and some monitoring of what they leave
the flea market with. We would do that. Yeah. We would, 100%. And monitoring. And monitoring
the vendors. We would monitor the vendors. When it comes to digital spaces, we pretend that
there's no way to do anything. Yeah. I'm not saying that you could control it the same way you control
meat space. I know you can't. But what we are doing to some degree is we're throwing our hands in the
air and saying there's no way to control everything. So there's nothing to do at all.
I've tried nothing and I'm all out of ideas. Yeah. And the result of that, and I think, you and I talked
about this a little last week. There are spaces that are being carved out online for the worst
possible people to aggregate together and amplify the amount of harm that they do. And that's
something that we really need to recognize, the scaling effect. We talked, I think, last week or the
week before about the CNN article about the women whose husbands and boyfriends and family members
were drugging and raping them
and then posting those videos
to that motherless site online.
And motherless, their tagline
was something like, you know,
everything that's not illegal
is moral or some bullshit, right?
It was some, like,
everything that we can post here
is going to stay here kind of tagline.
And there's like a thousand people
that got together to rape their wives
and girlfriends and, you know,
family members and to post those videos
and to share with each other
tips and tricks and to sell back and forth the kinds of drugs that are most effective to
commit rape. We're scaling rape, right? Like, we're scaling these things by having absolutely
no enforcement mechanism and no monitoring mechanisms. And we're pretending that, like,
that's just the inevitable price that we have to pay so that I can do my banking online? Yeah.
What are you talking about? None of that. And the numbers of people, like you were talking about,
last week, you had said that there was a thousand people in this group. And this thing had been viewed
65 million times. All the different videos had been viewed 65 million times. There's hundreds,
did you say hundreds of thousands of videos or something like that? It was like a hundred
thousand videos. I mean, it was an intense number of videos. The number of people that
were involved in this group was relatively small, but relatively small in the terms of like
YouTube views, right? When you think about YouTube viewers, you think, oh, a thousand doesn't feel like a lot.
So there's been this push to try to diminish this idea.
But when you think about it on the flip side and you say, no, that's a thousand people
that have come there with the intent to communicate with others and to share with others the raping of someone.
That's a crazy number when you think about that number as like a thousand, like a convention of rapists.
That's a terrifyingly large number.
And the same thing occurs here, right?
there's a part of this where they even address it in the article.
They say when compared to mainstream social media, the numbers were tiny.
But looked at a different way, they were stunning.
These people had built an online community of thousands of people dedicated to celebrating
and committing acts of terrorism.
That should scare the shit out of you.
The same way that people who built a group of people of a thousand people that were
celebrating and cheering tips on how to rape other people, that should terrify you too.
It seems like a small number when compared to, like, large numbers in the world, or the number of people that watch, like, Oprah every day, sure.
But the number of people should scare the shit out of you.
It should.
They find each other.
There's a thousand of them, man.
And there's more.
There's fucking more.
Yeah.
Like, if you see a thousand people active on this site, you have to then know that this does not represent every human being who is doing this.
This just represents only the people on this site that are sharing this publicly.
So what you have is the tip of an iceberg.
Right?
You've got something that you have to assume for every one here.
There's more here.
But even if you didn't assume that, even if you just took it and said the only people on the
entire planet who are doing this Terrorgram shit or who are drugging and raping their wives
were exactly the numbers and only the numbers that we identified and sourced and verified
in triplicate,
if you assumed that, you still have a vast problem.
Yeah.
A vast problem.
There is an attempt, I think, by people who I now distrust, like de facto distrust.
There is an attempt to sort of like look at this and say it's only and to minimize.
And to say, well, this isn't that big of a number.
And it's like, that's a fucking huge number.
Yeah.
Like there's a thousand.
Like if I look at this terragram thing and I think like, well, there's a thousand right.
wing lunatics who are connected in a way that they never would have been connected before
trading information back and forth on how to build a bomb. And somebody is like, it's only a thousand
people worldwide, a thousand divided by seven billion is a small percentage. And like, what do you say,
why are you doing that work? Why are you carrying water? Yeah. To minimize rapists and murderers and
terrorists. What is inspiring you to do that rather than to say, hey, what do we got to do to make
that number less than 1,000? Because a thousand people getting together to trade tips on how to be
awful is something we should be like, all right, let's get creative. How do we fix it? I don't want to
lose my right to free speech, right? I'm not suggesting that I want to lose my right to free speech.
This show is predicated on free speech. This show can only exist because of free speech.
But to just minimize and to try to reduce and to try to well actually this kind of thing,
all that does is provide cover.
It enables.
It doesn't help the situation.
It doesn't say, God damn, that's fucking unbelievable.
Can you believe we live in a world where that many people are doing this?
Yeah.
That's not the world I wanted to live in.
Shouldn't we be shocked and appalled and shouldn't we be doing work immediately to be like,
hey, you know, we shouldn't be able to trade
tips and tricks on how to produce high
explosives out of fertilizer.
Like, there should be rules. And when you break
those rules, the people who didn't
monitor for that, if they didn't do it,
then they go to jail.
Because I'll tell you what, man, like,
it's just like those guys in Australia that were like,
at first the social media company's like, well, there's no
way it's technically possible.
And then they're like, well, I bet it fucking is.
And they're just like, turns out it's technically
possible.
Very possible.
Yeah, it's fucking possible.
And I said this before, like, it's so possible, post a nipple to your Facebook.
Doesn't work, right?
These systems can decide.
These systems can find this stuff.
The incentives don't line up.
And I think that's the important part of this puzzle, is that they, of course, can fix this.
We've seen them fix this.
We've seen them go and take groups out, slash those groups, cut them out, make sure that those groups can't communicate with each other on other
social networks. We've seen them do that.
There's been massive pushback on other networks like Twitter, where they now have free rein;
essentially whatever they want to post on Twitter, they can.
Even still, they still have some community standards because there's part of this article
where they talk very specifically about trying to find some of this stuff on Twitter and
it's not available, although other stuff was available on Twitter.
So there is some content moderation that's happening there.
But they can fix this.
They can work towards this.
And sometimes that working toward it may feel like overreach on free speech.
We may see that.
This happened to us when we were reporting on some anti-vax stuff; our videos got taken down.
That's right.
And sometimes there's going to be those hard edges that hit each other where it's like, hey,
we're not anti-vax.
We're literally talking about the vaccine as if it's a good thing and we're reporting on people
who are saying it's a bad thing.
You didn't bother to watch it.
You're just lazy.
You don't even have a human I can talk to to say, you got it wrong, right?
I don't even have an opportunity to do that.
I don't have an appeals process that reaches a person.
I reach another bot that's just differently programmed.
So it's not the same thing.
But the thing is that they just want to cut as many corners as they can, sell as many ads to you as possible,
and they want to keep the, they want to sort of fly as close to the sun as they can without melting their wings.
And they want to do this as often as they can because it makes them the most,
money. If they can have both Tom and I on that channel, as well as someone as close as you can get to a
white supremacist without actually saying those words out loud, they will have them on. And they want to
try to reach as big a group as possible, and they want to try to have as controversial views as
possible. And the other issue you have is with Telegram. Telegram, for a long time, was not
selling anything. They were just a venture-funded firm. So they were a firm that was just getting venture
capital, just like Facebook was for so long. What they did was, they integrated into your life
so that they are so embedded in it when they finally pull the rug out from underneath you.
They're full of ads or they're full of this other stuff because you've integrated it in your
life so deeply that you can't get rid of it now.
Now I've used Telegram to create all these networks.
I can't just be like, well, fucking these ads are everywhere.
Jesus Christ, I got to scroll through six things to find the next bomb thing.
Jesus Christ, I just want to find the schematics for my bomb, and I got to scroll through six ads.
Before they did that, they needed to get you as a user.
And in order to get more money, they had to make sure that your usership wasn't canceled by them,
and it made more sense for them to allow literally anything to happen, because then they could
go to these venture capital firms and be like, look at the user count we have.
Look at how many people are on our system right now.
You want to invest in this, because once we turn those fucking ads on, the money's going
to shit out of there like it's got fucking food poisoning. It's just going to shoot
fucking out of there on both ends. There's going to be money, fine. So of course they're going to
want to inflate those numbers. They want you to see that they're running a lean operation, and that
the amount of money that they're going to be able to pull in based on this is huge, and the amount
of moderation that they do is very minimal, because it's going to fucking inflate those numbers. It's
going to make it really, really large. So it's all capitalism that is creating this problem. And the
amount of money that's going in there, these people, they keep that, what they do is they make sure
that they keep it so it's just as dangerous as possible without getting in trouble.
And the thing is, we need to turn the knob up so we say, no, no, no, that can't happen anymore.
Because what you're doing is you're creating a whole cesspool of people who can find each other
and then trade bomb schematics.
Yeah, man. Yeah. And like, I want to, you had a really great point. Like, there are going to be prices.
Like, let's use the YouTube example. We got flagged and some of
our content got taken down. And that was frustrating. And it could have potentially, if we made a lot of
money on YouTube, which we don't, it could have had a significant financial impact.
So that sucks. I want to acknowledge that that sucks that if we create systems that say no
to some things, we'll catch some, we'll make some mistakes. Systems always make mistakes.
These are imperfect, large decision-making machines, right? They're not going to be perfect. But the other option
is rape and murder.
That's the other option.
So like we're making a choice here that says we're willing to allow a certain amount of systematized institutional rape and murder in order for us to have this thing that we like.
There's nothing I like that much.
There's nothing you should like that much.
There's nothing that is likable enough.
Like, no matter how much it matters to you
and how many of your deepest and dearest friends,
if somebody said, Tom, you can never see Cecil again
or this person gets raped, I'd be like, yeah, I'll never see Cecil again
if somebody's going to get raped.
And that is my best friend in the world.
I would be like, sorry, I love you.
We will never talk again because I don't want this person over here
to be murdered.
It's not worth it to me.
I know there are prices that will be paid. We have
to acknowledge them, and some of them will be significant.
Yeah.
But like the other price that we are already paying is the violent radicalization of incels
and right-wing extremists and white supremacists and, you know, ISIS recruiters.
That's the other, that's the other price.
Like, so there's no not paying a price that we have to acknowledge that together.
There is no everything runs smooth and it's free.
That's not a thing.
So I don't want to pay the price over here where somebody goes to a nightclub and gets shot and killed.
And that's a thing that happens directly because of this Terrorgram stuff.
So that's a real example where people are just outside a nightclub and an attack was planned and fomented and encouraged and incentivized and reinforced in these spaces.
And now people are murdered for it.
Yeah.
I'm like, yeah, all right, cool, man.
I'll go to the bank and deposit my check until we figure this out. Let me read
why I think this is
something that we need to pay attention to.
And I think what's the goal
of these people, right?
Obviously they're white supremacists, but what is their goal?
What's the underlying thing that they're trying to cause?
And I want to read a piece of this article.
The doctrine is called militant accelerationism, which has become popular with neo-Nazis over the past decade, the chat logs show. Militant accelerationists want to speed the collapse of society by committing destabilizing terrorist attacks and mass killings.
They frequently targeted their perceived enemies, including people of color, Muslims, Jews,
gays, and lesbians.
They had created a chat group on Telegram, which encouraged its followers to firebomb businesses, torch the homes of anti-fascists, and seek out radioactive material to build dirty bombs and detonate them in American cities.
Someone else created a channel and uploaded a steady stream of violent propaganda, and they renamed the channel after it was taken down the first time.
So initially the channel was taken down.
Then they put it back up and just called it Terrorwave Refined.
So like they initially had the same name.
And then they were like, yeah, we'll take that down.
And like, okay, well, we'll just name it something close.
Then they interview this guy on this podcast called Hate Lab.
And on Hate Lab, this person who goes by the name of Slavic Bro, he says, I decided to
become a fucking content producer.
I saw a niche and decided to fill it.
They are influencers.
What they are is hate influencers.
And they're popular on these networks.
And they, like Tom said, foment this group.
So they don't just post stuff.
They interact with this group.
And they encourage this group.
And they want this group to go out and do these horrible things.
These are people who are not going out and building these bombs.
They're not going out and shooting up these clubs.
They're not going out and attacking these mosques.
But they are creating the people who are doing it. And we're letting them get in touch with these people. We're seeing that they're
widely connected through massive online spider web groups, and we don't do anything to stop them.
Yep. And we don't. And like, I refuse to believe, one, that the price is worth it; two, that it's
technologically impossible. As far as I'm concerned, you have to take those and just throw them away.
And we know it's possible. Those are just ideas we throw away.
Tom, we know it's possible because the way in which they deal with ISIS versus how they deal with the white supremacists, we know it's possible.
They go after these Muslim organizations all the time and they wipe out all their telegram stuff.
So that stuff is cleaned out as often as possible.
They don't go after the white supremacist stuff.
Yeah, man.
Yeah.
Because a lot of these fucking, like, let's just call it what it is.
Because a lot of the people, and this is also worth saying, there's only
a handful of these major platforms, and they're owned by, like, one person, right?
Like, Twitter's owned by one guy.
Yeah.
And that guy is a fucking evil racist shithead.
Yeah.
So is it a surprise that his content moderation policies lean toward evil racist shit?
No.
And, like, what we're doing is we're saying, Telegram, like, there's literally no reason to believe
that that guy, you know, the guy who owns Telegram,
runs Telegram, that his ethics are in line with any decent human being's ethics.
There's also an intense concentration of power in this handful of really big,
influential online spaces.
And we can't just trust that they're going to do a good job.
We can't just trust that they're going to do like an ethical thing and like police themselves.
That's crazy.
We don't let anything else do that.
Think about all the things that are lower stakes that we don't allow to just police themselves.
As soon as things reach a certain level where they interact with the social good, we police those
things through regulation. We police those things heavily by saying you have to have certain
requirements and I need to be able to come in and audit that you're doing these things.
And like, again, using the banking example, because I think it's a good example, the standard
for banks, and I know this because this is part of the industry I'm in, the standard for banks is not you have to catch 100% of the money laundering or you go out of business. Nobody is an asshole like that. What they say is, you have to have these rules and regulations and systems in place, and you have to take them seriously, and you have to monitor them, and they all have to be done in good faith, and they all have to be done on the regular, and they have to be transparent to auditors. And if that stuff is in play, we'll catch a lot of it and we'll make it harder
to do. We will not eliminate it. We all know we're not going to eliminate it. There's fucking
cryptocurrency. We're not going to eliminate money laundering. That's what cryptocurrency is literally for.
But like, that is how you do this. So, but like naysayers will say, because it can't be perfected,
it can't be done. Perfect is the enemy of the good. It's the Nirvana fallacy. Yeah. It's a fallacy.
It's a way to discount and say nothing can happen. We can't do anything because we can't fix it 100%.
I want to go back to, and I want to focus on the ideology that is pushing this stuff
because it's pervasive. It's not just in these tiny groups, right? This is a small group,
a thousand people. They're sharing bomb schematics and how to make stuff and how to attack
people, and they're sharing a hit list of people who you should kill, et cetera, et cetera. So all that's
really terrible. But this stuff seeps out. And the ideology in these manifestos is what
seeps out into the world.
And what gets caught up into the main zeitgeist of all of the thoughts that other people
have and very specifically embeds itself in the right wing thoughtosphere.
And what's embedded itself in the last several years and has had a stranglehold on that
group is the great replacement theory.
So the great replacement theory is something that they share constantly amongst themselves.
It is a white supremacist, completely made-up,
fallacious idea that white people are being replaced by migrants in all of these countries,
and it's being orchestrated by the Jews.
The Jews are orchestrating this big replacement of white people, and you are going to be
eradicated, is essentially what they're telling this group of people.
And so what happens is they become radicalized by this idea, and it not only supercharges
them to go after Jewish people, right? Because they think they're the ones who are in charge of the
whole thing. It also supercharges them to go after Muslims. It supercharges them to go after LGBT people.
Yeah. It supercharges them to go after tons of marginalized groups because for them, they're the enemy.
They're now the enemy. And this is not an unpopular conspiracy theory. It's talked about on Joe Rogan,
but it's, I mean, it was told to him by the vice president, right? So the guy who was
applying for the job of the vice presidency,
J.D. Vance comes in and tells him
that what they're doing is they're shipping migrants in here
to change the numbers on how we vote.
So they're trying to pack the numbers in Congress
by shipping people all over
and Joe buys it hook, line, and sinker,
and then talks about it for two years afterwards.
Talks about how this great replacement theory
is what is essentially driving American politics,
and they're trying to wipe you, Joe Rogan listener.
They're trying to wipe you off the map.
How do you think that that's not going
to radicalize people? People are going to hear that and they're going to hear these ideas that are
popping around on very popular programs all across the internet and on television. And then they're
going to think, well, gosh, that's really scary. What's happening? And they start searching for this
stuff. Maybe they're searching on telegram where they can reach out and touch one of these people
and then get adopted into these terror cells so easily because it's spoofed to them by these really
popular influencers who are already outside of this sphere.
So they're cycling this, they're feeding it.
It's a fucking human centipede of fucking shit that keeps on cycling throughout this whole thing.
And this is a dangerous, dangerous conspiracy theory that is being spread every single day by
mainstream people now.
Yeah, well, you know, the white supremacist movement has been, they're on the record,
leaders of that movement are on the record over the last couple of dozen
years saying that they are intentionally shifting their strategy to be a strategy of hiding in
plain sight and normalizing. They are not going around the way that they had been in times
past with this very like aggressive in your face, visually different messaging. Now they are
attempting to normalize their message, normalize the people who are delivering this message
and integrate that thought process into day-to-day society,
and they've been incredibly successful at doing it.
It feels like Project 2025.
We know what the plan is because they've said it out loud.
They just said it.
I'm not, like, making this up here.
There are many articles where leaders of this movement have said,
hey, we realize that what we've got to do
is we've got to deliver our message in a different,
more palatable way that is going to normalize it
into society, that is going to allow us to sort of, like, wedge into people's lives, and then we use
that to deepen the radicalization. I'm not making that up. That's the thing they said. And now that's
the thing they're doing. Like, there's a guy I know, my barber, who cuts my hair. He's a really nice guy.
He's not a particularly well-educated guy, but he's a curious guy and he's a really nice guy.
And I like him. And I was shocked. I think I brought this up on the show before. I was shocked
a few weeks ago when he asked me, like, some question, I forget how he phrased it, but the question
implied that he was watching YouTube videos and he's a little worried about the Jews.
And I was like, how the fuck did this happen to Jonathan?
He's a super nice person.
He wants good things for himself and the people in his life.
He's not a bad dude.
But that normalizing, like, the point of that normalizing is to make that tent
of hate and that tent of, like, believing in great replacement theory as big as possible.
And then they use these, like, other groups to funnel down to the knives, right?
To find who are going to be the people who go out and commit
the most heinous acts.
Those are going to be the knives in the bunch.
And everything else is going to be the thing that normalizes and allows us to use that
and excuses it and enables it and creates space for it to grow.
They're being more successful than they've been in my whole life,
and I'm almost 50 years old.
And this is real fucking unsettling.
Like, real fucking unsettling.
I just want to reiterate that I think that these things can change.
And I think we can have a better, safer place online.
I think sometimes people will have to deal with content moderation and be frustrated by it.
But I think that's a tiny, tiny price to pay to stop
the spreading of these hateful ideologies,
not just in one place but all over.
We know, we know it is true
that these hateful ideologies
that were posted in a manifesto
of a guy who shot up 50 people in a mosque,
those ideologies are being shared
on mainstream shows right now.
They're diluted, they're watered down,
but they're only watered down enough
so that they can get past all those pieces
that are in place to stop it, right?
They don't add the Jew part at the end, right?
They don't add the Jewish part.
What they do is they say all the rest of it.
Oh, they're coming in to take your jobs.
Oh, they're coming in to replace you.
Oh, they're going to outbreed you, et cetera, et cetera.
All that stuff gets past the moderation.
And what they do is they don't add that one part at the end.
They wait for you to find this theory somewhere else behind a closed door so they can add those few pieces in.
Those bad, you know, those other pieces.
It's all bad.
But they get to add even more negative shit behind closed doors of these social media companies that allow it.
All right, that's going to wrap it up for this week on a long form.
I don't know if this came out before or after.
I have no idea.
But we'll definitely be back Monday.
So you can just come back then Monday and just listen to a show.
So we're going to leave you like we always do with the skeptic's creed.
Credulity is not a virtue.
It's fortune cookie cutter, mommy issue, hypno-Babylon bullshit.
Couched in scientician, double bubble, toil and trouble, pseudo-quasi-alternative, acupunctuating, pressurized, stereogram, pyramidal, free energy healing, water downward spiral, brain dead pan sales pitch, late-night info-docutainment.
Leo Pisces, cancer cures, detox, reflex, foot massage, death in towers, tarot cards, psychic healing, crystal balls, Bigfoot, Yeti, aliens, churches, mosques and synagogues, temples, dragons, giant worms, Atlantis, dolphins, truthers, birthers, witches, wizards, vaccine nuts, shaman healers, evangelists, conspiracy doublespeak, stigmata nonsense.
Expose your sides. Thrust your hands. Bloody, evidential, conclusive.
Doubt even this.
Thanks for tuning in. If you enjoyed the show, consider supporting us on Patreon at patreon.com forward slash dissonance pod. Help us spread the word by sharing our content.
Find us on TikTok, YouTube, Facebook, and Threads, all under the handle at Dissonance Pod.
This show is CAN credentialed, which means you can report instances of harassment, abuse, or other harm
on their hotline at 617-249-4255 or on their website at creatoraccountabilitynetwork.org.
