Angry Planet - A Killer True Crime Fandom & Islamic State’s Digital Caliphate
Episode Date: March 4, 2026

Things have gotten very surreal in the dark corners of the internet. AI-generated prophets are preaching jihad in Facebook groups, Minecraft servers host digital caliphates, and school shooting fandoms gather to study their heroes and plot how to beat their scores. It's a double bill on this episode of Angry Planet as two experts from the Institute for Strategic Dialogue (ISD), a nonprofit that studies and works to mitigate violent extremism, discuss the brave new world of online-born violence. First up is Milo Comerford, the co-author of a study about nihilistic violence. Then we've got Moustafa Ayad to talk about how the Islamic State is circumventing bans and pushing its message on social media.

Topics covered:
Staying sane on the internet
Violence without ideology
The Comm
764
True Crime Community
Saints culture
When fandom becomes a killing
An aesthetics-driven movement
Online and offline have merged
Moderation is impossible
You don't have to hand it to ISIS
Broken text posting
Copyright strikes and the Islamic State
Facebook professional as the gold standard
AI resurrects dead influencers
Jihad influencers
Even IS is obsessed with the Epstein files
Virtual caliphates in Roblox and Minecraft
"We must be careful about what we pretend to be."
Once again, it all comes back to 4chan
Saying nice things about twitter dot com

Further reading:
Beyond Extremism
'The Comm': The Group Linked to a Nationwide Swatting Rampage
How the True Crime Community generates its own killers

Support this show: http://supporter.acast.com/warcollege. Hosted on Acast. See acast.com/privacy for more information.
Transcript
Love this podcast.
Support this show through the ACAST supporter feature.
It's up to you how much you give, and there's no regular commitment.
Just click the link in the show description to support now.
Hello there, Angry Planet Listeners, Matthew here at the top.
I'm just going to tease what we've got.
It's two different interviews kind of strung together of the same theme, two guys from the same place.
They are gentlemen from the Institute for Strategic Dialogue.
There's a pair of reports out from them.
We're going to check in on one of our favorite topics, online extremism.
And we're going to get into some interesting things about criminal groups online and kind of how they interact.
And what's the Islamic state up to?
What are they posting about?
Are they using AI to resurrect Baghdadi and hanging out in Roblox?
Yes.
It's bizarre.
Stay tuned.
Here we go.
I was thinking about you and this report yesterday, which I'll
introduce in a moment.
As I was looking at some particularly nasty things online,
and I was thinking, like, I have a, you know, like, especially lately,
it's been, like, a lot of the stuff has been wearing me down,
like a lot of the things that I see on the internet.
And I feel like I got nothing on what you're dealing with and what you're, like,
putting yourself through.
How do you manage that?
I think we all know someone who's dealing with an even more serious version of what we're
dealing with.
So I could point you in the direction.
of people that are having a really tough time.
But certainly, I mean, it's just wild.
I mean, I'm someone that's spent the last 10 years,
you know, looking at ISIS content,
at far-right terrorist material.
But the stuff that we're going to talk about today,
you know, kind of hits different
and it's just on another level altogether.
I think there's something about the kind of deep nihilism
that kind of sits in a different way
to something that is ultimately kind of about a worldview
and a political project.
I think there's something about the kind of
total cynicism, that like mainstreamed misanthropy that really does grind in a different way
to nonetheless horrific, but certainly more kind of, in a way, aspirational material that I've
been looking at for the best part of a decade.
Say what you will about the tenets of national socialism. At least it's an ideology.
Right. Yeah. Your words, not mine.
It's the Coen Brothers' words, actually. So we were talking about
beyond extremism, platform responses to online subcultures of nihilistic violence from the Institute for Strategic Dialogue.
Sir, can you introduce yourself?
Of course.
My name is Milo Comerford.
I'm a senior director for policy and research at ISD, the Institute for Strategic Dialogue.
We're a think tank that works in counter extremism, our 20-year birthday this year, and I've been working for a long time on essentially working with governments, with
practitioners with those on the front line
dealing with extremism challenges to
help them understand how these threats are evolving
and really to analyze and digest
those darkest parts of the internet to
help shine a light and to
empower responses.
So we've talked about
on this show before,
I think that the audience won't be completely
new to this, but I think this is a pretty different frame
from what we've put on it
before. So we've mostly
focused on some of the high profile
assassinations in the
US and how they seem to have no, like, they seem to be done for the love of the game.
And we've talked about it as part of a continuity of like online subculture that is kind of built towards this moment.
This is a very different like framing of it and an understanding of it and I think a deeper understanding of like what's going on.
So at the very top, can we just define what do we mean when we say nihilistic violence?
Of course. And I think it's useful to start with those conversations about the assassinations,
because historically, if you saw that kind of violence, you know, what you might have understood
as political violence, you'd have gone around searching for a clear justification and ideology.
You'd have been hunting through, you know, say, the last five years of the online profile of
the perpetrator. You'd have been trying to do some sense-making. And it is striking that, for
the last few episodes of, you know, this kind of violence, that's been really challenging to do,
and it's required, you know, essentially a PhD in memes, in online subculture,
in understanding in-group references.
And I think these aren't just sort of aberrations from the norm.
I think this is a new direction that we're facing.
And I certainly think it's the kind of context in which this specific conversation is happening.
So when we're talking about nihilistic violence, I think there's a lot of names for this stuff.
And I think it's taken a long time for people to kind of crystallize this as a specific phenomenon.
I think we've been seeing different parts of the elephant when we've been touching its ears and its, you know, its sides and its tail.
But I think we're getting, what's coming into view now is essentially a kind of cohesive online network of different subcultures, you know, across different platforms, which are really unified by a like fundamental nihilism and obsession with violence. And what I mean by nihilism is people who are not motivated by a political project, but rather, you know, they're carrying out violence for its own sake. And it's violence that is performative. It's violence that's seeking to kind of, you know, to, you know, to,
to gain attention and notoriety among these online communities.
But it's not looking to create an end state.
It's not looking to change a political system.
It's not looking to achieve social change.
But nonetheless has a sort of cohesion to it.
And certainly a set of discernible aesthetics, narratives, and communities,
which are really important to understand and make sense of.
Yeah, one of the ways that I kind of understood it,
as I was reading the research is that, like, 20, 30, 40 years ago,
everyone knew someone, maybe someone they worked with or went to school with,
who was, like, really into serial killers.
Like maybe even wrote them letters and got like had a correspondence with them.
Maybe it was somebody you worked with.
Maybe it was like a friend's sister or something.
now those kinds of people that have that kind of attitude are like getting together and coordinating and not just writing letters to serial killers, but like making fan cams of them, celebrating it.
And in some extreme instances, like going out and carrying acts of violence out in the real world and encouraging them purely so that there are more acts of violence in the real world, right?
That is right.
And there has always been a community, as you say, around this of a sort.
It's been more than 25 years since the Columbine High School attack.
And there was a sort of odd sense of community that existed in there.
And there were films made about those who were kind of obsessed with this sort of act of violence.
So the community dynamic in itself isn't new.
What is new is, you know, the way that this is bringing together people from across the world
who have no kind of relationship with one another into parasocial, you know, relationships with these mass attackers.
And it might have been that you, you know, you were interested in these attackers because you felt you had some ideological affinity with them. Maybe it was their, you know, white supremacy. Maybe it was, you know, in the case of Omar Mateen, the targeting of LGBT, you know, communities. But in this case, it's actually more the violence itself. There's a sort of odd gamification that has happened where it's, it's about high scores. It's about comparing people regardless of what they were looking to achieve with their violence and almost seeing them as part of this kind of pantheon of mass attackers who were looking to burn the whole
system down. So there's this sort of agnosticism that I think has become a new dimension of this. And of course,
the way that online communities operate now, you know, mean that there are unique ways of bringing
together people that are part of a continuum, if you like, people that have these obsessions,
through to people that, that, you know, might be vulnerable to other kinds of harms. You know,
they might have been looking for information on other, you know,
kind of sensitive topics like, you know, self-harm, suicidal ideation. They've been
funneled into these kinds of spaces. And then, of course, within that, there is a
subset of people that are interested in replicating this and kind of doing it themselves and
becoming enshrined in this pantheon. So in that sense, you know, that this has a long legacy,
this kind of dynamic, but there are some shifts and changes that have happened in the past
four or five years that I think have qualitatively shifted, you know, the nature of this
threat. I think one thing that I'd come back to is, you know, of course, school shootings have
been a tragic part of, you know, particularly American life for a very long time,
and to a lesser extent in other parts of the world.
And I think these were previously seen as slightly disconnected, you know,
aggrieved individuals who were perhaps learning from the methodologies of others who were doing
this but weren't kind of tied together in any meaningful way.
But now increasingly you see attacks happening and you start to see a through line.
You start to see they're part of these specific online communities and they see themselves
as part of this broader fabric, online fabric.
And unfortunately now, you know, when analysts in my team who study this stuff day in and day out
hear the tragic news of an attack,
they can almost tell, you know, within 10 minutes
whether this is likely to be something that is
tied to these kind of communities.
And I think, you know, that is a phenomenon
that really requires wrapping your head around.
What are the tells?
Well, I think very quickly there is
an, you know, an online
breadcrumb trail that becomes
very apparent very quickly.
I think there's a huge self-awareness
from these attackers that this,
this breadcrumb trail is going to be pored over
very quickly. Sometimes you see people
trying to throw researchers off or throw law enforcement off and know full well, you know, the kind of
traps they might fall into. There's, there's, you know, specific attempts to kind of take people
down garden paths. And, you know, the kind of, I suppose, de-realization of these kind of attacks,
you know, over and over again, you know, they're filmed from a sort of first-person shooter perspective
or, you know, they're kind of talked about in manifestos, almost as if they are extensions of
video games. There's, you know, a lot of reference.
in those contexts that build on the examples of those that have come before that are very self-referential, that, you know, speak of themselves as part of a broader project, not a political project, but a project to sort of tear the whole thing down. So I do think, you know, the, the signs are there. And we can talk a little bit more about what those specifically are, maybe a bit later on.
Let's, let's back up, because you guys focus on three groups in this, in this paper. One of them I had never heard of,
before, which kind of shocked me.
The first, so let's, let's run through those because they're pretty distinct.
And I would say that they're all slightly different expressions of this, like this, this kind of dark through line.
Right.
So first is the Comm, which to me is the scariest.
But.
Yeah, the Comm is, is really the scariest.
And I think in some ways is the sort of the darkest timeline,
so to speak, of this version of nihilism. It comes out of essentially kind of abuse networks that,
again, have been around for quite a long time. There has been, you know, for as long as the internet
has existed. Unfortunately, there have been people that have been using it to weaponize these tools,
especially to reach kids, to groom, to abuse people. But, you know, for five years or so,
there's been, you know, specific networks that are about perpetuating this abuse, that are about,
creating kind of financial structures that can support it, making it, if you like, a sort of
self-contained ecosystem. And within the comm network, there are different elements. There are
those that are more sort of financially motivated. There are those that are more sort of nihilistic,
you know, that almost kind of are defined by their sort of attempt to burn systems down. And there
are those that are specifically looking to carry out acts of abuse and increasingly violence.
And within this, we saw a specific movement known as the 764 movement really emerged five years ago or so, which came from this sort of CSAM child sexual abuse material coercion space, but started to direct this increasingly towards carrying out acts of mass violence. So by compromising people, often very, very young people who are victims who are then turned into perpetrators, you essentially bring them in, you start to create compromising material on them. You ratchet them up.
through a kind of almost pre-established radicalization trajectory. You get them to carry out acts of
violence against pets. You get them to produce self-harm imagery. And eventually, you know,
a groomer will get them to carry out acts of mass violence, which look a lot like terrorism.
You know, they might not be ideologically motivated, but there are numerous acts across the US and
across Europe now of mass violence, stabbings in particular that are linked to this sort of
shadowy movement. So in this sense, I'd say this is the part of the
ecosystem that most resembles a kind of organized group, like you might be familiar with from
ISIS or, you know, a far-right terrorist group like the Terrorgram Collective.
But, of course, it is divorced from wanting to achieve any political end.
But it uses many of the same methodologies.
And actually, to some extent, references some of the same aesthetics and cultural reference
points as these kind of neo-Nazi movements that have proliferated online for some time.
Yeah, it's, they're interesting to me.
because they are in some ways
they feel like a criminal organization
in that they are financially motivated a lot of the time.
But like you say they may blackmail someone
and then get them to commit increasing acts of violence
and they are documenting all of that.
They're filming it, they're creating video
and then they're feeding all of that back into the groups.
So these people become like characters
in this online world.
Yeah. And certainly in terms of the kind of methodology that has been established, it means that there is perhaps more, you know, there's more evidence. There's a greater paper trail. We've seen much more law enforcement action against groups like 764 and actually a huge spike in efforts, especially in the U.S. I should say, the FBI had been sort of laser focused in the last year across all 50 states on this specific threat. And I do think, as you say, the kind of criminal nature of those networks,
means there's perhaps a bit more of an established playbook for responding to these in the same way that you might with, you know, cyber hackers, with other forms of criminality, with obviously, you know, child abuse, which has been prosecuted, you know, very strongly for a long time. So to some extent, I think this is the thinner end of the wedge in terms of law enforcement responses. And it's also notable that a lot of Five Eyes governments have come out and explicitly warned about the risk of the Comm. And there's more work across governments on this part
of the ecosystem than anywhere else.
So they are coordinating ever more closely on disrupting this highly transnational network,
recognizing that law enforcement is going to have to work together on this.
There's not going to be a kind of country-level response to this huge,
sprawling network.
I think it's something that people can get their brains around a little bit
better than some of these others, right?
I wouldn't recommend it, but you certainly can.
You wouldn't recommend it, but, like, Adolescence exists on Netflix and was popular,
and I would say has Comm themes.
And like people and like just like the financial like because so much of it is financially motivated,
it makes more sense.
One of my colleagues at 404 Media, Joseph Cox, has done a lot of reporting on the Comm.
And I remember one of the stories that he turned up was it was never quite clear how much of this was staged or really.
but some Comm members had broken into some rival's house and had tied them up and they wanted a Bitcoin address.
And they were going to shoot him up with heroin if he didn't give it to them.
And they filmed all of this.
It was like at the end of that rainbow, there's a Bitcoin address.
Like there's money.
Right.
It's not just for the sake of the violence.
Right.
So I can get my head around it a little bit more.
But like any good pyramid scheme,
the kind of money funnels up and it's kind of ultimately, you know, it really does have kind of
that Ponzi dimension in that you're trying to bring people in the whole business model works on
bringing in another rung and essentially getting those people who are victims to then become
the people that victimized the next generation. So in that sense, it's quite a classic financial
model that we all know very well and it's also fundamentally unsustainable because it relies on kind
of a growing pool of people. Unfortunately, you know, that pool is,
is very wide. And given the completely remote nature of this, and the fact that people don't even
need to be on the same continent, let alone in the same country, to compromise a victim, and there are
very sophisticated, you know, playbooks that have been designed to do so, you know, this could go even
wider than it currently is. Tell me then about how, just to make it explicit real quick,
before we jump off of this, what distinguishes the Comm from 764?
The Comm is essentially the umbrella group.
You know, the Comm is the wider network.
It includes more of these sort of, I'd say it houses the more financially motivated activity that we have here.
764, I would say, is the bit that feels closest to a kind of online terrorist movement.
So I think that's where, you know, this is the group that's more associated with acts of violence.
You know, within 764, there is this kind of notion of No Lives Matter, a kind of subgrouping that split out,
which obviously is trying to riff off, you know, the Black Lives Matter and other movements,
but really gets to the heart of the sort of fundamental nihilism of what we're talking about here.
But yeah, to some extent, the Comm is the useful way of thinking about these very broad networks,
many of which don't have any relation to, say, acts of mass violence.
764 is the very specific set of aesthetics and cultural reference points that are really about driving self-harm,
harm to others, and sort of perpetuating violence linked to the movement.
So we'll move on from that, put a pin in that, and tell me about the true crime community.
Sounds innocuous. A lot of people love true crime.
That's right. And I think there is a consciousness to that. There is a deliberate attempt to
sort of frame this as an interest group. And really the best way to understand the true crime
community is understanding internet fandom. And again, this speaks to something that
been around for a long time, you know, since the early days of the internet. This is,
we have seen communities pop up that bring together odd and unusual fandoms. And certainly there
have always been those who are interested in true crime. And there is a much broader constituency
of people, you know, of podcasts, of kind of other like sets of media, of fan fiction that are,
that are very interested in the true crime genre. There are ways that that is and isn't relevant
to this specific community. So on the one hand, it really,
really is a bit of a funnel. You know, you have this very broad set of people that are interested in
true crime and then you have a subset of obsession, which again is the case with any fandom. But in
this case, the obsession goes beyond, you know, just the kind of deep, deep interest in these figures.
It really is about lionizing serial killers, terrorists, mass murderers. And in a subset of
these cases, also, you know, kind of inducing other similar acts of violence, so sort of copycat stuff.
A big part of this is something that you might have come across before, which is this notion of saints culture,
which actually kind of has its roots in far-right online ecosystems, but it's kind of migrated over here,
which is like quite literally recasting these mass attackers as saints in the kind of pantheon
of individuals who are ranked and rated according to their, you know, their ability to carry out acts of mass violence.
But, you know, this is really, in a way, it's on a spectrum with groups like 764 in that it is increasingly
materialising into acts of real world violence, particularly school shootings.
So our analysts at ISD have identified at least 15 school shootings or plots that have been
disrupted linked to the true crime community since 2024.
But this is, of course, a subset of this wider community of people that have very active
conversations.
What's notable about them is there's a lot of self-awareness within this community around
the minority of people within there that might be inclined
to replicate the violence that they are obsessed with.
And there's some self-policing.
There are some parts of the community that say they do not condone it.
They are explicitly, in their handles, saying that we are interested and very laser-focused on
these acts, as distinct from those that want to go a step further and replicate it.
It's often, I will say, very hard to kind of navigate between those two.
And sometimes it's very easy to, we've seen cases in the UK, for example,
of mass attackers or alleged mass attackers who claim that they are just researchers. They are just
people who are part of this community. They are storing in an obsessive way information about
mass attackers out of a kind of academic interest. So of course, it's very challenging to know
where the risk is in this community, but certainly an increasing number of real world acts of
violence that are emerging from this online ecosystem.
Where are these people? And I'm not just talking about TCC. I mean, like, on what platforms
are the organization and the sharing of pictures and the conversations happening?
It's a really broad range of platforms.
I mean, TCC, as I said, it has its roots in fandom, and therefore it kind of reflects other forms
of, you know, online fandom that cross over, for example, between TikTok and Tumblr, which is a kind
of platform of real interest for different fandoms, and Pinterest, you know, which kind of tries to
capture the aesthetic of these kind of individuals, whether it's people wearing, you know, the
clothes of previous school shooters or trying to sort of capture the essence of them in some way.
But there's really a, you know, there's a set of different platforms used for different purposes,
and some of it is about a much more mainstream conversation.
Some of it is about using the functionality of platforms, you know, in terms of algorithmic
amplification to sort of, you know, to broaden the community. And in others, it's about much more
in-depth discussion, you know, communication as a community. So platforms like Discord that have,
you know, kind of server functionality are much more amenable to very kind of, you know,
focused in-group discussions of people that all understand the same cultural reference points and
so on. So it, like all of us, it seamlessly spreads across platforms and these platforms are used for
their specific functionalities. And, you know, to be honest, there's not a kind of one platform
strategy, but rather you have to maintain kind of eyes across a wide range of different services
that are playing a role in some way in this kind of ecosystem.
It seems like it's really an aesthetics-driven, well, movement is the wrong word, but phenomenon.
Yeah, I think aesthetics are a great way of identifying it. I think it's challenging when
people say, what is this and you say, well, you know it when you see it. In this case, that's quite
literally the case. You know, it's hard sometimes to digest. We're not talking about specific
symbols, you know, that you might have had with far-right terrorism. We're not talking about
logos of groups. We're really talking about, you know, a set of cultural reference points of
aesthetics and narratives that are constantly evolving. And as I said earlier, they have
considerable overlap with other adjacent online communities. So, you know, within some of
these spaces, for example, we see the proliferation of swastikas,
or kind of symbols associated with other groups like the Order of Nine Angles,
a kind of neo-Satanist group.
That's not because these individuals are themselves kind of believers in these ideologies,
but rather they're just a set of references.
It's almost a kind of stereotypically evil aesthetic.
You know, it's kind of a lot of use of blood red.
It's kind of, there's almost a campness to what you're seeing within these communities.
And I think that, you know, alongside the nihilism,
there is a kind of certain
understanding of the ridiculousness
and the over-the-topness
of the aesthetic element.
You know, it's heavy use of vaporwave.
It's drawing on broader kind of
internet subculture that will be familiar
to those who kind of grew up in these online spaces.
And it's, I suppose,
a sort of literacy in all the different types of
this aesthetic from more chan board,
kind of DIY aesthetics to the more polished,
kind of stylized stuff
that's more associated with the far right.
So, yeah, in a way, it's a slippery concept and it's hard to pin down.
But all I'll say is that when you spend a lot of time in these communities, you know when you're there.
Let's put it that way.
If we can zoom out just a little bit, why is this happening?
Why are people losing themselves to this, do you think?
I think there's a few things happening at once.
If you ask any expert, you know, it will be a kind of picking three at random of the sort of 10 or 12 sociocultural things that are happening at the same time.
But, you know, there are elements of this that are sort of subcultural in nature, you know, that there is a group of people that are increasingly despondent, essentially the kind of wider pool of nihilism, of people feeling, you know, kind of cast out from mainstream
society and finding refuge in these alternative online subcultures. There's not really
that much new about that phenomenon. That is something that is, you know, that this has always been
a haven for those who don't quite fit in. But I do think there is something to note, which is this
kind of broader alienation, loneliness and kind of sense of of failure to, to buy into mainstream
politics, mainstream society, that this stuff really speaks to and creates a kind of very resonant
context for individuals when they kind of come in and are able to provide, you know, to be
part of something that fundamentally rejects that and has a much more cynical lens. I think there's
also a context of the kind of normalization of violence within online spaces. It's, if you notice one
thing when you come into these environments, it's just how present violence is. You know, this is a,
there is something kind of totally desensitized to violence of all kinds. It's almost, it doesn't
matter what kind of violence it is. It might be a terrorist attack. It might be animal abuse. It might be
CSAM. But the kind of essentially wholesale, yeah, sort of mainstreaming of violence really
sets the groundwork for a sense of de-realization where people are not able to kind of appreciate
that there is a sort of real world consequence to, or there's not much of a jump, let's say,
between being immersed in online violence and taking, you know, taking part in acts of real
world violence. And then there's also a sort of final piece to this, which is partly about how
platforms themselves operate, which is, you know, the ubiquity of online communities and
subcultures that are, you know, on the one hand, are able to seamlessly move across different
services and platforms. You know, people just as naturally as you like are kind of jumping
between, like, you know, gaming platforms and, you know, TikTok
communities and, you know, kind of forums and image boards. But as a result of the way that
platforms are built and developed and the kind of, I suppose, the business models and infrastructure
that is really about kind of monetizing these sort of communities and particularly communities
that are high-interest, that are obsessive, and that are, you know, creating lots of borderline
activity, let's put it that way. You know, there is also a sort of basic function of this being a
slightly inevitable outcome of the direction of travel with kind of platform design, and all that
in the context of the kind of wholesale rolling back of trust and safety investment, of attempts
to, you know, really systematically address these networks in anything other than a very
ad hoc, takedown-based approach. You know, these communities are very used to takedowns.
You know, they have spent their whole lives, creating backup accounts, you know, working off burners,
knowing how to cloak IPs, but there has never really been any sophisticated effort to actually
kind of address the network dimension of this.
So they've grown up in the age of
whack-a-mole. They're very used to doing this.
But they know that ultimately,
you know, that these are going to be safe
spaces for them to be able to
engage in this harmful activity.
I've been thinking a lot lately about the Marshall McLuhan of everything, the way that the means by which, like, we create art and communicate with each other actually shapes that art and the way we communicate. And I think that what you're talking about here is some of that, right?
Like if you're in a fancam culture on these platforms that prizes the aesthetic above all else,
you know, you're going to get some weird shit.
Yeah.
There is an evolutionary element, of course, that is exciting, that is, you know, stimulating for people. Like, things move very quickly within these communities. They're highly iterative. They are, you know, user-based. There are not many kind of gatekeepers.
And I think there is something like fundamentally thrilling about the dynamism of being in spaces that are,
that are ultimately geared towards co-creation. And it's, it's multimedia, as you say, it's, it's
fan fiction, it's art, it's bits of writing. You know, I think there's a lot of excitement about
experimentation with a new generation of AI tools to kind of, you know, even further sort of promulgate this
aesthetic in a kind of, you know, sustainable way. And so while there's a lot of focus on the harm,
and, you know, that is, of course, as researchers and analysts that are worried about real world
violence, that's a big focus of ours. I think there's a failure to appreciate the kind of
fundamental appeal of this countercultural aesthetic, of, you know, the sort of haven that this provides, the kind of cabin in the woods for people when, you know, there's a kind of overall
relativism of meaning and other challenges in your life. There is a clear kind of black and white to
these communities. And I think there is a lot to be said for kind of understanding that. It's kind of
hard given one of the points of our paper was really to better empower platform responses
that are very much geared towards, you know, dealing with specific groups, you know,
tangible networks, maybe even people with a membership list.
But when you're talking about something that is much more community driven,
where finding the needle in the haystack is a lot more tricky,
where looking back retrospectively, all the signs were there,
but in real time, it's a lot harder to know, you know, where the risk is.
But ultimately, this is going to require a sort of fundamental rethink
of how these platforms are thinking about risk.
I want to end on, like, what platforms do and what your recommendations are.
But before we get to that,
I think that there's a bigger picture thing here.
I think that there's like,
there's something material going on in these people's lives
that gets them into this space.
Not to say that like TikTok, Facebook,
Twitter, etc., don't have a responsibility here.
But I think that there's two pieces of this.
And so, like, that's one part of it.
But first I want to talk about, like, how do you break through nihilism and make someone care about something other than, like, the aesthetics of serial killers?
Like what's like what do you, what do you do here outside of the platforms?
It's the hardest part of this. And again, to be glib about, say, extremist movements, you know, there is an energy to ideology-based supremacy that can be redirected. There is something, when you're dealing with people that have, like, ultimately
a constructivist worldview that wants to, yes, maybe tear things down, also build something. You know,
like they want to destroy the West, but they want to build a caliphate. They want to kind of tear down,
you know, multicultural societies, but build an ethno state. There may be something in that latter
piece that can be channeled. The challenge when you're dealing with these fundamentally cynical,
nihilistic spaces is, yeah, what do you work with? I mean, what are you fundamentally kind of piecing together?
And in that sense, I'm a real evangelist for a kind of prevention in this area. It's going to be much, much harder to reach the sort of incredibly radicalized constituencies that are on the cusp of doing something, rather than be able to empower people with spotting the signs early, being able to see some of the kind of, you know, indicators is maybe the wrong word, but certainly the kind of concerning signs that someone is moving down the wrong path with these kind of
world views and perspectives, I suppose. And, you know, you said earlier that this is about people's
real world circumstances. I mean, one of the things that we've noticed in, you know, dealing with
these kind of cases is that there are very discernible accelerants that kind of move someone from
an interest and perhaps even an obsession into someone who wants to act upon this. And, you know,
these might well be a specific trigger, you know, in someone's life. It might be the loss of a job.
It might be, you know, a sudden financial change.
It might be, you know, a relationship shift.
And these are, of course, things that are not enough in themselves to turn someone into a violent killer.
But nonetheless, they are almost uniformly present in cases of people that are going from experimentation, flirtation, into the kind of very harmful activity that we see here.
Very often, you know, one of the things that has been the case, not just for these communities, but for other forms of extremism, you have an odd pattern where you see a kind of increasing and growing participation in these kind of harmful communities, and then a sudden drop-off as they go into sort of operational security mode. They suddenly start to do the planning; they start to engage in the sort of more hardcore kind of operational dimensions of this. So there are patterns that are signals and indicators of when you're getting to particular areas of risk. But yeah, I don't want to say that this is fundamentally a kind of online problem,
although so many of the fundamental kind of identifiers, self-references, the culture that at least is used retrospectively to frame these acts of violence are so fundamental to how the online world works.
And I think if I was going to sum it all up in one, you know, the literacy required in these online ecosystems is perhaps the most important part of all.
You know, you need people that really understand what is shitposting and what is the acute harm here.
And it's very hard to kind of boil that down to a set of keywords.
You really have to know these spaces well.
Which involves, like, sacrificing some of your sanity, spending time in those places, looking through their stuff, you know, becoming fluent in horror memes.
Correct.
Which, you know, screws you up a little bit.
It's funny you say that. Like, I think there's a breaking down of the barrier between the online world and the real world that has happened a lot in, like, the last five, ten years.
And I think it's maybe hard for people our age to kind of wrap our brains around that.
But I think it's important to understand that, like, it is my suspicion that for young people that are brought up where the online world is, like, ever-present from the minute they're conscious, that separation isn't as clear as it used to be.
Especially, and I think that an under-discussed part of this is probably, you know, we just went through a global pandemic where a lot of people during their impressionable years, like that was their whole social world.
It was just online.
That's right.
And it's not exactly what we're talking about.
But when you look, for example, at the Buffalo attacker, the supermarket shooting, you know, you can see a direct correlation between how the pandemic cross-cut this attacker's life and sort of accelerated what was there before.
You know, there was certainly kind of challenges there, but it came at the exact worst time for him.
And he actually sort of documented, you know, that correlating directly with that was an obsession with the manifestos of mass shooters like the Christchurch attacker. You know, he maintained a Discord diary that kind of really caught in real time, you know, how during the pandemic, you know, that sort of total separation, that total derealization was happening. And I think this is definitely a feature that has been underexplored as a kind of
catalyst for that real sense of abstraction that comes from this. And I'll say that it's two-way. You know, it's seeing the real world violence that some of these individuals were going to conduct as being ultimately a kind of odd sort of non-real, perhaps kind of gamified scenario,
but it's also making sense of the world
through kind of online culture.
You know, what you see and hear around you
is kind of refracted back through the lens
of the sense making of these communities.
So it's unfortunately not just about people
that think the offline is online.
It's people that make sense of the offline
through the prism of online.
Yeah, kind of almost like an online first worldview.
Correct. Yeah. And that's not unique to these individuals. There are plenty of mainstream
politics that have that same perspective.
Unfortunately, as above, so below, right now, and maybe forever.
Unfortunately. Okay. So the other part of this, and there's a large part of the paper about this, is the responsibility of the places where these conversations are being held, where people are posting their saints, where the CSAM-like generation is happening. Like, what do these platforms need to be doing that they're not doing right now?
So just to start, we said earlier this isn't entirely a platform problem.
I mean, this has to really be something that is thought about by policymakers, by law
enforcement, by practitioners, you know, there are those on the front line in communities who
are doing the social work, the educators.
It's not on them, but certainly being able to understand these elements and know what the risk factors are is going to be a crucial part of this. While these individuals are very online, you know, there is a kind of whole-of-society, to use a glib term, dimension to this.
But certainly when it comes to platforms themselves, you know, all the evidence is that so far this type of threat, which is becoming increasingly, you know, understood, I think, by these platforms as a kind of real risk area that's growing, has been seen essentially as an extension of efforts that were built to counter terrorist content, you know, over the last 10 years.
You know, a whole set of systems and frameworks were built up to deal with ISIS, with bolted-on later efforts to deal with far-right terrorist groups, but these are ultimately geared towards understanding an ideologically based threat from a specific group, rather than understanding that we're really talking, as we've already spoken about, about violent cultures, aesthetics, and actually different kinds of harm.
We're not just talking about people getting radicalized into terrorism.
We're talking about child abuse.
We're talking about kind of scams.
It can almost go off in any number of directions.
So the way that we frame the response to this is this has to be multi-level.
So, first of all, you have to think about the way your platform is creating vulnerabilities. You have to basically come up with safeguards that you yourself are implementing as a platform.
What are some of the ways that you can bake safety interventions into your tools?
Can you be providing better resources to individuals, you know, kind of providing them links out to, you know, vetted, you know, kind of community-based help, or education campaigns around these kind of threats? And there's a lot of great material that is coming out there. And rather than just, you know, thinking about this as taking down bad
content, how do you bake into the actual systems of the platforms themselves, you know, kind of safety
by design, essentially. The second is about community level interventions. These are fundamentally
communities. And as I mentioned earlier, in the case of the true crime community, there is a sort of funny self-policing that happens. You know, there are people in these communities
that are concerned about the reputational damage that these individuals are doing to the wider
fandom. So in this case, can you be working with moderators, you know, with community moderators in these areas, to help them to understand, you know, what the risks are, to provide more credible, kind of, you know, I guess community-led interventions, ultimately using the same reference points, the same, you know, language, into these harmful spaces? Can you get people to engage with vulnerable individuals that are from the community themselves?
And I think that that's, again, that's something that platforms can be thinking about with
moderators in places like Discord servers.
You know, there are very clear, obvious people to work with on these kind of issues.
Same with subreddits and so on.
The way these platforms are built empowers naturally these kind of individuals.
And I think they are totally underutilized so far.
There's a whole sort of, you know, set of approaches to disrupting these ecosystems that, to be honest, we've really gotten nowhere with so far.
So when it came to terrorist content, there was some effort towards cross-platform work to tackle terrorist content that everyone could agree on, though this has kind of, you know, hit a bit of a brake in the last year or so. So there was a database hosted by the Global Internet Forum to Counter Terrorism where platforms would kind of come in and flag terrorist content for their peer platforms, to be able to identify it quickly and be able to respond in a more joined-up way, recognizing that this is a cross-platform challenge.
We've not seen any of that kind of coordination for these threats, but ultimately, you know,
there should in theory be the ability to share much more closely the signals, the evolving
threats, the risks, so that you can track an individual across these platforms or you can
understand a new publication that is being circulated widely across these platforms.
But that hasn't happened, and we need to be thinking much more creatively, like at a sectoral
level about that.
And then the final thing that we've been thinking about is essentially about what countercommunications
look like. So again, when you look at terrorism, there were efforts over the past decade to
reach people with positive messages, with counter-narratives. Some of these were effective. Some of these actually did the opposite, and they kind of drove people further into the hands of these movements.
When they were illiterate in the communities that they were engaging with, they were actually doing
more harm than good. But certainly there is room for much more sort of strategic efforts to reach
in and provide, you know, kind of more authentic content that can actually engage with the types
of humor and aesthetics that are part and parcel of these spaces. They're not going to de-radicalize
people. They're not going to be able to kind of come in and tell them about why this ideology is a bad
idea. But what they are going to be able to do is provide positive alternatives, and reach people, you know, in kind of ways that might resonate with, you know, their realities.
So I think learning from what didn't work from a generation of strategic communications efforts is also going to be a key part of the response.
And this is hard right now because we're in an era where the platforms aren't super bought in anymore at the moment, right?
There is kind of a retreat from the idea of moderation at all.
I think in part because there's like a free speech maximalist ideological thing going on,
but also, like, it's hard and expensive, and it's not something that companies want on their bottom line, right?
That's right.
There's definitely a kind of crunch point that has been reached with content moderation. It's certainly been framed as essentially a kind of speech-balancing issue. But the thing people forget is that these movements are very often about kids.
And when we're talking about kids, there's a whole different set of calculations.
And there's actually a lot of bipartisan interest in this area.
I've been really struck by how it really has cut across the aisle in the States. It has been pretty unified as a set of risks that everyone can kind of agree on. It's been an area
that the new administration in the US has really doubled down on as an area of violent extremism.
This idea of nihilistic violent extremism is a new category of violent threat that has had a lot
of interest. And so I do think there are ways of going with the grain on this stuff that can
circumvent this kind of zero-sum approach to content moderation, censorship, etc. We're talking about
incredibly harmful predatory behaviors, the vast majority of which impact on minors. And I think that focusing on those highly uncontroversial issues and just surfacing how high-harm this stuff is, is going to be a really important part of the sort of political
backdrop to this work. But I think there is no reason why this should be seen as kind of
politically contentious. This stuff is, frankly, the worst of the worst. And a little could go a long
way in terms of incentivizing platforms to take much more action.
Everybody, well, not everybody.
Adults hate Roblox for a reason, right?
And a lot of it is because stuff like this keeps surfacing there.
So do you have any, what are you watching for the next year, the next two years?
What are you looking for these platforms to do?
What are you watching in these communities as they evolve? Great question. Ultimately,
the groups that we've talked about today, the true crime community, 764, you know, these are,
these might be kind of mayflies that disappear as specific groupings for the moment, but I have no
doubt that these broader communities of nihilistic violence will continue to evolve and
grow and expand. It's almost baked into the kind of expansionist business model that they have. I think one of the things that we shouldn't do is essentially
just bolt on things like 764 and the true crime community as kind of, you know, the next
generation of terrorist threat, the next ISIS. I think that's a fundamental misunderstanding of
what they are and of what the risks are and you risk elevating as many people as you actually
reach with that. Instead, you know, I think what I'm really looking for is for platforms to be
taking a much more sort of culturally informed approach to dealing with these issues that is about
asking really tough questions about the ways that their functionality and business model might be
perpetuating it. I don't want this just to be trapped in a sort of content moderation mire
where this is really just about, we've removed 99% of this content before it was flagged by anyone.
That was the previous paradigm. That was, you know, that was efforts to deal with official group
propaganda and the recruitment and radicalization of people through that mechanism. That's a fundamental
misunderstanding of what we have here. And I think we have a window, you know, where these groups are
incredibly harmful, but still, you know, on the smaller side, to be able to, on the one hand,
get a full handle on how they are networking across platforms and to address that, but also to
educate people. Fundamentally, I want to get the word out there. And that's why I'm really excited to speak to folks like you, to get people understanding what the risks are so that we can reach parents. You know, you don't want to scare people. You know, the internet has always been a scary
place. But I think in this case, being able to tell the difference between someone, you know,
pottering around on Roblox and someone who is being actively groomed into a violent subculture
is going to be really important. So that's where our work at ISD is very geared towards sort of
resources for caregivers, for practitioners. And I think that's going to be a really crucial part of
this as well. So we're going to be doing a lot of work on the platform piece, but it's also going to
require, you know, everyone kind of learning a little bit more about the horrors of these spaces,
as well as governments, you know, really thinking about prevention in a much more sophisticated way.
You know, the systems and structures need to be there to reach people who are at risk early
because it's going to be even more challenging to bring people back from the brink than it was
with someone who was being, you know, groomed by a far right terrorist or was about to join ISIS in Syria.
We are dealing with a kind of very, very risky context.
And so let's learn the lessons from what worked and what didn't from that generation of threat.
And, you know, actually get ahead of this this time around.
It's fascinating because it feels like in terms of moderation and fighting against these things online, you're always fighting the last war.
And I hope that we get some strategies together that work a little bit better this time than before.
I mean, things are already out of hand.
You know.
Sir, thank you for coming on and talking us through this.
Thank you so much.
Great to be here.
So one of the things I think is kind of just as an interesting lead-in is that I remember like 10 years ago.
Yeah.
Having the sense that Islamic State was pretty good at this in general, right?
They kind of like from the beginning were pretty good at their propaganda operations.
Yeah.
Seems like they still are and have adapted.
Yeah.
I mean, I tend to try to stay away from saying that the Islamic State is good at anything.
Fair, you don't want to be that guy.
Fair. They are very adept at, uh, exploiting platforms, spreading messages. I mean, it's been 10 years plus of it, and they continue to be, and have always been, early adopters of technology. So, like, it's not a new phenomenon, it's just an increasingly growing capability that has spread and diffused across supporters in different parts of the world.
Yeah.
What has changed in the last,
I mean, really since the Battle of Raqqa,
but like, especially in the last year or two,
what's new that you're seeing?
So one of the things that we're seeing, like, that's new, at least for this year, were higher-than-normal, for someone who monitors the space on a day-in, day-out basis, um, engagement metrics with the content. So we saw much higher-view content, and especially in languages that aren't necessarily, um, the most moderated, I suppose, or at least that, from a standpoint of the platforms, I think, don't necessarily have the best in terms of moderation capabilities.
Again, don't quote me on that.
That's what I think the problem is.
There's always been an issue with Arabic.
And the way supporters have gotten around that, and some of the support outlets have gotten around that, is, A, using a method called broken text posting. It's like a toolkit that they essentially have for getting around moderation specifically, where you place either a number or a punctuation mark in the middle of a word rather than typing out the word. And these are keywords that they know are there, like, for instance, kill, bomb, derogatory terms for Shia. That sort of thing they'll use.
One outlet also developed a 47-emoji keyword list that you could use in posts to describe some of this stuff in order to get around moderation.
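The "broken text posting" trick described here is easy to illustrate: a keyword filter that matches tokens verbatim misses a word with a punctuation mark or digit inserted mid-word, while a filter that normalizes tokens first does not. A minimal sketch in Python; the blocklist and example post are hypothetical illustrations, not drawn from the report:

```python
import re

# Hypothetical blocked keywords (illustrative only).
BLOCKLIST = {"kill", "bomb"}

def naive_filter(text: str) -> bool:
    # Flags a post only if a blocked keyword appears verbatim, token by token.
    return any(tok in BLOCKLIST for tok in text.lower().split())

def normalized_filter(text: str) -> bool:
    # Strip digits and punctuation inserted *inside* each token before matching,
    # so obfuscations like "ki.ll" or "bo1mb" collapse back to the blocked word.
    for tok in text.lower().split():
        stripped = re.sub(r"[^a-z]", "", tok)
        if stripped in BLOCKLIST:
            return True
    return False

post = "they plan to ki.ll the guards"
print(naive_filter(post))       # False: the obfuscated word slips past
print(normalized_filter(post))  # True: normalization catches it
```

Real moderation systems are far more sophisticated than this, but the asymmetry is the same: inserting one character costs the poster nothing, while a purely keyword-based defender has to anticipate every variant.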
So they've learned over time and they continue to adapt those learnings to,
as they call it, the media battlefield.
What are some of the other languages that are hard for Western companies to moderate?
I think Pashto, Urdu, a lot of South Asian languages, which is weird because you'd think Hindi was well covered, but it's not.
But Bengali, some of the Russian, isn't as much. Arabic, for some reason, even though it has been a key language across all these platforms. You had the Arab Spring, you had the fallout of the Arab Spring, you had everything there. It still is not picking up primary Arabic content, and especially terrorist content, like terrorist support content, to be more specific.
Why do you think there's that hole in the moderation for these companies?
I'm not positive, but, well, I think trust and safety teams have been rolled back over the past few years, so that's one aspect of it.
I don't know how much of that has hit actually dangerous orgs and individuals, which typically this falls under.
The other part of that is a lot of this is outsourced to third-party companies who aren't necessarily experts in understanding if a piece of content came from the Islamic State.
So an Islamic State supporter or an outlet has more of a chance of succeeding and getting its content to stay on a platform if they, A, embed it in another piece of content. So, like, we've seen videos that have been branded as BBC or Fox, you know, like, in order to get around the actual logo of the Islamic State being on a piece of content. Or they'll take snippets out of context and use those snippets along with a support post, with no violence in it, so you might not necessarily understand that it came from a specific Islamic State video.
So I think part of it is that those third party companies aren't as good.
And I'm not sure AI is picking up on simple snippets or alterations in videos.
So we've seen a lot of that as well, where you do bookending, where the intro to a video is something completely innocuous, then the middle is the entire piece of Islamic State content, and then the end is, again, another piece of innocuous material.
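A toy sketch of why this kind of bookending works against a scanner that only inspects the opening of a video: if the violating material sits in the middle, a check of the intro passes, while sampling frames uniformly across the whole runtime catches it. This is an illustrative model, not how any particular platform's classifier actually works:

```python
# Toy frames labelled by a hypothetical upstream classifier:
# "ok" = innocuous, "flag" = violating. A bookended upload hides the
# violating material between innocuous intro and outro segments.
video = ["ok"] * 30 + ["flag"] * 120 + ["ok"] * 30

def scan_intro_only(frames, n=30):
    # Cheap scan that only looks at the opening seconds.
    return any(f == "flag" for f in frames[:n])

def scan_uniform(frames, samples=10):
    # Sample frames spread across the whole runtime instead.
    step = max(1, len(frames) // samples)
    return any(f == "flag" for f in frames[::step])

print(scan_intro_only(video))  # False: the intro is clean
print(scan_uniform(video))     # True: mid-video content is caught
```

The cost trade-off is the point: scanning every frame of every upload is expensive, so any shortcut the scanner takes becomes the seam the uploader exploits.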
It's funny because I see the same kind of thing on, like, YouTube to defeat, like, copyright stuff.
Correct.
They'll, like, upload an entire season of Cheers, but it's mirrored and, like, warped just enough that it beats the bot, right?
Correct.
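The reason a mirrored or slightly warped upload beats exact matching is that byte-level fingerprints change completely under any pixel change. A toy sketch, with a 4x4 grid standing in for video data; real matching systems use perceptual hashes (GIFCT's shared database uses formats like PDQ) precisely because they tolerate small edits, where exact hashes do not:

```python
import hashlib

# Toy "frame": a 4x4 grayscale grid of pixel values.
frame = [
    [10,  20,  30,  40],
    [50,  60,  70,  80],
    [90, 100, 110, 120],
    [130, 140, 150, 160],
]

def exact_hash(grid):
    # Cryptographic hash of the raw pixels: any change alters it entirely.
    flat = bytes(px for row in grid for px in row)
    return hashlib.sha256(flat).hexdigest()

def mirror(grid):
    # Horizontal flip, the kind of trivial edit used to dodge matching.
    return [list(reversed(row)) for row in grid]

original = exact_hash(frame)
mirrored = exact_hash(mirror(frame))
print(original == mirrored)  # False: one flip and the fingerprint no longer matches
```

Defenders can pre-compute hashes of common transforms (flips, crops, re-encodes), but the space of cheap edits is effectively unbounded, which is why exact-match databases alone lose this game.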
And this is, like, learnings from takedowns. So, like, when you talk about takedowns, yes, takedowns are good sometimes, but we're perpetually playing, as it feels like we say almost every time this stuff happens, whack-a-mole. You know, like, that doesn't solve the problem.
You know, Europol does these big take-down days, and they're effective to some degree.
But the fact of the matter is, is the Islamic State is spread across an expanse of different platforms and messaging applications.
So if you take them down en masse in one place, they're able to shift operations to another place, wait it out, and regenerate on that platform once the takedown threat is done.
And it's not like you're dealing with an average user.
You're dealing with a user that's determined to, one, spread their ideology and then, two, exploit your platform to their own ends.
Which are intricately tied, of course.
Why is Facebook a central hub?
It's a central hub in the fact that it appears, and remains, a central hub. And I say that at the same time recognizing that Telegram continues to be a similar point of centrality for them. It is, especially for those in languages such as Arabic, Urdu, Pashto, still an important
platform. It's still a platform that is used on a daily basis. It similarly allows interactions between
outlets, and these are unofficial support outlets. They are not necessarily tied directly to
the Islamic State, but do support the Islamic State.
It allows that give and take between users.
They're similarly able to exploit all of the various functions there to their own end.
So we saw users turning on professional mode, which appears to be, like, an amped-up profile version of yourself that allows you to potentially monetize content and gives you a dashboard to understand your engagement better. All of those things really work for a quote-unquote media
mujahid, a media fighter, who would want to know how to refine messages, want to know who's interacting with their messages, how far the spread is. We've even seen them post screenshots of their metrics following a piece of content that they put up, bragging about their impressions. And so it's that sort of thing.
But in terms of it being a central hub,
it allows for a lot more interconnectivity
rather than these siloed channels and groups
that you would find on a messaging application.
It also gives them access to a full meta suite,
which is essentially Facebook, Instagram, WhatsApp,
and they can all nicely wrap it in there.
We also know that there is,
there are specialized support groups that dole out accounts. So there used to be a group called Bank al-Ansar, which is literally, like, Bank of the Supporters, that advertised its ability to give you Facebook accounts, at the time Twitter accounts, now X,
and other social media platforms,
including digital phone numbers,
so you could set up your own accounts.
And a lot of times those were either accounts that were paid for in some shape or form, or they were hijacked. At one point, they were doing this thing where they were taking digital phone numbers, running them through "I forgot my account details" flows, and then locking out existing users, sliding into their accounts, and then ultimately changing what the profile looks like. So all of those sorts of functionalities, and essentially the full Meta suite, make it a central hub for activity.
We also find a lot of outlinking to other elements of the Islamic State ecosystem. So whether it's Telegram groups, websites that are linked to the Islamic State and its various support arms, or WhatsApp or Instagram, or any other sort of element of the ecosystem itself.
Tell me about resurrected AI influencers.
Yeah. So, ideologues that were part of the Islamic State who are now dead, what they'll do is they'll take the image of said person and put them into, I'm not positive which tool they're using, and they're essentially creating these videos where the person is now moving, appears to be talking, but has been dead for a very long time. One of those people is Abu Bakr al-Baghdadi. And he's brought to life in these videos again. He isn't saying anything. He's just moving and talking.
So in a way, it's a sanctioned version of using AI for a, quote-unquote, beloved leader, right? And they've done this by taking him outside of a context that he was in, in a video, and placing him into, like, a meadow surrounded by, like, beautiful flowers and things of that nature. It's like paying homage, essentially.
You say that he's not saying anything, but they have him talking.
Like a voiceover, like from one of the speeches, that they would run over it.
So they're not putting words in his mouth, but they'll make him move, and then they'll put one of the things that he actually said, like, over the image.
Correct. Correct.
It feels like that takes more time than just posting a video in which he was talking.
Yeah.
But again, yeah, that's, yeah, it's a little strange, I think.
Yeah.
But again, some of these circles are strange.
I mean, we even saw, like, and this is more recent, some of these Facebook, what we call them in the report, influencers, because they're individuals who have specific avatars that promote the Islamic State. They're not linked to support groups or outlets, but they're essentially where everybody knows to come, based on an avatar that is consistently used by the same person, or whoever's behind the account, and hence has influence over the community. And so we've seen, like, other AI usage, well, the use of AI content.
I don't know if you've seen this photo of Bill Clinton and Donald Trump, like, in bed as part of like Epstein files release.
It's clearly fake.
However, we've seen them use that content and talk about, you know, Western degeneracy.
So in some ways, there's always some sort of use of AI in some of these posts.
But the original ones, the ones that are purely Islamic State support, are ones like what I described around Abu Bakr al-Baghdadi, or
taking written content, essentially getting an audio version of that, and turning it into a piece of multimedia in a different language.
So you'll take an Arabic piece of content and turn it into English.
We've seen that.
Yeah.
Even Islamic State is talking about the Epstein files, huh?
Yes, 100%.
100%.
And I mean, part of that is also they want to throw shade on the new president of Syria,
yeah,
who, I believe the UN noted, has had several assassination attempts on his life over the past year by ISIS affiliates or ISIS itself. And they want to sort of show that link, that this guy betrayed us completely, betrayed the ethos of being a jihadist, and is now with these, for lack of a better term, pedophiles, which is exactly how they would describe it.
Yeah.
Tell me about Roblox and Minecraft and virtual caliphates.
Yeah.
So this has been going on for, like, five years, technically.
But it feels like it's got more and more users that are engaging with it or creating content. But essentially, they're creating these virtual worlds that mimic the Islamic State's caliphate, literally calling it things like Wilayat Roblox, or the Province of Roblox, or they'll give it a foreign-sounding name in Roblox. And they will also completely mimic the video styles of well-known Islamic State videos using Roblox characters.
This includes faux executions.
It includes Arabic and English voiceovers in the same cadence as an Islamic State narrator of one of these videos or spokesman, so to speak.
And then similarly, they'll have maps and the like that show their encroachment on, or their expansion across, various Roblox or Minecraft territories, and we've seen the end results of those in videos.
They're often tied to discords where a number of users are creating this content.
They always claim that it's fake or a LARP, but a lot of the mimicry is so prevalent, like, you've had to have watched the content.
We must be careful about what we pretend to be.
Yeah, exactly. Yeah.
And a part of that, I mean, there's literally, the Flames, I think everybody knows certain Islamic State videos, like the Flames of War series or Inside the Caliphate. Those videos are mimicked to a T in these video game worlds and then reposted as videos.
And they're violent, and it's terrorist support, right, in those videos. And to see them in these video game skins is odd, to say the least.
Yeah, there was something really odd and perverse to me about the nostalgia for these, like, 10-year-old videos. Re-contextualized into Roblox and Minecraft, it's very odd.
There were also, it's not just the Islamic State, but even the precursor, there was a group called Jama'at al-Tawhid wal-Jihad before it, that was very Iraq-focused. The videos that they created for that group had a sepia tone to them, and they looked grainy. The footage itself looked aged. There's even that level of detail and care that's been applied to older, much older iterations of the Islamic State, which is also weirder for, like, a 12- or 13-year-old to be engaging with.
You know, born too late to participate in jihad, born just in time to participate in jihad in Roblox.
Very strange.
Can you walk me through, well, let me ask you this: explain the flock of birds metaphor.
Yeah. So
when a platform is targeted, let's say Telegram is targeted, and a mass of accounts and channels are taken or stripped off that platform, what ends up happening is that that community shifts to another platform. This happened in 2019, when they shifted temporarily to TamTam, or another messaging platform. What it allows them to do is evade predators, like a flock of birds, and essentially wait it out, rebuild, and then repopulate. It's something that they continue to do. And it similarly happens when there is a new platform that they could likely exploit.
Back in 2020, I think it was 2020, when Gettr was launched, a call went out on Facebook to Islamic State supporters from one of these influencer accounts that essentially said, Trump has a new platform, we should exploit it. And then I think roughly 200 accounts were set up in a span of, like, three or four days across Gettr to pump out Islamic State content.
So it's that sort of like flocking that can also be done.
It's funny.
It really reminds me of the way 4chan targeted harassment campaigns operated like a long time ago.
Yeah.
It's very similar models.
They call it a raid.
Yeah.
Yeah.
So I don't know if 4chan used to refer to it as a raid.
I think they did actually.
They did.
Yeah.
Back in the day, they would do, like, a raid on, like, Habbo Hotel or the penguin game, I can't remember what it was called, and kind of flock in and do a bunch of messed-up stuff and generate images for the board, and then get kicked out and go somewhere else.
Yeah, there are specialized Islamic State support raid groups that do that sort of thing, where they put out a post and in the comment section are targets. Those targets are typically other Facebook pages, like those of news outlets or their enemies. And they'll comment-bomb the links with content or Islamic State messages as part of the raid.
How does this translate into any kind of real-world action? Or is it in lieu of real-world action, done for the love of the game?
It's done very much for the love of the game. It's done for the fact that, I might not, as a user, be able to participate in the physical jihad, but I can participate in the electronic jihad. And so it's part of that. And it very much goes back to that Inside the Caliphate, Part 8 video, where they say, if they close one account, open up three.
That is an ethos these guys live and die by.
And that's how they continue to function.
And that's why takedowns will never really address this problem, because they'll find ways to get around the moderation, and they're persistent.
So what are your recommendations?
So I think, you know, takedowns are necessary around a lot of this content.
We can't get around that.
That's part of it.
I think another part of this is that we need better coordination across platforms.
There are a number of mechanisms that we can use to do that.
One of them is the GIFCT, and they do great coordination around platforms dealing with terrorism content across the broad spectrum.
The other part of this is that the platforms, while they release data on the takedowns that they do, we have no idea, the transparency around that isn't like, we took down this many Islamic State videos, right? Or Al-Qaeda videos, compared to other groups. We have no idea there as well. I think we could use better transparency around that.
And then in terms of recommendations, with a lot of these accounts, there needs to just be better moderation of under-moderated languages, in terms of being able to pinpoint this. And we're not talking about content where there's a gray area. It's very clearly branded, it's very clearly branded Islamic State or an Islamic State affiliate: support for violence, support for the killing of minorities, celebration of bombings and the pillaging that is happening in sub-Saharan Africa. And it's just a shame that those languages aren't getting the same safety as others maybe are on these platforms. So it's essentially creating better mechanisms, I think, around this.
We definitely need alternative narratives in this space, but it's only one part of it. And I'm less confident in our ability to actually get these guys, if the goal is to switch sides or stop what they're doing, it's probably not going to happen,
because they're determined.
And it's relatively the same set of accounts
and users that are high profile
in which those communities get built around.
And even when you do target them,
again, they pop back up.
One part of that is also, like we noted in the report, there are essentially these gray news sites or alternative news outlets where, in a closed space, they'll profess support for the Islamic State. And all their news is essentially, if you're familiar with the Islamic State news bulletins, the text stripped out of those and then repurposed under their own brand, as Global News or Worldwide News or you name it. And that's all they do, which is a play. It's literally a way to get around, again, moderation, as these outlets.
Now, I'm not positive how best to essentially verify independent news sources.
And I don't know if you want to get the tech companies doing that necessarily.
But that's the loophole that they're exploiting.
They're exploiting a loophole where you can't verify that this is a media source.
It has media source branding.
It claims to be a media outlet.
But it is only putting out one kind of news, which is, we did this many attacks in Mozambique today.
And in it, there'll be a celebration post as the first comment, but not the actual comment.
And again, it takes a bit of expert moderation, rather than just outsourcing moderation to third parties, and flagging them so that when they do regenerate, you're able to identify them, even with new names, so to speak.
Moderation is such an enormous problem.
I imagine a lot of this is going to end up getting outsourced to automated systems.
Yeah.
So, yeah, I mean, it's a pernicious problem, but I don't know how these tech companies are going to solve it.
I imagine that for the next few years, they're not going to worry about it too much, unfortunately.
No.
And I mean, the thing is that these communities have existed on these platforms now for about 10 years in some shape or form.
You know, they've been targeted, mass targeted.
I think, you know, a lot of people talk about the failings on X, but X is actually one of the better platforms at getting Islamic State content off rather quickly, somehow.
It's interesting.
How?
Do you have any idea?
I have no idea. I don't know what sort of tools they're using to identify it, but it seems to be working. We're not finding as many accounts as we used to find on that platform. And again, we're only using a sample here. It's 500 accounts and channels across about nine platforms. But we can tell that this ecosystem is much larger, because singular accounts that are pumping out actual Islamic State material have tens of thousands of followers, or have 6,000 followers here and there. And that adds up. It could be the same users. They could be bought accounts. Who knows? But at the end of the day, the ecosystem itself hasn't been, like, right-sized, and we don't get the transparency reports to understand its size. It's more about just numbers of things that have been taken down.
And even the numbers, the percentage, if I remember correctly.
So it's not as if we have a good idea of just how big this is and what the ebb and flow is,
essentially, in terms of how this specific problem is being addressed.
We're going to be publishing one every year.
We're going to be doing the next one on Al-Qaeda, because everybody's forgotten about Al-Qaeda, but they still exist.
Yeah, I would be interested to see what they're up to.
Yeah, in a very boomer fashion, essentially.
Yeah, they're kind of behind the times, I would imagine, yeah.
They still use this platform called Chirpwire.
I don't even think I've heard of that.
It, I think, came out of South Asia. It's a faux Twitter that was set up. And essentially, AQ populated that thing with all of their various media outlets.
They still use it.
It's very glitchy, very buggy.
That's funny.
Yeah.
All right, sir.
Thank you so much.
Of course.
Thank you, man.
That is all for this week.
Angry Planet listeners.
As always, Angry Planet is me, Matthew Gault, and Kevin Knodell.
It's created by myself and Jason Fields.
If you like the show, please go to Angry Planet Pod.
and sign up.
You'll get early versions
and commercial free versions
of all the mainline episodes.
I'm also writing a thing over there,
kind of chronicling
just something I see every day
that upsets me.
And just trying to kind of purge that
from my system,
calling it Life During Trump Time.
So if you want to read that,
go to angry planetpod.com.
We will be back again soon
with another conversation
about conflict on an angry planet.
Stay safe.
Until then.
