Factually! with Adam Conover - Online Extremism, White Supremacy and the Myth of the “Lone Wolf” with J.M. Berger
Episode Date: November 6, 2019. Author and extremism researcher J.M. Berger joins Adam this week to discuss the surprising similarities between white supremacist and Islamist extremists, the ways extremist groups frame their hatred to seem more appealing, and what social media services need to do to stop its spread. This episode is sponsored by KiwiCo (www.kiwico.com/FACTUALLY). Learn more about your ad choices. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down a little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You got to get one of these for yourself. And get this: for the month of March,
Bokksu has a limited edition cherry blossom box and 12 month subscribers get a free kimono
style robe. And get this: while you're wearing your new duds, you're learning fascinating things
about your tasty snacks.
You can also rest assured that you have helped to support small family run businesses in
Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight
to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com.
I don't know the way, I don't know what to think
I don't know what to say
Yeah, but that's alright
Yeah, that's okay
I don't know anything
Hello everyone, welcome to Factually. I'm your host, Adam Conover. And today, let's talk about white supremacy.
I know, I know it's a tough one, but, you know, based on the events of the last year or two, I think we got to do it.
In 1992, a white supremacist named Louis Beam published an essay called Leaderless Resistance.
His big idea, which had a lot of currency in the pale and pasty circles of white nationalists at the time, was that white supremacists should take action on their own or in small groups
without taking orders from any leadership. See, he figured that if white supremacists used any form
of formal hierarchical organization, it would be too easy for the government to come in and bust it
up, cut the head off the snake and the body will die, right?
So, instead of snakes, leaderless resistance advocated that white nationalists should be more like worms
Or, I guess, even more like worms
Either way, it's fitting
Instead of there being anyone in charge, leaderless resistance advocated that there be cells
Which coordinated loosely with each other
Sharing information through newspapers and computers so that if one cell was divided or severed, the rest would survive.
And Beam hoped that so many of these cells would form that the FBI wouldn't be able to keep up.
But luckily for the rest of us, leaderless resistance didn't really work. You know,
it turns out that without leaders, it's kind of hard to organize
stuff and grand acts of white supremacy terrorism can be tough to plan. You know, if no one's above
you cracking the whip and sending out e-vites, it's pretty easy to just wake up at noon and
find yourself scrolling terrorist Twitter all day. So the result was this lack of leadership
actually ended up weakening the white supremacist movement. There
are actually relatively few major leaderless attacks. And, you know, some cite the Oklahoma
City bombing in 1995 as an example of leaderless resistance. But the fact is, Timothy McVeigh was
hardly unconnected to other people in his movement. He was in touch with a variety of extremists as he
planned the attack. He was even associated with Louis Beam himself. He wasn't a
lone wolf, he was one member of a pack of shitty hyenas. And while Islamist extremists picked up
on leaderless resistance in the 2000s, rebranding it Leaderless Jihad, a lot of supposedly leaderless
attacks were directed by a group like Al-Qaeda, or the FBI pretending to be Al-Qaeda, where they
would pick people up in elaborate sting operations.
So for a long time, it seemed like Beam was thankfully wrong
and that leaderless resistance didn't really work at all.
Until recently.
In the 2010s, it suddenly seems like Beam's prophecy has actually come to pass.
Horrifying leaderless extremist attacks like the one in Norway, Charleston, El Paso,
the Pulse nightclub, and San Bernardino
have inspired waves of further leaderless attacks.
Mass gun violence by extremists acting alone
has actually become a global phenomenon.
So what changed between then and now?
Can you guess?
That's right.
It's the same wonderful invention that's
allowing you to listen to my dulcet tones right now, the internet. Social media has created the
potential for just the sort of loose extremist networks that Beam envisioned. ISIS sympathizers
can now use consumer video editing software to make Hollywood-grade propaganda videos and pop
them up on YouTube no matter where they are in the world without needing direct leadership.
And online forums provide fertile ground for white supremacists to trade notes, discuss
past attacks, and share their plans for new ones without the need for a hierarchy.
Just this year alone, three major white supremacist attacks were linked to the forum 8chan.
And after the El Paso attack, 8chan users referred to the attacker as, quote,
our guy and praised his body count, inspiring future acts of violence.
It's difficult to face, but the fact is the internet,
this revolutionary new communications tool that has given all of us
new ways to share information and understand each other,
and that's empowered formerly marginalized voices, has also empowered the people who
seek to hurt our most vulnerable. And they have been able to use the internet to connect,
recruit, and activate extremists across the globe on a scale we have never seen before.
And that's terrifying. But understanding this phenomenon is key to learning how to stop it.
Well, to discuss the online ecosystem of extremism, our guest today is J.M. Berger. He's the author of
the book Extremism and a research fellow with VoxPol, an EU-funded academic research network
studying online violent extremism. Again, this is such a difficult conversation,
but he walks us through it so well.
Please welcome J.M. Berger.
J.M., thank you so much for being on the show.
Thank you for having me.
So you study extremism for a living.
That's a fun gig.
Yeah, I'm a lot of fun at parties.
Well, so tell me, what is the phenomenon of extremism and why is it so important that we understand it?
So the definition of extremism is highly contested.
People really disagree about it a lot, and a lot of the definitions that are out there are in the service of power.
So if you're the regime that rules something, extremism is the people who are against you.
So what I'm pushing for is a definition that's more objective. And the definition I use,
there's two terms you need to use that I'll use over and over again. And one is in-group,
which means the group to which you belong, one person, you know, to which one belongs,
and the out-group, which is anybody who's not in your group. So to be an extremist, you have to believe that the health of your in-group
is inseparable from taking some kind of hostile action
against an out-group.
So it's not conditional.
It's not, you know,
we're having a dispute over the border,
we're having a dispute over business
that we can resolve.
It's a condition of your success
is that somebody else has to pay for it.
And it's a zero-sum game in that way, that you, if you're going to win, the other person or the
other group has to lose. And that specifically, you take action in order to harm them. Is that
part of it? Exactly. So one of the debates we have about this in my field a lot is whether that action has to be violent.
And I argue that it doesn't. You know, I think harassment, vandalism, you know,
racist vandalism is clearly extremist, even though it's not violent. So the action has to
be hostile. It has to be intended to harm the other party in some way.
Well, so this is really interesting because we normally talk about extremists or terrorists as being, you know, coming out of a specific ideology that specific people hold.
That religious extremists, their religion tells them, you know, to do it.
And that's really the problem, where someone has a specific hateful belief. But your definition would imply that white nationalist extremists and Islamist extremists are similar in a lot of ways.
Is that the case?
Yes.
So, I mean, the contents of what an extremist group believes, like Sharia law or racial superiority, they're like shiny objects.
They're important.
Each group has its own contents and it's different,
but what's similar is the structure. The structure of the belief is always about that in-group,
out-group dynamic. And so how are, like just talking about, you know, white nationalist
extremists in the U.S. who want to conduct racial violence in order to achieve their goals and,
you know, religious extremists overseas, like those are not groups that we normally think of having
much in common, but what do they have in common? Well, they have a lot of broad structural things
in common in terms of types of belief. They believe in the same kinds of crises, for instance.
Crisis is an element of extremist belief. You believe that there's some kind of events happening
that the out group is causing some kind of negative impact on the in-group.
So that's a universal thing in extremism. And then they believe in a solution to the crisis,
which is always hostile to qualify as being extremist. Once you get beyond that, there's
actually a lot of other smaller similarities that ebb and flow over time. So we see similar usage
of social media. One thing we're seeing very recently among neo-Nazi groups is that they're openly plagiarizing ISIS propaganda styles and even in some cases like just taking ISIS posters and changing them to be racist instead of Islamist.
Wow, really?
Yeah.
What does that look like in practice?
So, you know, the kinds of things that they really focus on are exhorting people to be lone wolf.
I use sort of air quotes around that because I really hate that term.
But to go out and take action on your own without having an organization supporting you.
That kind of content is very common. So ISIS had a couple of posters, really, memes, online memes that said, you know, don't wait, go out and act. And the Atomwaffen Division, which is a neo-Nazi group, international neo-Nazi group, has recently appropriated some of that imagery.
Very, very close, obvious homage to ISIS.
And they probably do that in part because they figure people will write stories about it and they'll get press, which they do.
But they also just saw the ISIS folks online making memes that were exhorting people to violence and, hey, those are some pretty good violence memes.
We could learn something from ISIS, like in a direct way.
Yeah, yeah.
And there's a small but not insignificant movement in those circles to even go further.
And they start talking about white jihad, you know, that they're really – see ISIS as a very effective group and that they should emulate what they do.
Wow. And that's, I mean, that's so shocking because that, you know, if we were just looking at it along the dimension of their ideology and like, here's the content of their beliefs, you would think that ISIS would be the exact people they would hate, like the people they'd be most, you know, the biggest boogeyman to them.
But the idea that they'd be like, oh, our methods are the same, like we kind of have the same dynamics going on.
So these are actually our compatriots in a way is really startling. I mean, they wouldn't be pals, you know, they wouldn't hang out and, you know,
share a coffee, but they're learning from the enemy is how they see it, you know, study the
enemy's tactics and learn from them. Are these groups, I mean, I don't want to overplay their
similarity if there are significant differences between them, but I mean, maybe there aren't.
Are there still important differences in how they function?
Oh, very much.
I mean, you know, there was a real brief spate of sort of gee whiz stories about them being
so similar.
Very recently, I was quoted in a couple of them and, you know, I was kind of
cringing a little bit because it's, you know, I mean, they're not the same. They have similar
dynamics and it's important for us to understand what makes them similar and where they differ.
So there are a lot of ways that ISIS and even an organized white nationalist group like Atomwaffen Division are different.
Primarily, ISIS is a very centralized organization.
It employs people who are loosely connected to the group to do things for it,
but it's got a hierarchy.
There's a guy in charge, the caliph, and everybody answers to him,
and they swear loyalty to him, whereas white nationalist groups in general and even some
very specific groups that are somewhat coherent in their organization tend to be much more
decentralized. And that comes really out of the 1980s when the FBI cracked down on white
nationalists in the United States. The response of some prominent white nationalists was to endorse
this idea called leaderless resistance.
And that means we don't have an organizational structure for the FBI to infiltrate and destroy.
Yeah, and in the intro monologue of this episode, I talked about leaderless resistance and how that pattern was really transformed by social media.
And then how social media and just the internet at large has really changed the way
these groups operate. Can you give us any more of an overview on that?
Definitely. So, you know, leaderless resistance was this kind of very optimistic concept introduced
in the 1980s, early 1990s, by a white nationalist named Louis Beam. Optimistic, really?
Well, it's optimistic from their perspective.
So it's this idea that their ideas were so powerful that people would just individually
step up and commit acts of terrorism in support of those ideas.
We're so persecuted by the law enforcement who's really tamping down on us, but, oh, our ideas are so strong they'll rise up no matter how much we're being crushed in some way.
Right, and it was a disaster.
It didn't work out at all.
Oh, okay.
So for decades, you know, these groups pursued this idea of leaderless resistance with the result that, you know, they were just increasingly factionalized, you know, with a lot of infighting. Nobody was really on the same page
and people don't feel inspired to act if nobody's behind them. I mean, it's just a human nature
thing. If you don't have a consensus of people supporting what you're going to do, you're less
likely to do it. So social media really revolutionized this and has kind of brought this around to,
I would not say success, but it's certainly a very damaging strategy now. And on social media,
you can get the feeling of consensus without having physical exposure to people. So instead
of having to go to a Klan rally out in the woods, get a bunch of
people who actually are going to get together and go to a physical space and put hoods on and do
the thing, you can get all that social support online virtually from people who are anonymous,
who don't have security concerns anymore. And they can be available 24-7. It's not like,
you know, if you were in the Klan in the 80s, you know, this was
something you did on weekends maybe and evenings, but it wasn't all day, every day. Now you just
pull out your phone anytime you're bored and you can get this kind of feedback from people. So
what's happened is that people are now acting out with minimal guidance and support. It's not,
you know, they do have a social network. It's coming to them online
more than offline, although sometimes there's an offline component to it.
Yeah. And people are doing violence, but that doesn't mean they're achieving their strategic
aims. Like, you know, you're not necessarily getting anywhere except violence for violence
sake. Well, I've often thought how, you know, when I grew up in the 90s, you saw very little white supremacist or white
nationalist content in the media, right? Like the only access that the average person had to say,
you know, a white supremacist, for instance, was on a daytime talk show, they'd bring on,
you know, people from the Klan, and the crowd would basically throw tomatoes at them and boo
them, right? That was, or, you know, David Duke, you know, was reviled.
And, you know, you had these sort of like celebrity people
who we would hate in that way, like David Duke.
But there was no other outlet.
I guess they would photocopy newsletters if they, you know, or whatever.
But then with the internet, you know,
the wonderful thing about the internet is that
anyone from any interest group can get together, right?
You know, people can find their people.
And that's such a wonderful thing.
I found so many, you know, hobbies that I was interested in that I got together in,
or, you know, that changed the face of comedy.
You know, comedy fans would get together and that changed how comedy would progress.
Or we saw that in so many areas.
And it took me a while before I realized, oh, wait, that's happening with the extremists as well.
They're also finding each other and going, hey, I kind of hate these people.
Do you hate them too?
Oh, yeah, sure.
I kind of want to do some violence.
Oh, yeah, me too.
And the exact same process happened, but we sort of didn't realize it as it was happening.
Is that metaphor apt?
Yeah.
So there were
gatekeepers in the 80s and the 90s. And there were really very powerful gatekeepers. So a number of
factors limited the reach of these groups. First, it was very expensive to do a TV broadcast. You
know, you couldn't do what you do today, it's like pull out your phone,
post something on YouTube,
and 10 minutes later people are reading it,
you know, and you haven't spent any money on that.
Any kind of video was expensive.
If you were going to get into kind of a broadcast,
it really is prohibitively expensive
to have a broadcast television station
supporting the Klan.
And if you were going to make a videotape, it had to be
physically carried around. It had to be sent out with postage. It was expensive. And so all these
things, and then you have these broadcast networks that are both regulated by the government and
also subject to market pressures. And so their object is to create content on these limited
channels that will appeal to the biggest possible
audience for advertising purposes. There's no micro-targeting. So you just want everybody to
be watching your show. And in order to do that, your show had to be inclusive in some way. So
all of these factors really just dramatically limited the reach of these groups. And when you
get on social media, suddenly all that stuff goes away. It's cheap. It's easy. It's ubiquitous. You carry it in your
pocket. And micro-targeting is a huge part of what happens on social media. So you find,
you know, you can go out and be profitable with a very small niche of an audience,
and you can put out a message that reaches a million people, that million people might be spread out all over the world.
They might not have any kind of physical proximity to each other,
but it's enough people to make your venture sustainable.
Yeah.
I remember this, you know, people saying a couple years ago,
oh, wow, it's so weird.
Nazis are back.
Like racists, avowed racists are back.
And I remember realizing, oh, wait, no, they were always there.
They were just not allowed on TV. Like we had this consensus, you know, the people making the
decisions, the gatekeepers, you know, both I think for the market pressure that you are talking about,
but also because, you know, of a cultural consensus that those views should not be allowed on
television or in a newspaper. You know, you can't write a letter to the op-ed, you know, David Duke is not getting his, his letter published in
the New York Times. Right. But then this technology, which has empowered so many people to
have their voices heard, right. The fall of the gatekeepers has led to, you know, so many
marginalized groups, people of color, people of, you know, economically disadvantaged folks, LGBT community, not to mention, you know, just all types of people are able to make their
voices heard. And that's changed our conversation in our country, in our society so profoundly.
But at the same time, it's also unleashed all of these, you know, so much hatred,
all these folks who have hatred to do the same thing. And thinking about that in the last couple of years, I've had myself a couple of times go like,
I kind of miss the gatekeepers a little bit, not, not fully, but like, Oh, the gatekeepers were
doing like, you know, it's good for my comedy career, right? That the gatekeepers fell, right?
That was helpful to me. It was easier for me to, you know, to get started on, on the internet than,
than it would have been to get on SNL. Right? But the gatekeepers were doing like one or two good things by stifling these views that 99.9%
of the population agrees are abhorrent and lead to violence. And it makes me unsure how to think
about that change that we've been through. I wonder if you have any thoughts on that.
Yeah. So the gatekeepers slowed progress for more marginalized communities, but it didn't eliminate them. So what happened was, so if you look back at like, you know, Hollywood production
standards, the MPAA code back in the, you know, 40s and 50s, you look at broadcast standards in
the 70s and 80s and 90s, and those things were,
to some extent, regressive. They held back certain kinds of things. So in the, you know,
particularly the Hollywood production code, you know, interracial relationships were forbidden,
you know, so you would never see a gay or trans person. I don't think it was articulated in the code, but it was verboten. And what happened was, is those broadcasts,
those kinds of standards, because they're corporate products, they are susceptible to
demographics over time. So slowly, you know, what you would see is, you know, television networks
became more inclusive, more pluralistic. You see, you know, mixed marriages on television. You would see,
you know, increased presence of minorities. You would see gay people. You would see trans people.
And that, so there was a huge amount of friction on those communities, but there was kind of a
complete stop on the radical fringe. So, you know, you couldn't use the N-word on TV. You couldn't use it in a movie.
You know, it was forbidden.
And I think, you know, nobody looks back on those times wistfully and says, gee, I wish we could have production, you know, the MPAA code back.
But I think that they, you know, sort of fixed us in a sort of steady state with small steps toward progress.
And so while social media has been huge for marginalized communities and has provided a lot of voice to communities that wouldn't have had it before, on that far fringe, kind of extremist fringe, there was just this huge pent-up frustration, this huge volume of material and this complete brick wall that they encountered when they tried to get their views out for a big audience.
And so that just popped like a pressure valve.
So ultimately, I think the current environment or the environment for the past five years anyway has really favored extremists because they had nothing before. And now they have a fighting chance to get their message out and to get communities out, and they use it very aggressively.
Yeah, and we've seen the results of that, like how many attacks there have been. And you know, I grew up believing that sort of connection was going to improve life so radically.
And I want to maintain that optimism that it's so good that the gatekeepers have fallen and that everybody is able to share their voice now.
But at the same time, it's like counterposing it against this huge negative.
I don't know how we can weigh it right now. I think it'll take time to see how it sorts out,
but it certainly made me reevaluate that process. And it certainly has felt over the last five
years, I'd agree with you, that the extremists seem to be taking such powerful advantage of it?
I think that, you know, there are structural factors in social media that currently favor
them. And I think probably the existing social media companies are so broken and so based on
this bad model that it's going to be hard to salvage them.
I think the next generation of social media companies, you know, may be better.
Maybe we'll learn from what happened here.
I mean, there are a couple of real significant problems that really aggravate this.
One is manipulation.
So social media, because it breaks everything into sort of quantifiable chunks, it breaks human interaction into these, you know,
little quantifiable bits and bytes,
you can really design strategies to do weird things that wouldn't work in the real world.
So you can, you know, you can unleash bots,
you can amplify kinds of messages,
and that has a huge impact on the social construction of reality, which is something I'm looking at for my dissertation on extremist ideologies.
People are more inclined to believe something if they have a consensus to support it.
You know, your understanding of what's subjectively real is shaped by your peers.
So you can manufacture a consensus.
Yep, yep.
And so if you perceive that there's enough people,
it doesn't have to be a majority.
It just has to be a community of people
who are supporting your beliefs.
And you see this, I mean,
this isn't just an extremism problem.
You see it in like the return
of the Flat Earth Society, for instance.
There's a lot of ways that this will play out.
So if you can,
not only you can shop around
for whatever consensus suits your predispositions,
but you can also manipulate that perception of a consensus.
And you can do that in ways that are subtle.
It's not just like Russian bots tweeting out messages
about, you know, divisive messages against America.
It's like if something gets 5,000 retweets,
you pay more attention to it than if it gets 10. And, you know, so even just, you know,
silently amplifying, just creating that appearance of size can really have a dramatic difference.
And so what I'm really interested in studying going forward is sort of how extremist groups
frame their consensus to make it more appealing and look more powerful. And this is these groups taking
advantage of part of the architecture of, the most obvious example is Twitter here, right?
Like it's a, so in some way, Twitter bears a responsibility for how people are using the service, correct?
Sure.
The big three social platforms, YouTube, Twitter, and Facebook, all share the same fatal, I think, fatal flaw, which is that their business models are built on virality.
So content that goes viral gets clicks, generates ad views.
And so, all you have to do is, and virality doesn't have to be 5 million retweets. It can be 5,000 retweets. It can be a lot of hate clicks, for instance. You know, you see people hate retweeting something or retweeting it with a comment on it, and all that engagement boosts it
up, even if it's negative engagement sometimes. So I think that, you know, these business models
that are based on virality are really a problem. And YouTube has made a couple of changes that
should soften that maybe over the next couple of years. We'll see.
Facebook and Twitter, not so much. Yeah, they've aggressively not changed anything,
despite the fact that, you know, especially Twitter is like, you know, the model of Twitter is, you know, an internet from 2005. You know, it's like this very old fashioned notion.
Hey, we've got handles, we've got avatars.
You know, it's a huge free for all.
There's no moderation of any kind.
Anyone can see what anyone else posts at any time
and reply to it.
And it sort of created this bizarre social jungle gym
that like operates according to these rules that
we know are really harmful, yet they refuse to change.
Yeah. I mean, it's mostly anarchy. There are some, you know, the companies, all three of these
companies have a strictly headline driven policy agenda. So when they start to get bad headlines about something,
then they will make a change.
Right, and YouTube got a lot of bad headlines last year.
Yes, so they made a change.
And that change was hard won, even with the headlines.
I mean, that was not like an easy internal process.
All of these companies are just making ad hoc changes
based on whatever the latest bad headline is.
And there are people at all these companies, well, at least two of them,
who are interested in having more consistent principles, you know,
like designing a system of laws, having a constitution,
rather than just, you know, banning whatever bad thing somebody happens to be doing that week.
And I think that the people who are interested in doing that
face a lot of really significant resistance
from the people at the top of these companies,
the heads of Facebook and Twitter and, to some extent, Google.
Google, when I look at Google,
what I see is a company that's more of a traditional corporation, which is, you know, certainly comes with its pros and cons.
I'm not like a big fan of corporatism.
But they're traditional corporations, whereas at Facebook and Twitter, you have, how should I put this? I mean, you have, let's call them visionary leaders who set the agenda for the company
and often have very eccentric ideas about society.
Right.
Yeah, Twitter's basically run by a mad monk.
That's a much more direct way to put it than what I just said.
So, you know, I think that, I guess when I look at Google,
I see a company that can be shaped more traditionally by corporate pressure.
Whereas when I look at Facebook and Twitter, what I see is each of those companies has a white male at the top of it who is going to do things their own way until that option is taken away from them.
Right.
And I just want to underline this: it's so powerful how much the way these companies have structured these sites determines the way that we interact
on them and the dialogue that we have. You know, it's like a video game where, if the only
verbs you have are shoot and, you know, talk to someone near you, you're going to end up with a lot of
abusive verbal behavior, right? Whereas like other video games, there's a wonderful game called
Journey where you can interact with people, but it inclines you so that you can really only
interact in helpful ways, or at least that is the behavior that you end up exhibiting when you play
the game because that's how it's designed, right? And Twitter is similar in that it's just got a couple of verbs, you know, it's got this short post, and
retweet what someone else writes. Um, and so that leads to a certain suite of behaviors. And so
like Twitter bears a responsibility for what is happening on the platform, but they're stuck in this old-fashioned idea of, well, no, we're just the pipes.
You can't sue the pipes for what's going
through the pipes. I used to agree with that in 1999. That was a very sensible way to think about
the internet, but now it really clearly isn't. Well, and what you just articulated is essentially
a summary of the law for telephone companies, right? So you can't sue a telephone company for somebody using a telephone to call a hitman and order a murder.
The telephone company is explicitly not liable for that under the law.
And that law has been extended to these social media companies,
but these companies really aren't like telephone companies.
They're broadcasters.
And so it's a whole different set of dynamics.
The analogy I use a lot is it's like if you throw a party and 100 people show up
and a vase gets broken, you know, and you say, I didn't break the vase.
Yeah, but you threw the party.
Right?
Like you bought the beer.
You know, you were the one who scheduled it at 11 p.m.
Like you bear some responsibility for what goes on in this space.
And I'm very interested that you said that a new generation of social media companies is rising, because I spoke with Scott Galloway a few months ago, who writes about monopoly power in tech.
These companies are so powerful that they're able
to kill smaller companies as they're rising up, right? You know, Facebook buying WhatsApp, etc.
These are able to, you know, stifle smaller companies in the crib. Do you really have
hope that a new generation of social media apps will arise?
Depends on what day you ask me. You know, I try to be optimistic.
You know, given the line of work I'm in, I try very hard to be optimistic because the alternative is pretty depressing. And, you know, I don't know. Certainly, if, you know, we end up with a candidate who is interested in taking a reality-based antitrust look at these companies, that could be better.
But there's a lot of ways that can go wrong too.
I mean, the current administration is investigating these companies for alleged bias against conservatives, which is clearly untrue.
And they are winning concessions from these companies for doing it.
So, you know, there's a huge amount of uncertainty about the direction this country is going in generally
and the direction of the business environment that these companies operate in. And uncertainty, incidentally, is one of the things that really fosters extremism.
Social uncertainty is a huge factor that makes people more susceptible to extremist ideas.
So, you know, the chaos is, you know, it's a self-licking ice cream cone.
Well, I want to move on from tech and get back to talking about extremism itself.
But on that incredible image of a self-licking ice cream cone, we have to take a quick break.
We'll be right back with more J.M. Berger.
All right, I'm back with J.M. Berger.
So you just said, this was really fascinating to me,
that extremism often increases during times of uncertainty.
So let's get back to talking about extremism itself.
When and how does it arise and how does it perpetuate and propagate itself?
So there are a number of studies that suggest people are more susceptible to extremist belief when they're experiencing either personal or social uncertainty.
So if you lose your job, or everybody in the town loses their jobs and nobody knows where they're going to get any kind of money, they're looking for something to identify with. People will more strongly identify with
a group like a religion or a race or any kind of group really when they're feeling anxiety
and uncertainty. And this principle I think has not been very widely adopted in the community of people who study terrorism and extremism.
And I think that's led to a lot of bad ideas.
So during the Obama administration, I would sometimes go to Washington, D.C. and talk to people, policymakers, about countering extremism.
And invariably, you know, they would hit on the same couple of things. They'd be like, well, you know, if we want to counter extremism, we have to, you know, give people jobs and we have to,
you know, make sure they're educated. And, you know, what we've seen, and it's been a hard
lesson to learn, and even today, some people still are very resistant to it, is that poverty
doesn't cause extremism. Lack of education doesn't cause extremism. In fact,
extremists are often more educated than other people. So if these things feel intuitively
connected to us, but they don't hold up in the research, then how do you understand it?
And what I've come to think is that you can square the circle by understanding it in terms of uncertainty.
So if you live in a community that is poor and has always been poor, that's not necessarily going to create a high risk of extremism in that area. But if you live in an area that's always
been middle class and suddenly everybody's poor, or, you know, if you worked in an industry and that industry is going away and nobody has any training.
Nobody knows what kind of jobs they're going to have.
And so they experience uncertainty around that.
So big changes might be more destabilizing and more prone to creating extremism than simply poverty, simply an economic kind of explanation. And you can sort of see that, you know, if you
think about Saudi Arabia, you know, which has had a significant extremism problem that came up
around the same time that their oil industry was coming up. So the country's becoming
fabulously wealthy, not evenly distributed, but there's a huge amount of money pouring into this economy. And at the same time, there's this, you know, real extremist fringe that's becoming more and more powerful.
And so that's not even a form of uncertainty that is necessarily negative.
That's just large changes and people not sure what the coming year will bring.
Right. And that can also apply to sort of a feeling of opportunity.
If you look at, you know, like a tech bubble or, you know, some new technology comes in
that promises to transform everything for the better,
people see opportunity.
If they see things are changing,
that's an opportunity to make that change work for your group.
So, you know, what you're trying to do is ultimately
you're trying to situate yourself
in a confusing and dynamic society, and you need an anchor to hold on to.
And that anchor can take a lot of different forms.
One form it can take is like a religious identity or a national identity or a racial identity.
I mean, this also certainly adds up when I think about, you know, extremism in the Middle East.
That's an area that has gone through massive changes, U.S.-caused changes, over the last few decades, so I can imagine the same dynamic
happening there. And it also makes sense in the United States that we've gone through
such an economic transformation, you know, since the 60s and 70s for both bad and good,
depending on who you are and where you are,
but the constant being change, change, change, change, change.
And, you know, I mean, when we were talking in the last segment
about the social media companies,
it's not just the actual characteristics of those companies,
it's the very existence of those companies.
Communications changes are one of the biggest changes
that can happen in a society.
And when the printing press was invented,
you know, it did not take long at all for that
to go to the purposes of apocalyptic cults
and printing all kinds of insane conspiracy,
religious conspiracy theories.
And, you know, the nature of the medium
has to do with how that change plays out.
But any kind of new communications technology and anything that changes how people connect with each other and connect with ideas is going to have that effect.
So, I mean, this is crass, but this is like the old joke about whenever we invent a new communications technology, people figure out how to use it for porn.
That's the most innocuous use for it. So, right, right. Porn is the least of our problems. I mean, a joke I do
in my live show I've been touring right now is that, you know, people used to say, oh, the worst
thing about the internet is that there's porn on it. Like, oh no, that's where people are getting
their porn? That's actually the most wholesome thing on the internet now. But yeah, I mean, this is like the non-joke
version of that, where whenever there's a new form of communication, this is like one of
the first uses it's put to. Is that what you're saying? Yeah. And, you know, on early
adoption, extremist groups are often well positioned to be early adopters of technology.
And one thing we saw with social media kind of interestingly was that they were slow to pick it up because they had security concerns.
It wasn't really until ISIS really revolutionized how these groups use social media that we saw them adopt it very widely.
And, of course, there was immediately a tremendous impact from that. So yeah, I mean, anything that changes how people connect.
Extremist ideas have to be transmitted in order to be adopted, right? I mean, you aren't like
raised by wolves and out in the woods and suddenly you decide, hey, I'm a Nazi today.
You have to find out about Nazis. You have to have somebody explain why you should be a Nazi.
You have to read about them.
They have to be communicated.
So when you have a social media environment that is largely the untrammeled transmission of ideas in every direction wildly, that really provides a lot of opportunity to transmit extremist ideologies along with everything else.
Right. And that's so contrary to the way that we're – we used to talk about the internet that like, hey, being able to transmit ideas in all directions is good because the good ideas will always win.
And that was maybe a little bit too Pollyanna-ish to a certain extent.
Even if good ideas, you can even grant that and say, hey, good ideas are the ones that will win, that will go the furthest. But that still means bad ideas and harmful, hateful ideas can spread a lot more than we would like them to.
Sure.
And being on the attack, when you're on the attack, it forces other people to adapt to you.
So if you are out there with a positive message about inclusion or diversity or peace and harmony, that doesn't require a response from anybody, right?
If somebody is not interested in that message, they can just ignore it.
Whereas if you're being attacked by somebody, you need to adapt to it in some way.
You can try to ignore it. Sometimes that will work. Often it won't.
Sometimes you're going to put your life at risk if you ignore it.
And so, yeah, the whole idea of, you know, defeating extremism with more talk rather than by restricting it, I think is misguided.
And I'm, you know, I say that very reluctantly.
I mean, I used to be a journalist before I started doing this kind of work.
I'm very into freedom of speech.
And what I see in these environments is, you know, communities without any kind of rules or guidelines.
And if you just let this run amok, it's not just the town hall, right?
You know, it's not like somebody on the street corner with a sign.
It's not a conversation in your house.
And we're not
talking about arresting people for their speech, which is what the First Amendment really is
designed to protect against. But ultimately, it's, you know, this new technology kind of comes into
your living room, becomes a part of your life, and you need to have some ability to control how
people act in a shared space. Yeah. But we're uncomfortable with that idea.
Like, it's very easy to say, hey, well, free speech, speech is speech. All speech
is good, right? The more speech, the better. That's a very comforting and easy sort of
conclusion to have. As Americans, we're uncomfortable with the idea that, like, we need to find some way, some guidelines to put down.
Yet it certainly seems that, yeah, there are negative consequences to unfettered speech anarchy.
I mean, I think, you know, I think we should be uncomfortable with it.
So, you know, I don't think it should be something that's just sort of like an easy choice.
And we can see, if you look at China, for instance, you can see it's very possible to abuse this concept.
So I think it's really important that we take all these things into consideration and move slowly in some ways, which is also something that works to the advantage of extremists because they can move
very quickly, whereas the mainstream of society needs to move slowly for a lot of reasons,
good reasons. So it's tough, but, you know, ultimately this has just been such a transformative
time for our social relationships that, you know, there's just fires breaking out everywhere and we have to figure out how to put some of them out.
Well, so let's talk about how extremists actually use the Internet, use social media.
What have you found and what sort of, you know, how do you process that data and what have you found in looking at it?
Well, these platforms, particularly Twitter, because of what you mentioned before about its sort of simplicity and also the fact that they make their data available in ways that other companies don't, really opens up a lot of opportunities for people like me to do analysis. So it does help inform our study of these groups.
And you can kind of get enamored of the analysis itself and sort of miss the fact that it's telling
you something about what these groups are talking about and what they're doing. Ultimately,
technology is just creating a lot of ways for us to do more kind of empirical, objective study of what extremist groups say and what they believe
and how they act.
So in a lot of ways, that's a great thing.
It doesn't necessarily offset the utility
that these groups get from the platform,
but it does really empower a lot of scholarship.
Twitter helps you study the bad shit people are doing on Twitter.
Great.
You know, it's better than nothing. I was pretty forward-leaning in sort of talking about taking a lot of these guys off of Twitter, and there was a huge amount of pushback from some people in the field who were like, well, no, we want this data to be available because we want to study these groups.
And I was like,
you know, it's doing more for them than it is for you.
It's like you can still study these groups.
It's going to be harder for you to study these groups.
It's going to be harder for me to study these groups.
Wait, don't take the weapons away.
We're trying to study how people use those weapons.
It's a little bit like that.
Yeah, yeah.
I mean, the best way to learn about bank robberies is to let people do bank robberies.
And then you'll learn a lot about bank robberies.
Exactly.
So, but, you know, there's really huge new exciting stuff happening every day in terms of, you know, we can sort of break down the text that they use and analyze it in new ways that kind of give us insights. And that's essentially my dissertation:
using some pre-social media kind of ideological texts
to really apply some of these tools and see what kind of insights you can pull out.
And one that I think is, you know, possibly most relevant
to sort of how we handle things going forward right now
is the prevalence of crisis narratives. So in my book, I argue that
there are, you know, three key components to extremism. One is identity. It's the groups,
in groups and out groups. One is a crisis narrative, which stipulates that the out group
is doing something that's having a negative impact on the in-group. And one is the solution,
which is here's what the in-group is going to do to the out-group: the violence or the harassment or genocide or whatever it's going to be.
And one thing that I sort of had seen anecdotally and now I'm starting to be able to quantify
through this kind of machine learning and other kinds of analysis using the computer is that the
crisis narratives are the overwhelmingly most important part of this,
most common part of this that really spreads. If people believe there's a crisis,
it will cause them to cling to identity and it will cause them to be more susceptible to solutions.
So even though these three components are all kind of necessary components of extremism,
what I increasingly see is that the crisis narrative is probably the thing
that's fueling the growth of it most now. And it's the thing that, if you look at an extremist
ideological text, it's going to be more focused on crisis than it is on identity or solution.
They want you to believe that fear, to have that fear and that panic.
And something that's really interesting on social media when we talk about how these companies try and deal with this problem is that there are many outlets that are just devoted to the crisis part of this equation.
So there are social media outlets, for instance, that are all about black crime or immigrant crime, and they don't have a complete extremist ideology expressed on them. They're not saying we're neo-Nazis and we're concerned about immigrant crime. They just post, all day long, you know, real and fake and distorted stories that say immigrants are doing
terrible things. And so that's a real dilemma for the companies when they're trying to come up with
a policy to deal with, you know, extremism, is that you can mix and match the pieces. So you might have one webpage that's just about identity
and just says, you know, white people are great
and doesn't put out anything else.
And then you have another one that says,
look at all this black crime.
And then you have the extremist group,
and this is the hardest part for them to put out on social media:
the solution, which is like, well, if white people are great and there's all this black crime, then white people should do bad things to black people.
And that part, the social media companies are more comfortable kind of cracking down on and it's sort of easier to deal with.
But those first two, if you have those first two parts, it doesn't take a rocket scientist to come up with the third part. Yeah, you can connect the dots if you're of that mindset
and you're coming across this stuff in an unsuspecting manner,
you can be pulled in by that narrative.
Yep.
And so the crisis piece is the big piece because it's sexy.
It's easiest to make viral.
It can be decontextualized in a way that makes it hard to peg it as being extremist.
So, you know, certainly if you look at immigration discussion, there are a lot of mainstream politicians who will, you know, gravely inform you that there's an immigration crisis.
If you're a company, it's hard to crack down on that. And if you are trying to identify extremists, just because somebody says they're concerned about immigration doesn't mean they're an extremist.
And so you have to look deeper and sort of contextualize things in new and more complicated ways.
And so it gets messy.
I find that another way that extremists sort of, you know, skirt these lines on social media is by, you know, cloaking what they're saying in irony or, you know, in sort of shit
posting memes that are very difficult to piece apart exactly what they mean. You know, so you
end up sort of engaging in this metatextual analysis like you're a college lit major trying to figure out what the
actual message is here. Is that a strategy you've seen as well? Oh, yeah. It's very widespread
in the far right in particular. You do see some humor in jihadist social media, but not that much relative to what you see in like the alt-right
and the far-right. And, you know, there's, when we talked about sort of the loss, you know, the
loss of the gatekeepers, we've seen some great things in that space. So if you look at something,
but it's very hard to argue, you know, if you want to try and come up with guidelines
for what's acceptable speech, or you're trying to even just understand for yourself what's a positive influence on
society versus negative influence on society, it can get complicated. So we lose the gatekeepers
and we get South Park. South Park is great, but it's hard if you were going to explain
why, when South Park says something and 8chan says something, they should be taken in different contexts.
You can get into kind of complicated, murky waters.
And so, you know, that surge in transgressive humor, which I personally enjoy a lot, also sort of opens the door for people with bad intentions to take that idea, take the trappings of humor, and use it to convey a message that is much more pernicious.
Look, I mean, I'm a comedian. It's hard enough just in comedy, right?
We have endless conversations with my friends who are comedians about like, well, is that joke, is this a joke about racism or is this a racist joke?
And sometimes is it not a racist joke at all and it's just a racist statement said by a comedian, right?
This happens, right? And, as a comedian, I take exception when, you know, some comics will say that as a matter of definition, anything a comedian ever says is not to be taken seriously.
It's by definition.
They don't mean it.
And that's clearly not true.
Right.
A lot of times comedians make jokes about things they don't mean.
You know, Anthony Jeselnik's whole career is based on that, for instance.
But plenty of comedians make jokes about things they do mean.
That's the kind of comedy I do.
Right.
And plenty of comedians do both.
And, you know, you have to have a conversation and, like, figure out which is happening in which context. So you know how to feel about it.
So you know what you mean when you're talking about it.
But that's just when we're talking about comedy.
When people are using this, you know, this very murky area and they're actual extremists.
Now we've got, you know, we've got murkiness on top of a potential to do real damage.
We're not just fucking comics in front of a microphone at an open mic.
One of the things that I said about extremism in the opening of my book,
which is kind of an easy way to sort of get your head around it,
is a lot of our attitudes towards extremism is that it's like
the Supreme Court's famous statement on pornography. I can't define it, but I know it when I see it.
And that's something like, I'm hugely trying to, you know, come up with ways that we can get away
from that. And I think you see that with humor. So it's like, you know, am I trying to be funny
or am I trying to be hateful or am I trying to be funny and hateful or am I just trying
to be hateful? And it's tough. It's really tough when you get into sort of the use of humor and
irony to parse that stuff out. And it's also, I mean, you know, like it's not that hard to know
that a certain kind of joke is obviously hateful and offensive. What's harder to do is to understand the motive of the person who's making that joke.
Right.
You know, and so then you get into, it just gets very, it gets messy.
It's a way to muddy the waters.
And it's a way to, you know, sort of slip ideas into somebody's consciousness without, you know, just showing them a picture of a
concentration camp and say, I want to do this. Yeah. So how is this technique being used in like
a purposeful way by these groups? There are people who use it in a purposeful way.
A lot of what we see in the alt-right today is kind of derivative of a movement from France in, I think, the 70s and 1980s called the New Right. And one of the things that they were really interested in was using memes, basically. They didn't call it that then, but they wanted to use memes to spread their ideas. The idea is that you have to get your arguments out in front of people in whatever way you can.
And so humor, and again, you know, this is where like hate clicks and stuff really can pay off for
somebody, is it gets this out in front of people. If you're not putting your ideas out in front of people,
you can't propagate them.
And so any method that you can use to do that
is an effective tool for you.
And humor lets you circumvent some of the gatekeepers,
you know, we don't have the formal gatekeepers now,
but sort of the social red lines.
You know, if you do something
that's construed as humor, you can use that as a justification to avoid the red line
and skirt it, or sometimes plow right through it. And, you know, man, as a comedian,
I just wish we had a more complex conversation about comedy in this country,
or even in my own industry, because there's this sort of odd belief that anything that's funny is value neutral and OK.
And it's just for a laugh. And I'm like, no way. Comedy is an art form, and like any art form,
it can be used for different purposes, right? Like, I attempt to use comedy to educate and to inform and to stoke curiosity in people. That's at least my
goal, insofar as I can do that. And, you know, comedy makes those ideas go down easier. But
if we agree that there are bad ideas out there that we would, you know, hope are not propagated
that far, and we agree that comedy can be used to make ideas go down easier, well, ergo, comedy can be
used to make bad ideas go down easier, right?
And that's not to say that should be illegal for that to happen,
but we should also be able to call it what it is when it is happening, right?
Sure.
I mean, if you look at the history of editorial cartooning, for instance,
I mean, editorial cartoons obviously are intended to convey a message
and they can sway people politically.
Yeah.
And a lot of what you see on, you know, like these far-right forums will ape the format of editorial cartoons.
They are, you know, sometimes done in a very primitive way, sometimes in a more, I'm not going to say sophisticated, but a more developed way.
Where it's like obviously meant to be an editorial cartoon, except that it's advancing something terrible.
I never thought of that, that these folks are literally like the like Pepe memes, racist Pepe memes are literally just reproducing the turn of the century editorial cartoon.
But they are, so many times.
Yeah, and I mean, there are people who do this stuff who very self-consciously adopt the traditional editorial cartoon format. And then a lot of what we're talking about here is really kind of new media, like webcomic formats, where you have sort of like a quick one-off joke that's not necessarily as complex as something that you would see in a newspaper editorial cartoon.
Well, we're getting close to the end here.
I just have a couple more questions.
The most important of which being: what can we be doing, either individually or, more importantly, on a more general level, you know, societally, or what can companies be doing, in order to quell the rise of extremism?
I think that's a project that everyone would favor, right, in order for there to be less violent extremism out there.
How can we do that?
In your research, have you found particular steps that you feel could be taken in order to address this issue?
So there are some things that have some remediating value that are relatively easy.
One of which is to, you know, as much as there are negative effects from social media, there are some positive effects.
And one of them is that you
can interact with people who are different from you. So you can build social networks and
relationships with people that aren't constrained by geography. So I think that when you look at
research, what you see is that people who have more experience of diversity are more happy about
diversity. They like diversity if they experience it. And if they don't experience it, then it's
something that has fear associated with it. A lot of the uncertainty that is involved with
extremism has to do with fear. It's a question of if this person is different from me, they're not
in my group, I don't know how they're going to act. And then I don't know how I should act around them.
And maybe they're going to present a threat to me.
So the more you know people,
the more social bonds that cross over those,
you know, these sort of existing group boundaries,
I think the better off we are.
When you get to the broader kind of societal questions,
then it becomes extraordinarily complicated.
And particularly, you know, we've not talked about politics too much in this conversation, which is a pleasant relief for me, but it does have to come up at some point. There are a lot of types of programs
and initiatives to counter extremism that people are going to be more inclined to trust
coming from one kind of politician than coming from another kind of politician.
So when you have a politician like Donald Trump, who has run on a sort of explicitly
anti-immigrant and anti-Muslim campaign and persists in those kinds of characterizations that are extremely problematic,
you can't do community engagement like the Obama administration would do, where you go in and you
have a town hall and people from the administration come and everybody feels good about it.
So what you need to figure out, the hard part to figure out, is how do you get government involved in this in a way
that is somewhat insulated from the politics of it, so that you don't have to just throw everything
out? Because, I mean, you know, what we saw really with the change of administration is a lot of
people who were working on countering extremism were dropped from government funding. Governments
were really the major source of funding for anti-extremism initiatives. And now
the social media companies are, which is also problematic. Yeah, to say the least. So, you know,
I think that you can envision an approach to this that relies on the insights about uncertainty to
say, you know, we should diagnose communities that are being affected by uncertainty in particular
ways where, you know, something has happened, like sudden economic shift, for instance,
or a sudden political shift or a natural disaster or something. Look at communities where
that uncertainty is likely to rise and try and treat the effects of the uncertainty. So rather,
we're not going to go out and start anti-poverty programs around the world,
which, I mean, that would be great to do.
It's not countering extremism.
It's fighting poverty, which is good.
We have to separate out, like, all our good intentions, right?
You know, so if we want to fight poverty,
I'm all in favor of fighting poverty.
Just don't expect that it's necessarily
a solution to extremism.
So if you're trying to deal with extremism,
what you want to look for is places where there's economic turmoil, where there's sudden reversals, stuff like that,
and kind of go in and try and mitigate the uncertainty in those environments. And it might
be possible to create kind of ideologically neutral programs that do that. So I think there's
some prospects for that kind of thing. We also have a lot of – there, in the wake of September 11th in particular, and then subsequently with this sort of resurgence we're seeing in white nationalism, we do see a lot of community-based organizations that are working independently or cobbling together funding streams from a bunch of different sources so that they have some stability,
who are doing innovative stuff to address this.
So in some cases, they have former extremists who can be used to talk to current extremists.
They don't necessarily – like a former extremist doesn't necessarily become an expert on what causes extremism, but they know the language. They know how to reach out to somebody and they can be useful in pulling somebody out of that kind of environment.
We also have some very innovative uses of social media, using methods that redirect searches.
For instance, there's a company called Moonshot CVE that does this, and they buy ads. So if you
search, how do I become a Nazi? The first thing you're going to get is a sponsored listing.
And that sponsored listing is going to take you to a bunch of anti-extremist content and other
kinds of content that might be useful. And I think there's probably, I think this is a pretty
promising area to work in. I think there's probably some gains to be made there too.
But it's going to be a long process. And, you know, there's really no period in human history where extremism hasn't existed.
And there probably isn't a period in human history going forward where it's not going to.
I was going to ask if you had optimism, but I think you just answered my question.
I'm optimistic that things can be better.
I mean, I think that, you know, it's like crime.
It's like we're never, you know, it's difficult to, except for a couple of, you know, sort of excellent dystopian trade-offs that you see in a movie.
You can't really imagine a society without crime.
Yeah.
But, you know, we can be better at managing crime and we can be worse at managing crime and we can get better at managing this.
What about extremism don't we know? Like where is the field progressing in your view?
Well, my own work, I'm very excited to really explore what makes different types of extremism similar to each other. So what I'm doing next is really a very deep dive into
what extremists say they believe and figuring out how those things are similar or different
from group to group. And there's a lot of people who are coming up and really doing the same kind
of thing. In fact, I was at a conference recently where somebody came up and wanted to talk to me
about her dissertation. And when she described it, I was like, oh, damn, that's awfully close to my dissertation.
So, you know, and I've talked to a number of people about it since.
So I think that's a really exciting area that people are just coming to.
And, you know, once we get past that, I really, I do think this uncertainty area, there's a fairly robust group of studies that really sort of laid this out for
me in the first place. And I think there's a lot more research that can be done in that. And it's
this area called social psychology. And, you know, basically the studies up until now are very,
very laboratory oriented where you get a bunch of people in a room and, you know, you have them
read something that makes them feel uncertain, and then you ask them questions about how that
makes them feel. And that's how we have this sort of empirical support for this. So I think that
there's probably ways to design, you know, projects, pilot projects and interventions
that would be based on insights that come out of that and then gather data on it
and really try and understand what kind of approaches, what rhetorical approaches,
and what subject matter approaches are going to be most effective.
Well, it's obviously a long road ahead, but I thank you so much for doing that work and for
coming on the show today to talk to us about it. Thanks. It was a great conversation.
Thanks so much, J.M.
Well, thank you again to J.M. Berger for coming on the show,
and thank you folks for listening.
I know that's a tough conversation, but it is such
an important one. And that is it
for us this week on Factually. We'll be back next
week with another topic. I want to thank our producer,
Dana Wickens, our researcher, Sam Roudman,
and Andrew W.K. for our theme song,
I Don't Know Anything.
You can follow me on Twitter at Adam Conover.
You can sign up for my mailing list at adamconover.net.
And until then, we'll see you next week.
Thanks so much for listening.
I don't know anything.
I don't know anything.
That was a HeadGum Podcast.