Irregular Warfare Podcast - Subversion: The Strategic Weaponization of Narratives
Episode Date: October 20, 2023

Be sure to visit the Irregular Warfare Initiative website to see all of the new articles, podcast episodes, and other content the IWI team is publishing! As the global information environment rapidly changes, revisionist states are increasingly enabled to wage information warfare. They leverage networked information systems to sow political chaos in target societies. But as states weaponize strategic narratives to advance their interests, what can democracies and their populations do to protect against foreign information operations? To explore this challenging topic, this episode features a conversation with Dr. Andreas Krieg, a senior lecturer at the School of Security Studies at King's College London and the author of Subversion: The Strategic Weaponization of Narratives, and Dr. Andrew Whiskeyman, an associate professor at the National Defense University's College of Information and Cyberspace and former chief of US Central Command's Information Operations Division.

Intro music: "Unsilenced" by Ketsa
Outro music: "Launch" by Ketsa
CC BY-NC-ND 4.0
Transcript
Subversion can entail elements of sabotage.
It can entail elements of espionage as well
that then can be leaked to manipulate how people think.
But subversion is really trying to get someone
to do something that they otherwise wouldn't want to do.
And getting them to do this without them feeling they were pressured
into changing their
attitude or their behavior. A couple of countries, in particular China and Russia, have been investing
very heavily in the information space. They look at the role that information played in the demise
of the Soviet Union and begin at that point investing very heavily. So you end up several decades later where we're
at right now, and you have on a global scale, investment by autocratic regimes in mass media
and an ability to influence on a scale that has been unprecedented in human history.
Welcome to the Irregular Warfare podcast. I'm your host, Ben Jebb, and my co-host today is
Adam Darnley-Stewart.
Today's episode examines how states weaponize strategic narratives to control the information
space and achieve their interests.
Our guests begin by addressing how recent changes in the information environment have
enabled revisionist states to wage information warfare.
They then talk about how states like the UAE, Russia and China
are able to leverage networked information systems
to spread political chaos within civil societies.
Finally, our guests conclude with a discussion
about how autocratic regimes are exploiting vulnerabilities
within constitutional democracies to sow discord
and what liberal democracies and societies can do
to gird themselves against foreign
information operations.
Dr. Andreas Krieg is a senior lecturer at the School of Security Studies at King's College
London and a fellow at the Institute of Middle Eastern Studies.
He has spent over a decade throughout the Middle East and North Africa, working with
regional policymakers and community leaders and studying the pernicious effects of information
operations.
Dr. Krieg is the author of Subversion, the Strategic Weaponization of Narratives,
which serves as the anchor for today's conversation. Dr. Andrew Whiskeyman is an associate professor at the National Defense University's College of Information and
Cyberspace, where he teaches on the topics of leadership, disruptive technology, and information
warfare.
Before entering the halls of academia as a professor, Dr. Whiskeyman served in the U.S.
Army for over 27 years and was the chief of the Information Operations Division within
U.S. Central Command.
You are listening to the Irregular Warfare Podcast, a joint production of the Princeton
Empirical Studies of Conflict Project and the Modern War Institute at West Point, dedicated to bridging the gap between scholars and practitioners to support
the community of irregular warfare professionals. Here's our conversation with Dr. Andreas Krieg
and Dr. Andrew Whiskeyman. Andreas, Andrew, it's great to have you both on the show today,
and thanks for joining us on the pod. Hey, fantastic to be here. Thank you.
Thank you for having me. Great to be here.
Right. So this episode centers around Andreas's new book, Subversion, The Strategic Weaponization
of Narratives, which looks at how states exploit narratives to achieve their strategic interests.
In other words, it examines how malicious state and non-state actors take advantage of a chaotic
information space to both sow discord and achieve their interests. So just to kind of start off, Andreas, what motivated you to write this book?
A lot of different things, really. The actual trigger into looking into that part of cyber
warfare was the Gulf crisis of 2017, where I felt that the substance of this diplomatic crisis
between several Persian Gulf countries, Saudi Arabia, the United Arab Emirates, and Bahrain, which were all targeting and ganging up on Qatar, was a crisis that wasn't really founded on any substantial issue-based discord, but really was more about narratives and was proliferated for many, many years in the information environment. And I was quite astounded and astonished, really, by how narratives were
traveling and how they could be manipulated in a way and how information networks that all these
countries were putting together to defend themselves, but also to attack each other,
how these information networks were curated, orchestrated, and then used to also change not
just public opinion in the region, but also public opinion in Washington, London, Brussels, Paris,
and change even policy-relevant discourse in the era of Donald Trump. And that's kind of got me really interested in how this works. And obviously, as you do, you go from one extreme to another and
ended up with Russia and looking at Russian information operations on psychological
operations as well. And that kind of triggered me into writing this book.
Perfect. Thanks very much for that, understanding that it's actually a very interconnected world at the moment, and subversion and information operations are one really good lens to start diving deep into the detail. Before we get into the meat of the book, I'd first like to ask Andrew
how the information environment has changed over the past several decades. Information warfare has
always been present in conflict, but it seems like both state and non-state actors
have really invested heavily in information operations,
specifically over the past two decades.
Andrew, could you help us understand why this might be the case?
That's a great question.
We have to put it in context as well.
We're looking back at a long history of the development of information technologies. But thinking through that, I would offer it's helpful to go back to the printing press, if you take Gutenberg and an ability to communicate ideas on a mass scale, through the development of radio, really into the early 1920s of the last century, television into the 1940s, and then the internet, really the 1990s.
Each of those inflection points was critical in terms of development and theory of mass
communications and influence.
But in particular, if you look to the confluence of events really from 1989 to the present, that is really where I think we see a hyper-development and a hyper-specialization
in that. And a couple of countries, in particular China and Russia, have been investing very heavily
in the information space and the development of the technology and the access within that.
If you look to early November 1989 and the fall of the Berlin Wall, and Fukuyama's end of history theory, not to shrink his whole theory into a small one-liner, but that concept that liberal democracies are the high-water mark of political development.
I would argue that China and Russia don't look at it that way.
They look at the role that information played in the demise of the Soviet Union and begin at that point investing very heavily.
So you end up several decades later where we're at right now, and you have on a global scale investment by autocratic regimes in mass media and an ability to influence on a scale that has been unprecedented in human history. Now, other nations, like Andreas
works through in his book, it's absolutely fascinating, have been using this art form
coupled with developments in technology, while the West has largely, I would argue,
been asleep in this space because of assumptions it's made about progress of development. And so
that's the space that we're in right now. The advances in technology,
the investments that other nations have made from an asymmetric approach to gain an advantage,
and an abdication of engagement in the information space by the West has led to
the spot that we're in. I absolutely agree. I think what I would add before we get to talking
about cybernetics is just trying to understand ontologically as well, like how I see the information environment and I think how we need to understand the information environment.
Because much of how we do it in the Western world and how we teach cyber and information security is we always look at the information space.
And most people think it's purely based around the digital social media environment, when in reality, it's obviously a much more complex environment that really includes everything and everyone,
because everything can be used, can be weaponized in order to change how people perceive and see
particular things. Hence why I talk about weaponized narratives, which is about weaponized
storylines that we kind of share with one another and how we kind of make sense of the world around
us. And I would look at this information environment as a network-based, web-based sort of environment, which is very decentralized, where things are
not linear, but extremely complex, where different nodes compete with one another,
and where it's very difficult to actually manage and control information flows. And I think the
idea that many in the military, most people in uniform still have, is that they think they can
control information flows. Even authoritarian regimes, and I completely agree with Andrew, authoritarian
regimes are obviously on the forefront of that. They have learned how to reverse engineer what was
at one point perceived as liberation technology and make it work for themselves. And they have
gotten to the next level. And obviously, we in the West, you know, liberal countries, liberal states
have found it very difficult to do that without actually infringing on some of the core values that we cherish so much. And the problem
of that is that, A, we don't have a grand strategic narrative. We're not very good in building
narratives. We think that, you know, going back to the point about Fukuyama, I think it's the end of
history and we've won. The West has won the Cold War. And there's been this hubris of trying to
think that this will go on forever. We completely disregarded how smaller states, and you know,
my book looks at the United Arab Emirates, smaller authoritarian countries
who are in full control of the information environment, full control of the narrative,
how they can reverse engineer liberation technology and make it work for them in
liberal societies to change how people look at particular things. And the key ingredient for
good information operations, that's kind of the point I'm making also in the book,
is information networks, how you build, curate, orchestrate networks to advance your narratives
and change how target audiences more widely look at particular issues. And also it's an information
environment where this type of conflict doesn't have a beginning or end. It's not like we're in
this competition for a short period of time. We're in this competition for the long run.
It will not have a start. It will definitely not have an end. We have to constantly engage. And then that's
another problem for most Western militaries in our strategic mindset and our strategic thinking.
We kind of find it very difficult to really engage in an operation without actually knowing where
it's going, not knowing where it's going to end, not having very tangible objectives, and then
really staying indefinitely committed to what are essentially protracted
conflicts in the information environment. Yeah, and that's some really key points there.
I'm in full agreement with the challenges with militaries, particularly in the West.
Now, some of that is due to the separation, having the military not running the government,
which I would argue is a good thing. I don't want a military dictatorship running the government.
But that linear thinking in terms of causality and the spatial constraints of thinking that conflict
is merely defined during a war, and there is such a thing as a victory, has been problematic
in terms of militaries writ large. And I would argue more broadly, at least on the U.S. side,
the government recognizing this is competitive space. Now, Dr. Everett Dolman, who was out of the Air War College, wrote a book, Pure Strategy is the title of it. And in that, paraphrasing, he says, strategy is the art of maintaining a continual advantage. Well, when you juxtapose that against what, for instance, the Army War College teaches broadly, that strategy is ends, ways, and means, you see quite a difference between thinking about a continuity or a continuum of
conflict. There's an end state where there's victory and we go home. There is no victory.
And it's not linear. So I really want to get into some of the issues we just got into to include
discussing why liberal societies are struggling with information
operations. But Andreas, earlier you mentioned cybernetics, and let me apologize in advance for
a long question, but I do want to get into defining a few complex terms from your book,
just so we're speaking the same language. So you mentioned the field of cybernetic research,
which looks at how actors deliver information to the target audience and
gets them to accept the message. And second, you discuss the concept of subversion, which I should
note differs from espionage and sabotage, and basically how subversion taps into existing
insurgencies. So could you delve into these two concepts and explain how they interact with one
another in the information domain? Yeah, happy to do that.
Obviously, this is conceptually very dense and it can be somewhat dry.
And obviously, that's what you do as an academic.
You kind of sit in your ivory tower and have to define concepts.
The concepts don't really matter that much.
So the way I looked at influence, and this book is all about influence,
and that's what warfare is all about.
I always look at it substantially about making your opponent do something
that he or she otherwise wouldn't do. And it's about,
you know, a contest of wills where you're just trying to get into someone's head.
And for most of history, we've always looked at coercion as a way of actually getting people to
do something they otherwise wouldn't do. And so subversion is somewhat the antithesis to that,
where I'm saying, if you actually look what Russia is trying to do, if you look what China's trying
to do, you know, even what the UAE are trying to do in their fairly limited space,
it's not about coercion necessarily, although there are coercive elements to it,
but it's actually trying to really build sometimes consensus, or at least the illusion of consensus,
that people do something that they naturally want to do or think they want to do
without them ever reflecting upon why they did what they did.
And that is essentially something that came out of cybernetics. Cybernetic research, obviously, you know, there are different phases of cybernetic
research, but cybernetics is about how systems work and how you can get systems to work.
And networks, if you look at the information environment, as Andrew was saying, a network
of different actors, social media, you've got your think tanks, you've got your media outlets,
you've got your policymakers, anyone who's really involved in human to human interaction, they're all part, they're all nodes
in the information network. How do you get them to do something that you want them to do? How do
you kind of install in them a particular perception that makes them act in a certain way? So you first
want to obviously change their attitude, but later on, you also want to change their behavior.
And I would kind of juxtapose subversion to coercion and compulsion as a kind of continuum. Obviously, subversion can have elements of coercion in it. And when I talk about Russia, and particularly when I talk about the UAE, that is a very, very rich state in the Gulf; it's the greatest funder of research and think tanks in Washington, for example. And they all do this as a matter of really penetrating the policy discourse
around the Hill. The way they've done it, much of it is based on coercion because it's very
transactional. They put money into think tanks, thereby changing the research agendas, changing
the narratives that these think tankers use, change the recommendations they make to policymakers, change how they engage with staffers, change how policies are essentially
being made later on.
And that is very coercive.
The best subversive or the best information warriors, as I would describe them in the
book, are those who actually can change consensus.
Because if you can change consensus, people never reflect upon what they do and why they
do what they do.
And it's very basic social psychological studies. And I would give an analogy to this.
If you wanted someone to go through a door, coercion would be the one where you pressure
someone, you actually force someone through the door physically. Compulsion would be the one where
you give someone an incentive, you incentivize them to go through the door. Consensus is when
you put a fire exit label on the door and then shout fire and people run through the door. In the last instance, those people who are
consensually going through that door because they thought they're running away from fire will never
reflect on how they changed their mind. And that's kind of where you want to get with good subversion
operations, where you change the attitudes, you change the ontology, the way that people look at the world intrinsically
over time, that they are becoming conducive and receptive to your will. That's kind of what
subversion essentially is trying to do. Subversion can entail elements of sabotage. It can entail
elements of espionage as well that then can be leaked to manipulate how people think about a
particular issue, coerce people to change their
attitude or change consensus of people and their attitude on particular issues. So it is a very
conceptually dense area, and I'm not always sure that it helps, but subversion is really trying to
get someone to do something that they otherwise wouldn't want to do. And getting them to do this
without them feeling they were pressured into changing their
attitude or their behavior. And that key piece in terms of the level of sophistication involved
with actors who engage in subversion, I think is worth delving into just a bit. Too many leaders, tying back to an observation Andreas had made about linear thinking, particularly in industrial-age militaries who are now facing into the digital age, hold the expectation that it's somehow easy, the expectation that subversion really is a low-cost effort. Now, in terms of resourcing,
no, I don't necessarily need to build tanks and bombers and aircraft carriers. But there is an investment
that's required in this. There's a need to understand one's adversary or the networks
and how they interact. That's an intelligence requirement that needs to be resourced and prioritized. And then in terms of building those networks, there's a time element and a trust
element. Because to be able to manipulate, which is really the heart of subversion, it takes some time and effort to be
able to do that. And oftentimes, having built out, I look in particular, you know, you mentioned the
think tanks, Andreas. And yes, absolutely. I've looked a little bit more recently at what China's done with that, what Russia's done with that. They have invested. So that compellence piece or the coercion piece of adding money or withholding money, the Confucius Institutes that were in the news not all that long ago with China, that's a coercive element of finance. The manipulation piece comes in, though,
as that almost gets laundered through: those think tanks have a verisimilitude of trust
that they are somehow
independent, doing independent research, unbiased in the questions they're asking,
the methodology they're using, the data that they're using, the conclusions that they're coming
to. And so, as that proverbial spoonful of sugar helps the medicine go down, they're wrapped in that element of truth in terms of being easily digested by broader publics and other networks, and that then gets propagated across those and almost laundered, in a way,
in terms of the information space that now has a crescendo of believability.
That's when you reverse engineer to get back to those basics of those investments that
are being made.
I would offer that also helps for the West to think through ways to protect itself or
inoculate itself against the onslaught of attacks that it's under in the information
space.
Well, if I could add to that, again, looking back at the network element of it, is subversion
operations are very complex and they require a huge element of orchestration.
And I'm kind of trying to look at this; it's obviously very difficult to measure the impact, to say what sort of information operation actually is subversion or to what extent it is actually subversive. I find it very difficult to measure. And anybody who comes up with an objective metric
of saying, okay, this is now a subversion operation would lie to you. But what I've
tried to do is kind of putting it on a continuum where on one hand, we've got an element of
orchestration, how much orchestration is required to generate that impact. And the other one is mobilization effect. To what extent were people actually mobilized
or demobilized for that matter? And the orchestration element is the difficult one.
Because just to think, I'm thinking at some of the more tactical operations that militaries are
involved in. Also in the UK, we've got, you know, the 77th Brigade, for example, that do information operations, but they do it mostly at a very tactical and very operational level. And, you know, they're trying to generate likes and
engagement with particular operations and likes and engagement per se doesn't really translate
into impact. I think it really matters to what extent are you able to mobilize? Who are you able
to mobilize? And does your operation really depend on a lot of different interacting, sometimes simultaneously
executed activities?
And then how do you bundle all these activities into a final outcome?
That element of orchestration is very complex.
It requires strategic patience.
It requires a huge army of useful idiots, as I would call them, people who are willing
to become your pawn.
And if you look at the UAE in particular, because they have a freedom to maneuver in a way that Russia or China doesn't have because they don't register as a competitor,
they can go in and Washington has a great case study for that. They can go in using shell
companies, intermediaries, American intermediaries who you think seemingly have no connection to the
UAE to basically bring money to universities, bring money to think tanks, bring money to
staffers, or at least information to staffers, and thereby change subtly the discourse over a period of time.
But there is a lot of orchestration involved in actually getting to that final outcome. And it
might be cheaper than buying an aircraft carrier, but it's not cheap. It's not free of cost or free
of charge. No, it's not free. And the interesting irony I find in this, and the UAE is a great case study. It's a great example of really flying below the radar in terms of being a friend, but yet manipulative. That's why I think it's particularly useful as well to examine. It's not just Russia and China that are engaged in this. There are multiple actors that are engaged in this. And really, there needs to be a healthy skepticism amongst policymakers
in terms of the information that's being consumed. But more broadly, you look at the American,
in particular, collection of entities and groups. It's not a new challenge.
Edward Bernays, a pioneer in terms of propaganda from the 1920s in the U.S.,
in 1941, he delivered a speech to a press club, and I could erase
Soviet Union from what he said and write in UAE, China, Russia, Iran, for that matter,
in terms of the challenges that the U.S. in particular face with its diverse groups
and the fracture points that are easily exacerbated by foreign actors. And you look at what in particular
Russia and China are doing in the US, you look at election interference, that's what they're going
to. Yeah, absolutely. There are some far right groups that are being fueled by this. But there's
also groups on the left, there are groups in the middle. There's multiple groups that are being fed
a narrative they're already attuned to receive. They're just now getting it at a higher volume
and it's looking like they have a bigger group with them perhaps than they really do, which then
leads to actions. They feel emboldened, I would offer in terms of that. Thanks Andreas and Andrew
for both trying to ratchet back some really complex concepts and subjects down to something
tangible that I think our audience will appreciate and we'll begin to understand better how influence operations
integrate within the rest of the irregular warfare spectrum. I think maybe we might deep dive briefly
before we move on to the next set of questions, specifically around the topic of plausible
deniability. I'll throw it to you first, Andrew. In the age of information proliferation, how do you view states maintaining
plausible deniability while also building attribution profiles? Yeah, great question again.
Plausible deniability comes in a number of ways. One is through obfuscation, right, of sources
of the information. So the more layers you have it go through, or as I called it before, laundering the information, the more you have wrapped that lie in the element of truthful or trusted sources, the more difficult and time-consuming it is to really trace it back to where
trusted sources, the more difficult it is and time consuming it is to really trace it back to where
it came from originally. So once you've got a group and you have them trusting a source,
they may not even believe
you if you have the truth as to where that's come from.
So your risk lowers even further in terms of being found out in the space.
And there just, in the information space, tends to be a much higher threshold to getting angry about it. If you had a foreign actor on your soil, a physical act of sabotage is much more visceral than some intellectual act of sabotage.
It just doesn't seem to register the same sort of reaction.
And so there's lower consequences in the space.
You look at cyber attacks, there's a much lower consequence in terms of what's going on in cyberspace than necessarily goes on in one of the physical domains.
Not that cyber is not physical. I got that.
But in terms of the conceptualization as well, it's just ones and zeros.
Nobody got hurt.
So a couple of bots died.
That threshold, I think, emboldens actors other than the West to take more risks in the space and really, really try to pour resources into that, because they're seeing that they're making gains or potential gains for relatively low cost, or even if they are caught, relatively low to no consequences.
Andreas, you in your book basically say that subversion is warfare by other means if it
meets three specific criteria.
Could you define and explain what those criteria are?
What I was trying to get at is that we need to have a different degree of urgency,
because a lot of books that were written over the last 10 years make that argument that
cyber war will never take place because it will never be war. It's just some activities that we
do. It's just new technology, but it's kind of information operations just revisited in the way
that we've always done it throughout history. And I kind of said, actually, if we look at what has been achieved, what can be achieved,
and looking at certain parameters, you could make the case that this is warfare by different means,
and potentially even similar means, because it can be violent. And I kind of said, if all three of the criteria are met, we should consider it equal to an act of war. I mean, it's all about mobilization.
Subversion is about trying to mobilize or demobilize an audience. And mobilization can,
in its most extreme form, be fairly violent, even if it's just as a ripple effect of that engagement
by the information warrior. And kind of what I'm getting at, first of all, it needs to be violent
for it to be subversive. And violence can be political violence, but it can be also physical violence. I would also say, to go back to Andrew's point, it's not just about disinformation,
because disinformation is fairly easy to spot. Disinformation is something that we could
potentially flag up. And that's why I kind of stayed away from it. When I started the book,
I thought it was about disinformation-based narratives. But actually, it is about weaponized
narratives. And a narrative doesn't have to be untruthful. It can be truthful. It can be a spin on the truth. And that's why it makes it so difficult to engage with. You can't
flag up a weaponized narrative because there are different viewpoints on particular matters. And
it's not the disinformation that necessarily mobilizes, but if you keep on hearing a spin
on a particular truth over and over and over again, it will change. Particularly when you're in
an echo chamber on social media, where we're exposed to the same spin over and over again,
it can mobilize you. So mobilization is important. Violence is important. The second element would be
it has to be political. So as the Clausewitzian statement of saying that war is a continuation
of politics by other means, that kind of still has to ring true if we're saying this is an act of war. So we need to have an information warrior
that is politically motivated in one way or the other. There are occasions, particularly in the
cyber domain, where we've got these lone wolf guys who sit in some basement somewhere in Southeast
Asia, for example, and create a worm or a virus that goes viral. We've got people who come up
with their own, just for the fun of it, viral information campaigns. But if there is no political motive behind it, I don't think we could declare this
an act of war. So information warriors usually are states or state surrogates who work on behalf
of the state for a political agenda. And the third element is, and I think that's kind of
important when we move into the next space, it has to interact with humans and human wills.
And we see a lot of that. I found that out during
the Gulf crisis as someone who looks at the Gulf. I was exposed heavily to bots and trolls who I
engaged with and then only later on realized, actually, these are all automated. They're very
important in transporting the narrative. But if it stays within the virtual domain, it doesn't
spill into the physical domain, and most of the interaction actually takes place between bots and automated trolls, then it can't be an act of war. So it needs to kind of resonate and interact with human cognition.
Andreas, you mentioned those three big criteria, and it just occurred to me, I mean, mobilization of violence, it must be political in nature, and it must interact with human will. Noting we've had a few discussions previously on the podcast directly related to terrorism, how do those three criteria differ in this context from what would traditionally be viewed as the three criteria for terrorism? Or doesn't it? Is it a subset of it? Over to you, mate.
Well, it doesn't have to differ. Again,
terrorism in itself is a great narrative, right? I mean, there's so many different viewpoints of what actually constitutes terrorism. I do think though that because terrorism usually
leads to a law enforcement response, I think this is something that requires a military response as
well, because it goes far beyond just the law enforcement sector in terms of dealing with this
issue. And that's kind of what a lot of the literature does: it minimizes it, makes it much less urgent than it actually is, and says, oh, this is just a problem of law enforcement that we need to deal with internally. Yes,
we do have to mobilize all elements of the nation's power to kind of counter this.
Law enforcement is part of it, but it does require also some international action and it does require
some military action as well. And some of the responses that were taken by the U.S. government post-2016,
you know, when it comes to the Russian meddling in elections, were responses that needed to be mobilized on the national level, and they had nothing to do with law enforcement. They required
a military response as well, particularly when it's about targeting people overseas and, you
know, sanctioning people overseas. So I think it goes beyond terrorism because it's also being used
as a tool of statecraft.
Andrew, I'd like to talk about why combating information operations is so difficult.
So to an extent, I understand kinetic operations, right?
If there's a mechanized M3 platoon headed your way on a major causeway, you can place physical obstacles in front of them to slow them down or neutralize their capabilities
or whatever.
But that seems infinitely harder to do in a highly vulnerable information environment.
So could you talk about why it seems to be the case
that combating information operations
just seems hard or complex?
Sure.
Besides the obvious answer that it is, human cognition is a challenge: understanding sometimes what motivates people to do things.
Even if you think you have the answer, that might not have been, in reality, the reason that something happened.
So it is complex not to minimize that.
And I have five quick points that will go fast, believe me.
I grew up in the 80s with albums and records, so I date myself a little bit.
Although I think vinyl is coming back now.
It's in vogue in clubs again.
At any rate, albums came with liner notes and liner notes explain things. And so liner,
L-I-N-E-R is the acronym I'll use to talk about five quick reasons why this is so difficult.
The first is lexicon. Across the information community, there is not clarity in lexicon.
And misusing terms, talking past each other, not having a clear
professional jargon causes a lot of problems. You need to be clear in your thinking and clear
in speech as to what you mean. Information sharing is another one. This goes across the board.
You look at cyber vulnerabilities, you look at information space. There's often an unwillingness
to share information or, in the case of what goes on as you look at espionage and subversion, to over-classify items and not share with the broader communities. And so there's a real stovepiping of the information, if you will. So intelligence
sharing, information sharing is a critical challenge to why information warfare is so hard.
Normalization. Speaking from a Department of Defense lens, the information space is not normalized with the other domains. It is seen as something separate as opposed to something
integrated. Now, Andreas mentioned this: its manifestation in the physical. There are multiple
nodes where the human becomes a critical piece of it. It's not just a bot-on-bot concept. That's not necessarily in the understanding of many leaders in the West,
that it's actually a confluence of that. And so it's something that's done separately.
We're going to do the real work here by moving material around and moving forces around,
and that's the real fight. No, the fight is cognitive. Warfare has always been cognitive,
and that information piece is integral. But that's not normalized in terms of discussions, in terms of
thinking, education, training, et cetera. Expectation management. You know, Ben, you
mentioned the infantry platoon coming. And I expect a call for fire that artillery is going
to attrit a certain percentage of that armored attack force. So if I have obstacles, it's going
to slow them by a certain percentage. And I've modeled that with computer simulations, multiple iterations at a
combat training center. I have pretty good data to get me to a point of what I can expect from a
reaction to that force in terms of what I'm doing to attrit that force. They don't necessarily have
that in the information space. I find it becomes one of two poles: either it's utopian or dystopian. Either there's the magic tweet out there of 280 characters that will cause
Jeffersonian democracy to bloom in the heart of everyone around the world, or you're going to
send a tweet that's going to cause World War III. Well, it can't be both. But you've heard it said that
amateurs talk tactics and professionals talk logistics. I would add to that that experts talk information.
And we need to be thinking about that as a community and we're not.
And then the final piece that makes it difficult is risk.
And that gets to kind of that utopian dystopian piece.
There's a reticence on the part of the West oftentimes to even tell the truth because
they're afraid of what the repercussions are of putting information
out or not putting information out. Well, that risk calculus applies even to executing a cyber attack on something minimal, right? We're not causing World War III, but that risk factor of not really
knowing what happens when you put information out, not being able to model it the way we do
with an artillery round or dropping a particular bomb, that's problematic for leaders as well. And that confluence of challenges, those
five challenges in the information space make it particularly difficult for leaders within the
Department of Defense in particular to conceptualize a way forward of fighting in this fight.
Andreas, you discussed several case studies in your book to include the UAE efforts to
delegitimize foreign political parties and Russia's unrelenting attack on Western democracies.
Could you discuss those cases and explain how they illustrate
some of the concepts we've been discussing today?
Yeah, happily.
I'm quite fascinated by the UAE because they're a fairly small country,
but they have immense strategic patience.
And it's quite impressive how strategically minded they are on a lot of occasions. And they are learning from the Russians how to do it. Both of them are
authoritarian countries. Both of them have a fear of civil society. Both of them have a fear
of revolutions. Both of them looked at the Arab Spring and the color revolutions as a strategic
challenge to regime legitimacy. And this is why they see the information environment as the most important element of
securing the regime and kind of making sure that, you know, there is resilience in terms of regime
maintenance. And the information space is used and exploited, subversion is used and was used
first and foremost in both of these countries domestically to kind of demobilize the home front. And where we always look at the coercive element of it, of saying, you know,
they're constraining the information environment, which is what Russia has done, what the UAE have
done, what Saudi have done, what Iran has been doing. The old school element was always, let's
shut down the internet. Let's make it very difficult for people to mobilize. Let's really
undermine civil society by making it physically impossible for them to meet. That is no longer possible. It was barely possible even in the early days of
the Arab Spring in 2011, where countries like Egypt, for example, were trying to just shut
off the internet. But the costs that they were incurring, even for just a day, were billions
of dollars that they couldn't afford to lose. And obviously now, more than a decade on, it's
impossible for a state to just switch off the internet. So what you need to do is you need to subvert civil society.
You need to subvert the mobilization ability of civil society to kind of come and build
a consensus on a particular issue.
And, you know, this is where they reverse engineered liberation technology, kind of creating fake trends and delegitimizing and derailing legitimate discourse or organic
discourse with pro-regime
cheerleading. That's how they started off domestically. That's how the UAE started off.
That's how the Russians started off. And that capability was then exported across the world.
For the Emiratis, they exported it to Libya. They've exported it to Yemen, and they're doing it now in the war in Sudan as well, where the Emiratis have their own physical proxies on the ground that they're supplying with arms and money, but whom they also provide with information support
by basically allowing them to push their narratives and win on the cognitive battlefield, if you
will.
So they've done it then across the Arab world and the Middle East.
The Russians have done it across the Russian Federation in Eastern Europe.
And then bit by bit, as the Russians were going into Europe and the Russians were going into Western countries, into Western information spaces, the Emiratis have also entered
that information space. And some of the revelations that came out over the last two weeks under
hashtag Abu Dhabi leaks showed how the Emiratis are actually doing it. They've used an intelligence
firm based in Switzerland that was really trying to target Muslims in Europe, but not only Muslims,
but also anyone who basically
speaks for democracy and liberal values in the Arab world, and try to tarnish them and smear
them with allegations of them being terrorists. And that is something that came in support of Russia as well. Because what the Emiratis and the Russians have done is they've built organic
audiences among the far right, people who don't like the European Union in Europe, people who are
particularly anti-woke, anti-liberal, anti-democrat in the United States, more on the far right of the
Republican spectrum, and very much almost fascist in Europe. Both of them try to polarize domestic
discourse, particularly when it comes to issues to do with Islam and integration and migration,
both of them, which is quite funny, because obviously the UAE is still a Muslim country. It's a majority Muslim country. And they themselves
have kind of pushed a particular narrative of saying that if you are too Muslim, you're praying
too much and you use the mosque as a way to mobilize and using religion to mobilize politically,
we will call you a terrorist. The best case study I have here in the UK is where the UAE have not only changed discourse on Islamism and Islam, changed discourse on the Muslim Brotherhood, but also
changed discourse on Muslim civil societal groups in the UK, and then really put pressure on the UK
government in 2014 to lead an investigation into the Muslim Brotherhood in the UK with a particular
goal to ostracize that organization and other NGOs, Muslim NGOs,
and label them terrorist organizations. In the end, it didn't succeed, but the UK government
still launched that investigation. And that was only possible because the policy-relevant
discourse on that matter was changed through think tanks that the Emiratis funded,
research that was funded by the UAE, trips by UK policymakers going to
the UAE, staffers in parliament here in the UK being exposed to that particular Emirati-led
narrative. They've subverted the CVE discourse, the Countering Violent Extremism discourse,
also in the United States, by basically saying any sort of Muslim organization that is too
involved in civil society is on the terrorism
spectrum. They came up with that theory of saying, if you start off as a moderate Muslim Islamist,
you will end up with Al-Qaeda and ISIS. That kind of very simplistic narrative is one that
obviously had a lot of useful idiots on the ground. And it had specific impact on the far
right. And many of these far right groups have come to power in Europe.
So before we get into policy recommendations, I do want to discuss the tension between digital authoritarian regimes and liberal democracies. I'm sure that most of our
listeners would agree that open societies are generally a good thing.
But in some sense, they seem to be particularly vulnerable to information warfare.
So could you discuss why that might be the case and how techno surveillance states like the PRC and others are manipulating information flows to the detriment of the free world? And that's an open question, but I'll direct it to Andrew first.
Yeah, just a couple of thoughts. It reminds me of a quote actually from Ecclesiastes 1:9.
Basically, there's nothing new under the sun.
And I say that, I just reread
Animal Farm, and I hadn't read it since
the 80s. I picked it up, and my son was
reading it for a class he's taking.
And I look at the techniques that
Orwell sort of extrapolates out, telling the story from the animals' perspective.
And I see that being played out today in the information
space. Those techniques are not new.
What's new is the use of the technology that we have.
But the techniques are not new.
So when you look at the tensions between authoritarian regimes and liberal democracies, yes, they
both have vulnerabilities.
They both have strengths and weaknesses.
On a side note, it's funny that Aristotle argues, I think, that democracies are a really bad thing and we ought to have a benevolent dictator. I read an article recently where China is using Aristotle to explain why its regime is actually the best thing in the world and ought to lead. It's an interesting take on
typical Western philosophy and thought. But when you juxtapose those two, the authoritarian regime can more easily harness a cohesive narrative because it is command directed.
It is unilateral and it can laser focus that effort against perceived vulnerabilities in liberal democracies where you do allow difference of opinion.
Now, by allowing that difference of opinion, you open an inherent vulnerability in that system for two groups to clash.
Now, it may be as simple as a disagreement, a minor disagreement.
It could be something major over a particular value or belief.
It's just an inherent vulnerability.
And because of that, and because of the reach of technology today, I think you see that exacerbated and playing out before us in time and space.
But those are natural to the systems themselves.
But recognizing, you know, knowing yourself and knowing your enemy, paraphrasing a quote from Sun Tzu, right?
We oftentimes don't know ourselves.
We don't recognize that what we trumpet as a strength also has vulnerabilities.
So we need to be thinking about how we protect those vulnerabilities.
Again, it gets to that, what is a reasonable level of foreign interference in internal affairs?
Completely agree with that point Andrew is making.
I would say also that for an authoritarian, any sort of information that's freely flowing is
already inherently subversive. So for them, the threshold of saying this is subversive or not
is extremely low. For us, it is very difficult to really find that threshold. And we're not
really having a discussion over it, although we should. And obviously, no one is ever going to say that we need to infringe on our fundamental freedoms and civil liberties. Obviously not.
But we need to retain a degree of agency among our actors and nodes in the information environment
where we say discourse has to be organic.
A degree of organic discourse needs to be retained
for it to retain its integrity.
And agency is a very important element.
To what extent do we still have agency
when our information environment is heavily polluted?
And I feel that
with social media companies in particular, for them, information is something that they use to
extract profit. So virality is something that they need in order to advance their profit.
Twitter, for example, is an interesting case study because it is so integral to our discourse,
despite the fact that there are alternatives out there. But since Elon Musk has come in, we've seen that this platform, which I think is a very important
point for kind of shaping discourse between media, academia, and even practitioners,
which was an environment which was always algorithmically curated, is now overly curated
by algorithms in a way that lends itself to polarization. And we've seen that
already on Facebook and on other social media platforms. The entire surface of the information
environment is not owned by states. And from this come a lot of vulnerabilities. And I'm not
suggesting we should clamp down on it. I think we, you know, in the Anglo-Saxon world, have always found a way to manage it without the state interfering in it. Well, you know, China, Russia, the UAE are heavily managing and controlling the information environment, even though they don't own the
surfaces. We see also that European countries on the continent, in particular Germany and France, have passed legislation to kind of contain and constrain free information flow, using hate speech,
for example, or libel as ways to kind of manage and moderate the flow of information,
which is an interference in the information environment that I would not readily accept.
I think the US and the UK have always said, you know, we need to find a way for the system of the information environment to regulate itself, which is great as a philosophical concept. And
I'm standing for it. But I think what we need is a multi-stakeholder platform where all the stakeholders in the information environment could come together and agree where that threshold lies: how much interference we are allowing ourselves, how much inauthentic or inorganic discourse we're willing to accept and tolerate, and how we manage information flows. And again, it's not about disinformation,
because disinformation could be a fairly easy thing to flag up, although there's loads of research that shows that it doesn't actually work, as Andrew was alluding to already.
But the problem is with weaponized narratives, we want to retain that plurality of different
opinions. And in order to do that, we still need to find a management mechanism. So here in the UK,
we've got Ofcom, which is a regulator of the information environment, not owned by the government but kind of a multi-stakeholder platform to do that. And it only deals with media,
you know, traditional broadcast media. Now, why don't we expand something like this to also
include other elements of the information environment, to retain its integrity and make sure that discourse is organic?
That's a natural segue. So to wrap up, I'd like to ask Andrew, based on today's conversation,
what are the relevant implications for policymakers, academics, and practitioners
who work in the information space? So first for policymakers, don't be afraid of information.
I've seen too often on the U.S. side a fear of
saying something that might be the wrong thing and therefore not engaging. I think that's
absolutely the wrong approach. You need to work through the right way to engage, the right level
to engage, but you must engage. You can't cede the information space any more than you can cede
cyberspace. I mean, the U.S. created the internet out of ARPANET,
right? But it has seemingly ceded that entire terrain to adversaries to do what they want with, with very little on the U.S. side demonstrable in terms of the levels that we're seeing from China and Russia. You can't cede the space. It doesn't go away. You can't bury your head under the pillow,
right? From academics, it's really a multidisciplinary approach.
I think we have failed in some respects with niche specializations rather than general issues, in terms of being hyper-focused.
Yes, it is important to study cybersecurity and to study the technology.
Yes, it is important to study narrative.
Yes, it is important to study military movements.
Yes, it's important to study X, Y, Z, for my British comrade. But bringing that group together,
I almost think of the Santa Fe Institute with its complexity theory studies. How do we bring
together a multidisciplinary approach to look at this? And that's something any academic
institution could sponsor. Hey, we're going to host a conference on this. We're going to start publishing some papers on it that are
peer-reviewed across a number of disciplines, to start building the ability to trust some sources of what it is that we're looking at. Not that we don't all have biases, right?
But it brings it back closer to the middle, going back to Aquinas or Aristotle in terms of the middle way of this. And then finally, for practitioners, get smart. You can't outsource
this. I can't tell you how many times I hear certain leaders say, well, you know, digital
natives can do this, and I just don't know. That is unacceptable. I'm sorry. Full stop. It is
unacceptable. You must start to understand capabilities, limitations, and
employment principles of information. You don't have to code Python to understand how to use
something in cyberspace. You don't have to be a master communicator and witty on your tweets on
Twitter to understand the impact of information, but you have to be thinking about this. And from a practitioner's perspective,
that gets to demanding it be in exercises, it be in professional education, it be courses in
universities, and it be played out in terms of staff organizations and resources.
If we want to take this space seriously, you have to be serious about it.
I wholeheartedly agree with Andrew, particularly on the last point. I don't know how many times I hear from people that, oh,
I don't know how computers work and I'm not tech savvy, so I can't really care about cyber.
But, you know, we didn't stop thinking about strategy in the 1950s just because we weren't all
nuclear scientists. And that's kind of the problem that we face today. People feel like this is a
domain that nobody understands, although we're all involved in it. All academics, all practitioners, we're all part of it. We're all
members of that information space. I think the question is, how do you build resilience? The
answer is resilience, but true resilience in a way where we're able to bounce forward rather
than bouncing backwards when it comes to responding to these kind of interferences.
On the academic side of things, I think we do need to have more scrutiny on where money comes from
and how it impacts the integrity of the research that's being done. Journalists as well
and social media companies, they need to be part of a discourse. They need to be brought in. They
need to understand and appreciate what they're doing and how that impacts the information
environment. And for practitioners, you know, we need to think outside the box,
particularly when it comes to state institutions,
because we need a whole-of-nation approach to this. This can't be run only by the military; it can't be run by state institutions alone. I think we need to build counter-networks to counter networks. So if you want to operate in a network-centric space, you need to use networks to do it. You can't have a hierarchically structured institution operating to do that.
That doesn't mean that we need to get rid of the military. But those units that are supposed to deal with that threat need to organize in a
completely different way. We need to think strategically in a completely different way.
We need to think about horizontal integration, not vertical integration, when we actually do
these sort of things. We need more mission command than we already have in the military space,
which allows people to take risks, not just on the tactical level, but operationally, strategically. If you look at the Russians, the Emiratis, the Chinese,
when they are successful, they've probably failed 99 times to succeed once. And the problem is the
military mindset today and the mindset of most institutions who are conducting statecraft for
Western liberal democracies, they still measure victory in absolute terms. They think like any
input needs to have an immediate output, any effect needs to be measured. And that's not how the information
space operates. A lot of it is coincidental. You need to have the ability to try and also to fail.
And that's not in the military mindset. And that's why I think situating any of the response units
in the military context is just the wrong way of dealing with this. They do have a role to
play, but the strategic side of things can't be run by the military. It has to be run by a
multi-stakeholder entity that can really mobilize the whole of the nation to counter these kinds of interference in the information space.
Andrew, I really like your point about not having to be fluent in Python to understand the information environment, because I'm not a big coder myself. But gentlemen, that was a fascinating conversation on how actors weaponize information, so thank you so much for joining us on the Irregular Warfare Podcast today.
Thanks for having us.
Absolute pleasure. Great discussion.
thank you again for joining us on the Irregular Warfare podcast.
We release a new episode every two weeks.
Next episode, Ben and Lisa will discuss competition in the maritime domain.
Following that, Laura and Ben will air a special episode
with Lieutenant General Retired Ben Hodges and Ravi Agrawal on the war in Ukraine.
Be sure to subscribe to the Irregular Warfare podcast so you don't miss an episode.
The podcast is a product of the Irregular Warfare Initiative.
We are a team of all-volunteer practitioners and researchers dedicated to bridging the
gap between scholars and practitioners to support the community of irregular warfare
professionals.
You can follow and engage with us on Facebook, Twitter, Instagram, YouTube, or LinkedIn.
You can also subscribe to our monthly e-newsletter for access to our content and upcoming community events.
The newsletter sign-up is found at irregularwarfare.org.
If you enjoyed today's episode, please leave a comment and positive rating on Apple Podcasts or wherever you listen to the Irregular Warfare Podcast.
It really helps expose the show to new listeners.
And one last note, what you hear in this episode are the views of the participants and do not represent those of Princeton, West Point, or any agency of the US government. Thanks again, and we'll see you next time.