Angry Planet - The Hard Limits of Cyber War and Subversion Operations
Episode Date: July 2, 2024. Listen to this episode commercial free at https://angryplanetpod.com/
Influence campaigns, both subtle and unsubtle, are as old as statecraft. Agencies like the CIA, KGB, and Israel’s Mossad have all... attempted to force friends and rivals to change. It doesn’t work as often as you’d think. Subversion campaigns are often so secretive that their effectiveness is hard to quantify. But Lennart Maschmeyer decided to try. Maschmeyer is on this episode of Angry Planet to tell us all about the limits of cyber war and subversion operations. It’s the subject of his new book Subversion: From Covert Operations to Cyber Conflict. Maschmeyer is a senior researcher at the Center for Security Studies at ETH Zurich and his book is a deep look at what works and what doesn’t when countries try to influence each other. It throws cold water on Russia’s much-hyped “Hybrid War” and the idea of a cyber Pearl Harbor.
Support this show: http://supporter.acast.com/warcollege
Transcript
Love this podcast. Support this show through the ACAST supporter feature. It's up to you how much you give, and there's no regular commitment. Just click the link in the show description to support now.
Can you introduce yourself and tell me the title of the book real quick?
Sure. Yeah, my name is Lennart Maschmeyer. I'm a senior researcher here at ETH Zurich at the Center for Security Studies.
And the book I just published, and that we'll talk about today, is called Subversion: From Covert Operations to Cyber Conflict.
It's a really wonderful book, and I'm glad it kind of comes out at this time when I have been thinking a lot about what the effectiveness is of information campaigns, what effect art and pop culture have on us, which is not what this book is about.
But in a lot of ways, I saw it as an attempt to understand how effective this stuff is.
And so being able to read something that really dove into that was extremely helpful for me, and I really appreciate it. And I think the case studies and the history that you go into are pretty interesting.
Here up at the top, can you define subversion for me?
Yeah, sure.
I think it's a good question to start with too,
because it's really a term that's thrown around a lot,
but it's very rarely clear what's meant by it.
So I read a lot of the Cold War literature to prepare and build the foundation for this book. And if you read that, subversion is usually, if it is defined at all, defined by a goal, and that goal is overthrowing a regime,
which makes sense because that's what the two Cold War powers were doing to each other,
or not to each other, but to kind of proxy states, right?
To everybody in between them?
Exactly. But if you think about that, if the goal is overthrowing a regime, you can also do that with military force, right? It's not really clear what's distinctive about subversion. Overthrowing Saddam's regime, that was the goal of Operation Iraqi Freedom in 2003, right? So my starting point was to really nail down what subversion is. And there, I realized it's not about the goal itself, but about the mechanism, right? The way you achieve that goal.
And that's really what makes it distinct.
There is the mode of operation and then the mechanism by which you do it.
The mode of operation just means that it's secret,
which already puts it kind of in the world of covert operations.
It's one type of covert operations,
but it's also indirect because it's really going through your adversary's systems indirectly.
Rather than, you know, projecting your military capabilities, using your own stuff, basically throwing your stuff at your enemy, which is usually how warfare works, here it's different. You use something that belongs to your enemy to hurt your enemy
with it. And the way that works is by infiltrating adversary societies traditionally,
finding some vulnerability, some weakness, finding a way in, gaining access, and then manipulating
society from within or organizations, institutions at a smaller scale to basically do something
they're not supposed to, something that serves your interest as the sponsor of the operation and harms your adversary, right? It's very abstract, but that's how the mechanism works. Traditionally, in practice, states have done this for ages.
You use spies, undercover spies that infiltrate your adversary's society, wherever you need them, under false identities, and then start, you know, gaining access to the right people, or maybe equipment, or whatever it is that you need to manipulate, and then manipulating it without, ideally, alerting the victim to what's going on.
And now the difference is that states can also manipulate computer systems, hack into systems, make them do things they're not supposed to.
Obviously, it's a different system.
But this underlying mechanism of finding some vulnerability,
exploiting it, manipulating a system to do something it's not supposed to,
that works ultimately the same way.
That was the starting point. That was kind of my aha moment building this whole theory here.
That means it's also useful to look back in history a bit
and understand what subversion has been used for, what it can and can't do.
Why did you get started in on this?
Why were you interested in subversion?
Yeah, I've said a bit about that already.
For me, when I started this project,
seven years ago, I was a PhD student in Canada.
And like many at that time, I expected that information technology, you know, hacking, cyber operations, would really revolutionize conflict.
That this is a new tool that states have that can achieve things
they couldn't previously achieve.
So what I wanted to do was write a book about this revolution.
What I have ended up doing is writing a book about subversion that shows that this is actually
not a revolution.
So, and then what got me interested in subversion, as I said, was that aha moment, right?
Because I started out from warfare and I was thinking about what changes in warfare.
But then I was thinking of the mechanism, how do cyber operations actually produce outcomes
that matter in politics.
and it's a very different mechanism from force, which is the tool you use in war, right?
You basically produce violence, hurt your enemy,
kill people, destroy things,
as efficiently, effectively as you can.
But there's more and more evidence coming out showing that that's really what cyber operations are not very good at.
And it's kind of, it's obvious, right?
Because cyber operations manipulate computers
that are not generally designed,
at least not in the world we live in today,
to kill and hurt people and, you know, destroy things.
So to still do that through computers is really hard.
As one infosec person, I think, put it on Twitter: if the thing was not built to go kaboom, it's very hard to, you know, make it go kaboom.
So that's, that was the starting point.
And then I was looking into this: if the analogy here, and the parallels, are to these subversive operations rather than warfare, if that's the type of conflict that cyber operations are relevant in and that they can kind of reproduce, then the question is obviously, what does new technology change about this type of conflict?
So that's what I then started focusing on really systematically in the book.
Yeah, it's fascinating, because as I was reading it, I was thinking about 10 years ago. And you definitely go here in the book. People who were paying attention were very frightened of Russia: oh, they're pioneering this new way of war.
There's this thing called hybrid war that they're doing and, you know, they're running
subversion campaigns in Estonia and we think about the early days of their invasion of Georgia
where they cut off the internet.
But to your point, this hybrid war thing does not seem to have been an effective form of political control over their neighbors, right?
The ultimate example, of course, would be Ukraine, where now it has led to a ground invasion.
If they were in earnest attempting like some sort of subversion campaign to change the country from within, it didn't pan out, right?
Yeah, which is really ironic because Ukraine was the kind of case that really gave rise to this whole idea of hybrid war, right?
2014, basically, when Russia took over Crimea very quickly in this kind of foggy, subversive operation and then a referendum, there was this kind of panic among a lot of Western analysts.
How did they do this, right?
How did they achieve this feat?
Caught everyone off guard?
And, yeah, the reaction was that, well, this is a new type of warfare, linked, I think, to a large extent to similar expectations about cyber operations.
That's a theme that really comes out in the hybrid war literature. On the one hand, it depends, right? There are really varying levels of excitement or panic about this if you read those texts. On the one end,
there are people who just ignore all of history basically or, you know, the fact that covert
operations have been around for a long time and just say, well, this is completely new. They're not
just shooting missiles, but they're doing, you know, all this kind of nefarious activity. But most of
the people who wrote on the hybrid war stuff, they are aware of this, right? I mean,
using subversion, using covert operations, also secret warfare, disinformation campaigns, and so on.
None of those are really new tools, but there's new technology to basically run these kind of
operations.
So the expectation is, or has been at least throughout the literature, that with the new technology this is now a more effective, more intense means of achieving your goals as a state short of going to war.
And as I said, Ukraine was really the paradigmatic example
for that. That's also the reason why I chose that case in the book. And it is ironic because Ukraine
really proves that expectation wrong, in the most horrible way possible, now that, as you said, right, it's clear in hindsight. It just didn't work. Russia tried to get Ukraine to change
course. Basically, in 2013 Ukraine officially started this pro-Western foreign policy, where the parliament agreed to start the association negotiations with the European Union, which for Russia raised the fear of losing Ukraine in a way from its
sphere of influence. And then Russia started using everything in its subversive arsenal to stop
that. So they ran a bunch of cyber operations. Ukraine has been called Russia's test lab for cyber
warfare in a bunch of commentary, right? And the idea also was that they're testing it to use against the West, against the United States, later.
But they also used really traditional means, just bribing officials.
They didn't even need to because there was a pro-Russian government in Ukraine until 2014.
And having these proxy groups, especially in Eastern Ukraine, that were also instrumental in the end in mounting that operation to take over Crimea.
But none of it worked.
Ukraine maintained that pro-Western foreign policy, despite all of these efforts.
And that's why Russia invaded, to put it very simply, right? They failed to achieve their goals with means short of war. So they escalated to
the next level, went to war. The alternative from their, I think, perspective was either to give up
or to escalate. And it shows the limits of what you can achieve in this space.
And in Ukraine specifically, you had one of the most classic subversion techniques of all time, which was that the president was corrupt and basically in the pocket of the Kremlin, right?
And even when you have at that level,
somebody at that level of power that you've turned,
it's not necessarily a guarantee
that the country is going to be yours, right?
Yeah, I mean, it's fascinating in many ways what happened there.
The episode you just mentioned, right?
Yanukovych was president there in 2013, 2014, and he was then basically dethroned.
His government was overthrown by the Euromaidan protests.
But those protests were triggered by a real success of Russian kind of shadow diplomacy.
There was this secret meeting between Yanukovych and Putin at some undisclosed military airport in Russia, where Putin somehow strong-armed Yanukovych into stopping these negotiations with the European Union that I mentioned, right?
And he did.
So it was a great success for Russia.
But what they didn't expect was that Ukrainians went to the streets to protest against it, because Russian propaganda, which I'm quite convinced Putin mostly believes in, had said that, you know, Ukrainians would be just euphoric to join kind of Mother Russia again.
It didn't really play out that way, right?
So it's fascinating how you have this not only in that case, also in the Cold War,
Russia kind of sometimes achieves these amazing feats in a way, but then they shoot themselves in the foot because they believe in their own disinformation and propaganda campaigns, which then cause everything to derail. I mean, we saw a similar phenomenon in the invasion ultimately,
right? Same happened. They believed that Ukrainians would welcome them with open arms. A lot of
the soldiers did at least. That's not really what happened. It's funny. It speaks to one of the
hard limits of propaganda, right? It's like you can sell this thing and you can say so many people
have watched it. So many people have read it. We're getting this much engagement on it. But that doesn't
necessarily speak to how people are engaging with the propaganda, what they think of it when they
see it. Just because 500 million people have seen a thing doesn't mean 500 million people believe
the thing. It might, in fact, make all of them very mad. Right. Yeah. Exactly. You can't really
control the effects or, you know, what happens after, really, after you set a message out. Who
reads it, what they do with it.
And I mean, overall, that's a real limitation of these kinds of operations, because it depends on something that your target, your enemy, your victim does, right? A system on the one hand, and then what the effect of that system on another kind of target is. You have these two things that have to behave in the way that you want, and still do something they're not supposed to, but do exactly what you want them to do, which rarely plays out. It's very complex and in most cases impossible to control.
So since we're talking about some of the problems with all this stuff, there's an important
concept in this book called the trilemma. Yeah. What is subversion's trilemma?
That goes back to, I mean, the starting point is similar to what I was just saying: you have these limits on control. But the crux of the whole problem, as I was reading a lot of the Cold War literature, is that in theory, right,
subversion is really a perfect weapon. Because if you succeed, you have a way to manipulate,
weaken your enemy from within secretly, without your enemy even realizing it, if you do it well, at least until it's too late, you know, until you produce the effect you want.
And maybe if you're successful, producing similar outcomes to going to war with less cost and less
risk. So it's cheap, effective, and still not very risky. And you can imagine a lot of effects
in theory, right? You mentioned disinformation campaigns. We have now these big fears around also
fake news and deep fakes. Technology makes it possible to reach millions, hundreds of millions of
people. But that doesn't mean that in practice, you know, you're going to convince them. Similarly,
the internet is global in reach, so anyone with a laptop can, you know, hit anything in the world, but that doesn't mean that everyone with a laptop can mount these state-level critical infrastructure cyber strikes.
And very much the same is true with the traditional subversive operations.
They have often fallen short.
So throughout the Cold War, most of the operations failed.
There are few people that have looked at it systematically, but one of them is Lindsey O'Rourke, who looked through all U.S.-sponsored regime change operations in the Cold War and found that the majority of them failed.
And so I got interested in what are the reasons for this failure, right?
And that's something that's missing in a lot of the literature, because the theory of the mechanism involved is basically missing. It's kind of scattered throughout, but it's not there.
So as I was thinking about that mechanism, I was thinking about the constraints,
the challenges involved, right, and the steps there.
So as I said, you exploit a system.
That means you have to find a vulnerability.
That means you have to learn how some system that others have built,
whether it's an institution or, let's say, political organization, or a media conglomerate, or, you know, some industrial facility you want to sabotage.
Whatever it is, you have to learn how that system works and you have to learn it so well that you find some flaw that even the designers and the people who, you know, operate in that system daily, they have missed and they haven't expected.
So ultimately it's a learning process that takes time, and that really slows down your operational speed. So the first constraint is on speed. The second one is on the intensity of effects you can achieve, for similar reasons, because it's about manipulating a complex system others have designed while you have to stay secret.
So that means you have to operate carefully and you have to stay secret because if you get discovered before you produce whatever effect you want to produce,
it's relatively easy for the victim to neutralize your operation.
I mean, just think about a spy who gets discovered, they get arrested or killed.
So this kind of operating carefully, trying to manipulate something without alerting your victim to your presence, means the intensity of effects you can achieve is limited as well.
And finally, and I've said something about this already, I think the most fundamental problem here really is the limits of control. You make a system you haven't designed do something it's not supposed to, but you still want to make sure it does what you expect and intend, something
that serves your purposes, and you can't really test it either in most cases, for the same reason
that, you know, once you alert your victim to your presence, you usually fail, you get discovered
and the operation is over. And it's not that just you have these constraints individually,
that, you know, the speed and the intensity and the control you have in these operations are limited, but they all interact in a way that means that, at most, you can have two out of the three.
That's why it's a trilemma, because the more you kind of maximize
one of these variables, the more you tend to lose out on the others.
And it's quite intuitive once you think about it.
Because say you proceed very quickly. That means you're not as careful. You can't be, right? You also don't learn the system as well as you could if you had spent more time.
So the chance that something goes wrong, that you miss something, that you get discovered, increases, while the intensity of the effects you can achieve, the damage you can cause, tends to decrease.
Similarly, if you want to really cause a lot of damage,
there's a much higher risk that you create some collateral damage,
some unintended consequences.
And you will also tend to take much more time.
So these kind of interactions,
these different configurations,
mean that in practice,
contrary to the fears around subversion and also cyber operations,
it tends to be either too slow or too weak
or too volatile, too unpredictable, to be a useful tool that creates strategic value for states when and where it's really needed.
Do you think that state power got super interested in subversion as kind of an operating first principle after World War II? Or is this something older? Like, we're talking about it, and I really think of it in Cold War terms.
I think about, you know, John le Carré, I think about spy novels. I think about all of these stories of attempted subversion, of attempted coups, both by America and the Soviets.
Do you think that this was a method of power projection that was used as much before World War II?
It's been around forever, basically.
Someone, a reviewer actually of a paper that I published a while ago, pointed me to a book on the strategy of the Byzantine Empire and how subversion was key to its success, because they were surrounded by much more powerful
empires that they could not win against militarily and they were also spread around geographically
around the Mediterranean Sea. So what they did was rely on subversion where they were bribing other
empires, you know, using diplomacy, also trying to kind of turn people. There is a story of how they turned the commander of the Persian Empire to their cause, who then, you know, stopped attacking them.
It's obviously not confirmed.
We don't have any records, unfortunately, for the time.
But it's definitely plausible.
So it's been around forever.
And basically, for the same reason that I was saying,
it's a really attractive alternative to war.
If war is too costly, and for the Byzantine Empire it was at that time because they didn't think they could win, subversion becomes quite attractive.
And the more costly war looks, the more attractive it becomes, even though it might often fail.
And in the Cold War, after 1945 and the advent of nuclear weapons, war obviously became incredibly costly between the two superpowers.
And there is a quote from John F. Kennedy, I think it's also in the book somewhere,
about basically under that shadow of nuclear deterrence, subversion, covert warfare, and so on,
those became the main tools by which the Cold War was fought,
because they were, you know, doing everything they could to avoid direct confrontation with each other.
And if that's the case, then what do you do, right?
This indirect, shadowy activity, where you can still weaken and hurt your adversary, becomes very attractive.
And you have the fears associated with it, too.
I mean, now there's a panic.
Ten years ago it was a panic about hybrid warfare. Now it's deepfakes, and cyber war has been in there, too.
And in the Cold War, there were very much the same fears. There were people who predicted that the Soviets were going to win because they were just so good at all this active measures and subversion stuff.
I found a paper from the active measures committee, a report in the U.S. Congress in the 80s, that basically, and that's like three years before the end of the Cold War, before the Soviet empire collapsed, predicts that they're going to win because Western societies are open.
The media system is open.
so there is just open season for any kind of disinformation campaign, and there is nothing we can do against it because the Soviets have perfected it over decades, right?
We know it didn't play out that way, fortunately.
And I argue, obviously, that's because of the inherent kind of limitations of this instrument
that haven't changed, even though the technology is different now.
Why do you think we're so bad at seeing the limitations?
Well, on the one hand, because there's so little data and information available, right?
And it's activity where the sponsors, they hide their identity usually.
In most cases, they also hide the activity itself.
And the victims usually have an interest in, you know, not revealing any kind of embarrassing interference
that, you know, basically caught them with their pants down, where they weren't watching well enough either.
So it's rare that information comes up.
And even within administration, within governments, intelligence services, you know, they're going to be very reluctant to say, hey, there was this sabotage operation.
We completely missed it.
Or, you know, there was this subversive operation by our adversaries.
Or, on the other end, saying, hey, this stuff doesn't really work, you should cut our budget because, you know, it's just not that effective. So on the one end, there's exaggerating the threat or the potential that it has, whether you need the resources for defensive or offensive purposes.
On the other hand, it's the lack of really any data
that gives you metrics
to assess the success and failure.
On top of it, there aren't any clearly established measures of success and failure either.
That's, I think, a real problem, too.
So most of the literature on the Cold War, as I said too, is either historical or almost veering into this world of spy fiction, you know, where it's all very exciting kind of James Bond stuff, you know, happening behind the scenes and everyone pulling the strings carefully. And in reality, yeah, there are a lot of failures that make it look not as exciting as that, but it's not in the interest of either side, really, to play up those failures.
Well, tell me about the Prague Spring.
So it's like one of the major case studies from the Cold War that's in the book.
Yeah, it shows that, because it's in many ways really similar to what happened in Ukraine. Basically, it's a much weaker state than the Soviet Union that was in its sphere of influence. At that time, it was part of the Soviet bloc too.
But there was a change in policy.
In 1968, a new kind of young leader came to power there within the Communist Party system. He became the leader of the Communist Party in Czechoslovakia, Alexander Dubček,
and he attempted to bring this country, Czechoslovakia, onto a kind of liberalizing course. So it was still supposed to be communist, but the name they gave their policy was socialism with a human face. Doesn't sound bad, but for the Kremlin, that sounded awful.
And they were afraid of basically losing Czechoslovakia to the West, because it was about human rights, and human rights, from the Soviet perspective, were just a kind of mask for Western interference.
So what the Soviets did was everything they could, short of going to war, to stop these liberalizing reforms, bring Czechoslovakia back on course, and also really undermine domestic support for this policy.
Those were the same goals as in Ukraine.
And they also, and that makes it really interesting,
used a kind of new tool in their arsenal to achieve these goals.
So in Ukraine, the new tool was cyber operations.
In Czechoslovakia, it was the KGB's illegal agents, which they had had for a while. These are the best-trained undercover agents, with really carefully constructed cover identities that were built up over years, who would be able to speak, you know, accent-free English,
for example.
But they were trained for deployment in Western countries.
So they had never used these agents within the Soviet Union behind the Iron Curtain.
So this was the first time they used these agents, on the one hand, to infiltrate government and civil society in Czechoslovakia to just gather information, but then also to mount this active measures campaign, called Kodoki, that was about discrediting civil society there and the whole liberal reform movement. And they really ran the full gamut of subversive campaigns, or goals you can achieve.
First, they tried kidnapping leaders of civil society, which didn't work out
because the agents who were supposed to do that didn't speak the language, a bit of a problem.
So they couldn't convince them to jump in the car with them.
The second one, a really cynical, kind of crazy, almost depraved plot, was to kill the Russian spouses of Czechoslovaks and then blame it on the reform movement. And they didn't do this. Someone, I think, leaked this information, or this plan, from within.
And finally, the one that was most consequential was about covertly building a coalition within the government in Czechoslovakia of what they called healthy forces, who were pro-Soviet, and trying to mount a coup from within and take over control in Czechoslovakia this way.
But they didn't trust subversion enough to do this independently, so they also used military force.
So this giant invasion that happened is what the Prague Spring has become famous for. What they really did was invade and then try to take over control, mount a coup from within, take control at the same time that the invasion happened, because they didn't trust that this coalition they had would do it alone. And it didn't work out. So ultimately what they had was this military force there. All their subversive efforts had really fallen short, similar to Ukraine. There was too much domestic support, from the Soviet perspective, for the liberal reform movement. Their coup attempt had
failed. Their attempts to discredit the reform movement had failed too. So among other things,
they also planted these false weapons caches, just basically digging a hole, putting weapons there that said Made in USA, then claiming that this was CIA support and publishing it in the main communist newspaper Pravda the next day. But what they had somehow not considered
was that it would be suspicious that those Made in USA weapons were placed in Soviet-made bags, which journalists in Czechoslovakia then pointed out. And not only ordinary people: the interior minister of Czechoslovakia went on the record, published an op-ed in the newspaper there saying, hey, this is an operation by the Soviets, there's nothing true about this, there is no counter-revolution. So Czechoslovaks didn't believe in it. The coup
failed. And then Russia had this occupying army. And I discovered all these letters from Soviet soldiers
in the former KGB archive in Ukraine
that document the situation they faced
because, similarly to now in Ukraine, they were told that they would be welcomed with open arms as liberators. And instead, they said, people were throwing shit at them in the street, not giving them any food,
and they're just hidden somewhere in a foggy and muddy forest
waiting for something to happen,
regularly thinking of putting a gun to their head.
That's how one of the soldiers put it.
So very much almost complete failure here as well, apart from the invasion, which worked, obviously, because they had overwhelmingly more power.
All right, angry planet listeners.
We're going to pause there for a break.
We'll be right back after this.
Welcome back, Angry Planet listeners.
Can you give me an example of a subversion campaign that worked?
Yeah, this is a good question, I think.
I've thought about it a bit before.
It was one thing I actually spent some time on today to prepare, thinking about what it looks like if it worked, right?
So, because mostly, as I said, it's really a history of failure.
And the problem also is that it's hard to define what a success would look like, because you don't really have that established kind of framework.
And also, success at the highest level would mean that you would only feel the shadows of
the subversion campaign.
Yeah, exactly.
A successful operation means you just think like, oh, this is not good.
But, you know, it looks like a good example is what just happened in Germany.
I don't know if this came up in US news prominently.
There was a fire at a weapons manufacturing plant in Berlin, a huge fire that led to the whole thing collapsing.
And it's one of the manufacturers that produces this air defense system that Ukraine needs urgently.
And it was supposed to be just an accident that happened in February, but it looked already suspicious.
And now, yesterday, the Wall Street Journal published a story saying that almost certainly this was the work of Russian saboteurs who just made it look like an accident, which is basically a hallmark of a successful operation, right? It looked like an accident to the victim.
Yeah, but in any case, I think the biggest problem really is that if you look at individual operations, it's not clear what the objectives are. You can really only guess. It's reasonably clear normally who is behind it, but it's almost never clear what they were trying to achieve, because no one ever states this in public, right? No intelligence agency says, hey,
we ran this operation, we wanted to achieve this outcome. So that means you have almost
endless speculation about possible outcomes that can all be equally kind of plausible. That's why I decided in the book to really look one level higher,
not the operational level, but the strategic level, where it is relatively clear.
We know what the goals of Russia, for example, are in Ukraine.
They even state those publicly, right?
They have their war goals, for example.
So then, in that context, success, or an operation that worked, would mean that it measurably contributes towards these strategic goals in some way, or at least plausibly makes a contribution. That's the kind of threshold I put in the book for an operation or campaign that works. But the goals of subversive operations and campaigns really differ in scope. I mentioned the most popular one, the overthrow of a regime. So that's
very clear, right? If it works, the regime is gone. And a good example is, I think, the CIA
operation against Mossadegh in Iran in 1953 that brought the Shah to power. That worked. Mossadegh was toppled, the Shah came to power. So the operation itself achieved its goal.
But then, if you zoom further out, was it a strategic success, since it directly built, you know, the fertile ground for the Islamic Revolution, because they helped an autocratic regime to power that was very unpopular there?
And now we, the West, still deal with this Iranian regime that is not really operating in line with U.S. interests, right?
And you wonder, would the same have happened without the operation?
Yeah, it becomes really impossible to know.
But still, if you look at an operation or a campaign itself,
I think it's a clear example of success.
The second one is manipulation of policy from within,
which uses really similar means. It's about manipulating and changing government policy from within, but with less scope, right? You just want to change what a government does, not necessarily bring down the government itself. And I think a good example here is in Germany, where a scandal was uncovered a few months ago: a foundation that was directly involved in facilitating the approval of the Nord Stream 2 pipeline had received millions of euros in donations from Gazprom, the Russian state-owned gas giant.
And the politician involved there was also directly pushing, basically, for the approval of this pipeline. And that looks like a, you know, very clear example of a plausible success at manipulating policy from within.
And not only this one, there are more examples in Germany, especially of Russian kind of interference at the highest levels.
Also in the Interior Ministry, there were spies uncovered who were directly involved in policy related to the energy transition.
So also here, basically, whether policy changes in the interest of the sponsor of the operation or doesn't, that is a relatively clear measure of success.
But on the other end, with erosion strategies, the kind of third type of subversive strategy, it becomes much harder, because erosion is really not about some specific outcome, but about long-term weakening of your enemy from within: undermining trust and cohesion in society, sabotaging infrastructure, just making things not work as well as they used to,
but without revealing the influence.
But then the question becomes, right, how do you, and the challenge as a researcher is, how do you measure this change if it's a really diffuse thing that happens over a long, long period?
So I have this long-running debate with the authors of the cyber persistence theory book, I don't know if you've come across it, which informs the current US cyber strategy called persistent engagement, right, which is very much based on the assumption that cyber operations are this very effective means of projecting power short of war.
And that's very much aligned with this erosion strategy,
because the idea is that it's kind of a strategy of a thousand cuts,
where each individual operation doesn't really change anything,
but over the long term, it still kind of slowly bleeds you out as a victim.
But then the challenge I have posed to them is if that's true, right,
why don't we see that in Ukraine?
Because it's been 10 years.
So how much time would you need?
And one of them suggested that, well, the time in Ukraine is not enough.
You need more time for it to work.
But if you think about, say, you know, Ukraine, you have 10 years.
So if you think about 20 years of this covert influence, how much changes in a country over 20 years, right?
And globally, the economic situation changes.
There are different governments in place.
Geopolitical competition changes too.
So there are so many other important and relevant factors that it becomes basically impossible to isolate the influence of a subversive operation or cyber operation, or even campaigns that run over multiple
years. So you could imagine in Ukraine also that maybe Ukraine is worse off now than it would
have been without the Russian interference. It's plausible, but it's almost impossible to prove.
And I think that partially also explains why there is this lingering fear about subversive influence, disinformation campaigns, cyber operations, cyber warfare too, because we can plausibly imagine that this stuff works and has an effect.
And there's kind of anecdotal evidence that supports that fear, right?
I mean, the conspiracy theory about the moon landing, we know that a KGB operation kind of, you know, fed and amplified it.
And a lot of people still believe in that.
So were the conspiracy theories around the Kennedy assassination, and maybe I'll get some listeners mad at me here, but those were a KGB disinformation campaign too, or at least were aided by one in their early stages. Because there was a Marine who had gone to the Soviet Union and come back, and they said, oh, this is not good for us, we should probably throw some curveballs here just in case it comes back on us.
Yeah, exactly. There are a bunch of examples of, you know, success, at least in influencing some people. But it's very hard to pin down and isolate, right? I mean, because subversion always feeds on some
existing vulnerabilities, some existing tension. It just is about amplifying it. And there are enough
people who distrust the government. So in Ukraine, for example, the trust in government has been
extremely low over the last 10 years too. So then Russian operations are trying to amplify this,
make it worse, but you would expect that if that's the case, you would see at least a kind of
negative trajectory of trust in the government. But that's not the case. I was digging out the survey data from the Kyiv International Institute of Sociology, and that shows the opposite.
Even though trust is really catastrophically low, it's like, I think 8% in 2015. I have to look
at the numbers directly, but something around that. And then after the end of, you know,
after basically 10 years of Russian campaigns, eight years, it's at 12 or 13 percent.
So actually the trust has slightly increased.
It's not evident that at the national level, right, you have an impact of these campaigns,
even though at the individual level, I ran a study with two other scholars, three other scholars.
We were looking at the impact of disinformation campaigns on people.
We ran a survey in Ukraine.
It's clear that at the individual level, there are a bunch of people who believe the most outlandish stuff, that the U.S. has built biolabs in Ukraine, for example, and is spreading some viruses. So clearly some people believe in
it, but then linking that to some change in politics at the national or maybe international level
is much harder. It's a much higher kind of threshold to pass. And I haven't actually seen any evidence that proves that disinformation campaigns, for example, can pass that threshold. Similar with cyber campaigns over the long term.
Well, it's funny, because there was this fear that I really think in America reached a fever pitch around the 2016 election, that cyber campaigns, social-media-borne disinformation campaigns, were going to change the very fabric of truth, and we were all screwed, right? And now we're worried about deepfakes and AI-driven eras of false truth.
But it seems to be what you're saying from the data that we've got,
which is like not super great,
that people,
maybe it's not that people are hardened to this stuff,
but it's that it doesn't always translate into the wider political change
that you would think it does.
Yeah, and it's also that the fear is not new, right?
You see it throughout the Cold War. There is the McCarthy kind of almost witch hunt of communists in the U.S.
It's based on almost the same type of expectation and panic, right,
that there is some kind of shadowy influence that is changing everything that you can't trust anyone.
There's something in the body politic that's rotten.
Yeah, exactly.
And I think that's dangerous. On the one hand, it's interesting for me as a researcher, because there is an element of something I've been thinking about recently, of really kind of unintended success in subversion, where you do something, you kind of screw it up completely, but your victim, you know, still plays it up in a way that, you know, it looks like you are kind of a godlike actor. And I think the 2016 election is a good example. Not that the Russians screwed it up completely, but what they actually achieved was extremely modest, right? So they succeeded in stealing some data from the Democratic National Committee. They also succeeded in creating a website that looked to a lot of people, at least for a short time, like some kind of legitimate whistleblowing outlet, right? And then they managed to get
this content, also these emails, which to some extent were embarrassing because, yeah, they were
internal kind of emails that were not supposed to come out and get the interest of the media.
But if you think about political impact, that only happened because the media got interested in it, and not only kept publishing the content of these leaks that were trickling out, which I think was intentional, to keep a kind of slow and steady drip of damaging information, but also linked it with, as you said, that narrative of, you know, Russia is incredibly good at this.
They're everywhere. They're manipulating everyone. There was this New York Times article about the
perfect weapon actually that made me think about the linkage to subversion as the perfect weapon because
it's the same idea. I think the subtitle was how Russian cyberpower invaded the US. And that's the
idea. There is this shadowy power that, you know, is responsible for almost all ills. And I think
that's a real danger then because it distracts from the actual domestic causes of the tensions
and lack of trust, lack of cohesion, conflict in societies that make it possible to run
these campaigns and maybe amplify them.
But then, if the evil is, you know, Russian propaganda, Russian disinformation, Russian interference, you'll never really get to the bottom of it. And I think that really plays into the hands of your adversary there, if it's combined with, you know, playing up what they can do.
So, I mean, a good example on the cyber side for me
of this kind of dynamic is
the discovery of these Russian compromises in US infrastructure in 2019, where a lot of the commentary you often read on those stories is that it's the next step in the continuing escalation of cyber war.
Now they're everywhere in the infrastructure, they have these implants that in case of whatever scenario, some escalation or, you know, some further tensions, they can just shut down the power in the U.S.
That's the long-running kind of fear scenario, right?
The cyber Pearl Harbor kind of scenario.
But if you look at what actually happened,
just the evidence that is out there
and the consequences, is that the U.S., its cybersecurity firms and intelligence services, succeeded in discovering these intrusions and neutralizing them.
So this was a failure from the Russian side
because they lost those implants.
Sure, they will come back and try to come back in,
but at face value, this is a failure.
They were not careful enough.
They lost their access before they could produce
any effect. But if you frame it as, you know, this shows the Russian capabilities, this is how far they have come, that's exactly what kind of plays into Russia's hands there and
what feeds these unrealistic expectations that we all, I think, read in the wake of the invasion
of Ukraine. Some people even arguing that Russia doesn't need to attack militarily because they can
just do everything with cyber operations, right? The kind of shock and awe scenario there.
Cyber operations are fairly limited, though.
It's not that they aren't important,
but I think we are learning that they're not the be-all and end-all, right?
Yeah, and I think that's an important lesson.
But I also hope we don't, as a society or societies, go too far and then just completely dismiss the threat.
Because it's clearly not true that cyber operations
are the strategic kind of strike instruments that are similar to airstrikes in a way.
I mean, that's an analogy that informs also the cyber Pearl Harbor idea quite literally, right?
Because Pearl Harbor was a surprise aerial attack.
But they can still cause significant disruption, and we see it throughout.
Ransomware is a good example now, too, of operations campaigns that cause real damage.
But not only that, I mean, there are examples of cyber operations that caused significant impact on the organizations affected, on, you know, the individuals that were affected, like the power outages in Ukraine that Russia successfully caused too.
They weren't really strategically very useful for Russia, but they still had an impact on
people.
So I think what's important is to be realistic about the kinds of effects that cyber operations
can achieve and also distinguish between different types and think about it in a kind of
strategic context, right. I think the problem is that it's seen as a new technological tool and often kind of magic. And I think that reproduces a pattern you see if you look back in history: whenever there is a new technology that's useful in conflict, the first kind of reaction is, well, this changes everything.
So when aerial warfare started, when you had warplanes, the expectation, the prediction, was that ground warfare was going to become obsolete, because any fortifications you have can be bypassed instantly and, you know, you can just drop bombs, concentrate mass there. But it turns out that's not enough to control territory, so ultimately ground warfare is still relevant, and sadly, terribly, we see the same is the case in Ukraine now. So there has been this
expectation there for, yeah, a long time. And I think the cyber war debates just really renewed it.
And then cyber is seen as this kind of one revolutionary tool that changes everything about
conflict. But there are many different types of operations, right? It's a very different thing if you do
espionage with a cyber operation compared to actively disrupting critical infrastructure to
use the most extreme example. And given these limitations, right, this trilemma, as I call it. But even if you don't buy into the trilemma itself, it's clear that cyber operations
have these limitations. It's not straightforward. It's not simple to cause significant damage,
especially physical damage, against an adversary. So that means you need to think strategically
about your adversary, right? What do they want to achieve? What do you think they want to achieve?
What are their goals? And what are the most likely means they're going to use to achieve those goals?
And if it's about physical destruction, cyber operations are just not very good at it. So
you should probably expect some military attack, right?
You just expect missiles to hit you, and that's what happened in Ukraine.
Critical infrastructure is being attacked daily, but with missiles, not with cyber operations.
Sure, sometimes there are also cyber operations, but very rarely, and they have had far less of an impact.
And it also doesn't make sense for Russia in that situation, once you're in an open conflict, right? Why would you want to invest all the time and resources to find some flaw in your adversary's system and then, you know, carefully manipulate that to cause a disruption, when you can just hit it with a missile?
But on the other hand, in kind of long-term competition, especially between the U.S. and China, in the current scenario,
cyber operations do have, you know, a lot of use, especially in these kinds of long-term erosion campaigns, if they are done well.
And in that way, I do agree with the authors of cyber persistence theory. It's much more useful to think about those kinds of threats compared to the military cyber strike scenarios, but still with the caveat that we should also be realistic that this is likely not the thing that's going to bring us down as societies, especially not the cyber operations themselves.
Because they do have these limitations, and the main one is really that to achieve any effect in the physical world, they always have to go through a computer system that is embedded somehow in the physical world and can produce that effect, right? If you want to create an explosion, you need to find a computer system that is connected to some equipment that can actually explode. And in contrast to that, traditional subversion has an advantage in that regard, in causing effects in the physical world, for the really basic fact that it uses humans, human agents, who have a physical body, right?
And they can, if need be, open doors, break down doors, threaten people, hurt people too, use violence, at least within their kind of limited capacity, without, you know, being discovered.
And in that way, when it comes to producing more intense effects, traditional subversion has
a real advantage.
And I think that's a key point to consider when we think about relative threats. Especially as geopolitical competition heats up, cyber operations are relatively good at this large-scale, low-level disruption. But they're relatively bad at, you know, kind of targeted surgical strikes that cause damage.
Traditional tools are much better for that.
And we do have examples like the facility I mentioned in Germany, right,
that went up in flames.
But also Iran's nuclear program, that Natanz facility that was hit first by the Stuxnet virus in 2010, which at the time was seen as the harbinger of the future of cyber war, of exactly this kind of surgical strike. Someone called it a cyber cruise missile, but it took five years to prepare, had relatively limited effect, caused damage there, but still probably only caused around three months or so of delay to Iran's nuclear program.
And then the follow-up operation by Israel against the same facility caused a massive explosion there that really just destroyed a lot of these centrifuges, and also broke the leg of someone who wanted to look at what happened and then fell into the crater, a somewhat strange story. At first, some commentators came out saying this was another cyber strike, but it turns out it was done completely by traditional means, with infiltrators, basically subversion, right, undercover agents who worked at the facility under false identities and then managed to smuggle in equipment that had plastic explosive, I think, hidden within it.
That's the story that came out in the Jerusalem Post a few months after and had so many
intricate details.
And it is quite likely that it was a leak by Mossad, claiming responsibility for that operation kind of anonymously and indirectly, which tells us something, right?
They used the cyber operation against that target, achieved relatively limited effects,
wanted to cause more damage, and then fell back on traditional means there. So, yeah, long answer short, it's about seeing it in the larger strategic context as a tool that states use for specific ends, and then considering what the ends are, what the tools most likely used are, and what we can do against them.
That example is really funny because I think most people remember Stuxnet and they don't remember
the bombing. Yeah, I know. It just kind of falls by the wayside. I want to write something about
this at some stage too, because I think it needs to be highlighted.
It's become the defining case in a way of cyber warfare.
And on the one hand, it shows the possibilities, but it also very clearly shows the
limitations. And then that fallback, it's fascinating to me.
Yeah, when you have to go back in afterwards with a traditional explosive, you know, maybe you shouldn't be so worried about a cyber Pearl Harbor.
You should be worried about what comes after it when it doesn't work out.
Yeah, and I do worry about it.
Yeah, no, we absolutely should.
Because as long as people are typing away at their computers,
they're not sending in ground troops.
Yeah, exactly.
It's a line I sometimes use, right?
I'd much rather be the victim of a cyber attack
compared to conventional military attack any day.
Yeah, you know, we're seeing that play out
in literally in Ukraine right now, right?
Yeah.
Where, I mean, they did,
Russia did invade in 2014, right?
Like the eastern part of the country was taken over,
but there had been 10 years of a subversion campaign,
backed up by a little bit of force on the eastern side,
not a little bit, a lot of force on the eastern side,
and then a full-scale invasion when that doesn't work.
Yeah.
Right?
So it does seem like you can at least always mark it as a prelude to violence.
If you see a subversion campaign fail wildly,
something worse may be coming.
There may be tanks in the streets of Prague.
Yeah.
And a question that I haven't found an answer for yet
is whether in that way it does make the use of violence more likely or not.
And that's hard.
It's a kind of chicken and egg problem, right?
because once you commit kind of to a cause as an actor, right, you have a goal, you want to achieve it,
and then you commit some resources to the subversive campaign.
You do have some sunk costs.
And then, as I said, like in Ukraine, there's the option it failed.
So what do you do?
Do you escalate further or, you know, do you kind of step back?
And perhaps having those sunk costs makes the difference.
But on the other hand, the choice in favor of subversion also shows the reluctance to escalate, at least initially.
But I haven't, yeah, I haven't looked at this more systematically.
I wonder, if you looked at it, right, across the number of cases where subversion was used, different types of subversion, also perhaps different goals, whether they make the use of violence more or less likely.
And if you use an erosion campaign and it works out, you avoid maybe World War III,
if you think about the China and US scenario.
You might not like the outcome, but from a, you know, very long lens, or short lens I should say, it might still be preferable to, yeah, nuclear war, which is unfortunately a realistic scenario in that case, right?
I'll ask about a recently unearthed subversion campaign here at the end of our conversation. What did you make of the US-led anti-vax disinformation campaign that they were deploying against China?
Yeah, I was quite incredulous when I read that. Because it just seems like such a bad idea on every level, really, right? You can't see how it makes sense to think, okay, we're going to hurt China by discrediting their vaccine, or vaccines in general, and, you know, making fewer people get vaccinated.
But how does that really serve your interest, especially targeting an ally, the Philippines, or an aspirational ally, in the middle of a pandemic?
And clearly there were people who had, you know,
doubts about this morally probably.
And I think that's the reason why we know about it now too.
It looks like someone leaked it.
It's very clearly someone who leaked it from within
who wanted this to come out to, you know,
make sure this kind of stuff doesn't really happen again,
which is a risk with these things, especially when they get into morally questionable terrain, I think.
This is, in comparison, a very harmless case, but it's the same outcome that you saw with that thing I mentioned much earlier in Czechoslovakia, with the assassination plot, right? That was also someone who, at least from what we know, came across this within the service and leaked it to make sure it didn't happen, because they found it reprehensible.
So, yeah, I mean, I think it's clear to say this doesn't look like a very fruitful approach, and also, yeah, disinformation campaigns especially, I think, don't really suit democracies.
I think the risk is, and that goes back to the fear of,
you know, I think it's directly linked actually,
to the fear that this is this kind of all-powerful tool,
and we have to use it too.
But the opposite is the case, if you look at the Cold War,
and I think we talked about this at the very start of the conversation, because it's really kind of a two-edged sword. You use this tool, you use disinformation tools, kind of to sow confusion in your adversary's head and make them question everything around them.
But that same applies to yourself.
So there is this intelligence theorist, Michael Herman, who used the analogy that it's like putting a virus in your enemy's bloodstream. But the thing is, just like a real virus, it can spread to you.
You can't control the spread.
And it's evident throughout the Cold War that the KGB especially became less and less efficient and effective, because they not only believed in their own disinformation campaigns, but they also just saw conspiracy everywhere.
So there is this book, it's fantastic from a research perspective, The Sword and the Shield, by Christopher Andrew and Mitrokhin, I forgot his first name and I don't want to guess, who was in the KGB archive and smuggled this whole trove of operational documents out of it. And throughout that book, which traces the history of KGB operations, it becomes clear that they became more and more obsessed with hunting down traitors in their own ranks, supposed traitors, and that distracted from their core task, which was to weaken the United States.
And on top of that, real kind of, you know, large-scale policy failures where they believed in their own disinformation, like in public support for the Russians in Ukraine, as a contemporary example, or public support for the Soviets in Czechoslovakia.
So it backfires.
And over the long run, I think it weakens a regime from within, it causes a lot of dysfunction.
So rather than as a disadvantage, I think we should really see the openness and the chaos
of our Western societies and democracy and our media systems as an advantage, because
at least there is a potential for correction there.
You can have multiple narratives.
It's not that, you know, you have one organization that controls a narrative from the top down. And once that's dysfunctional, there is really no alternative anymore.
So rather than seeing it as a weakness and thinking that, you know, we have to now do this too to our enemies,
I think the better strategy is to have confidence, on the one hand, that, yeah, I mean, we have a good system. It's flawed in many ways, and, you know, not everything works the way we would want it to, but it's still much better than the alternative. And there's a lot of reason actually to be hopeful and to look, you know, optimistically to the future.
In the last big contest, this system prevailed.
And I think it's important to keep that in mind and to really focus on those strengths.
And rather than using disinformation and discrediting ourselves, I mean, it's obviously a huge stain on the US, as supposedly the leader of the free world and the rules-based order, to be running such a campaign, right? That doesn't fit together at all. So it just hurts you in the short run and in the long run too.
Sir, thank you so much for coming on to Angry Planet and walking us through this.
Where can people find the book?
It should be available everywhere now, finally.
So it's on Amazon, and it should be also in most bookstores, especially over where you are.
And give us the title one more time?
It's called Subversion: From Covert Operations to Cyber Conflict, and Oxford University Press is the publisher.
Excellent. Thank you so much.
Thanks. Thanks for taking the time.
That's all for this week, Angry Planet listeners. As always, Angry Planet is me, Matthew Gault, Jason Fields, and Kevin Knodell.
You like us, you really like us.
Go to Angry PlanetPod.com.
$9 a month gets you access to commercial-free versions of the mainline episodes,
bonus episodes, and the written one.
We will be back again soon with another conversation about conflict on an angry planet.
Stay safe until then.
