Irregular Warfare Podcast - The Rise of Digital Repression: How Technology is Reshaping Power, Politics, and Resistance
Episode Date: July 28, 2023
Be sure to visit the Irregular Warfare Initiative website to see all of the new articles, podcast episodes, and other content the IWI team is producing!
What happens when authoritarianism expands into... online environments? A form of digital repression takes shape. But what does that actually look like? What are the specific ways that authoritarian regimes use new technologies to control their populations? And how are resistance groups adapting to overcome digital repression? This episode addresses those questions as hosts Matt Moellering and Adam Darnley-Stuart are joined by Steven Feldstein, author of the book The Rise of Digital Repression: How Technology is Reshaping Power, Politics, and Resistance, and John Tullius, who retired from the CIA in 2019 and now teaches classes on intelligence at the Naval Postgraduate School.
Intro music: "Unsilenced" by Ketsa
Outro music: "Launch" by Ketsa
CC BY-NC-ND 4.0
Transcript
It means having facial recognition, biometric surveillance in all the public squares.
It means using social media monitoring and so forth to bring together all these different streams of information
as a way to better track and understand who people actually are within that population,
how to control them, and frankly, how to incarcerate them.
I can simplify that one for you. In my mind, this is the chapter Orwell
forgot to write at the end of the book, some form of active measures gone real cyber mode.
Welcome to episode 84 of the Irregular Warfare podcast. I'm your host, Matt Moellering,
and today I'll be joined by my co-host, Julia McLennan. Today's episode is our fourth installment of the IWI Project on Cyber, where we look at
Steve Feldstein's book, The Rise of Digital Repression: How Technology is Reshaping Power, Politics, and Resistance. Our guests delve into what digital repression is and how authoritarian
regimes are using new technologies to control their populations.
They then look at how resistance groups engage with oppressive regimes and combat these challenges.
After that, they discuss how digital repression affects democracy and how information warfare intersects with digital repression.
Finally, they discuss how the irregular warfare community can help resistance movements combat digital repression.
Steven Feldstein is a senior fellow at the Carnegie Endowment for International Peace
in the Democracy, Conflict, and Governance program. Before that, he served in multiple
policy positions throughout the U.S. government, including serving as the Deputy Assistant
Secretary in the Bureau of Democracy, Human Rights, and Labor at the U.S. Department of State under President Obama. His research focuses on technology and politics,
U.S. foreign policy, and international relations. John Tullius retired from the CIA in 2019,
where he held a variety of positions, including managing China S&T analysis,
working overseas as the Iranian nuclear expert,
managing a group of big data analysts,
and then operating OSE's bureaus in Europe and the Middle East during the Arab Spring, the emergence of foreign fighters, and ISIS.
Currently, John teaches classes on intelligence at the Naval Postgraduate School.
You are listening to a special series of the Irregular Warfare podcast, a joint production
of the Princeton Empirical Studies of Conflict Project and the Modern War Institute at West
Point, dedicated to bridging the gap between scholars and practitioners to support the
community of irregular warfare professionals. Here's our conversation with Mr. Steven Feldstein
and John Tullius.
Steve, John, thank you for joining us on the Irregular Warfare podcast.
Thanks for having me. I'm really excited to engage in the conversation.
Thanks, Matt. Looking forward to talking with you guys on this really important issue.
So, Steve, can you define digital repression and why you decided to write your book?
So the idea behind digital repression is essentially the use of information and communications technologies by governments as a means to coerce, manipulate, and otherwise deter individuals and groups who would challenge the state.
And the context behind it was essentially that I think for a period of time, especially when people were just starting to use social media platforms and other types of digital technologies,
there was a real sense that liberation technology was going to allow citizens new ways to bring accountability to their governments,
to challenge the state and to remove authoritarians and dictators and bring about, you know, more pluralistic democracies.
And I think there was a lot of promise in that initially, especially in the early 2010s, but then the switch flipped.
And what I started to see was new ways that governments were using these same technologies to repress individuals, to deny and suppress citizens' ability to speak freely online
and to spread disinformation as a way to advance their political objectives.
And so the goal of the book was really
to capture this new framing, to understand what these dynamics look like, to better chart and
make sense of these trends, and then to think about different ways that citizens, individuals,
opposition, journalists, and others could fight back against this kind of growing trend
of digital repression. So that's the sort of stage setting.
John, you were in the CIA, where you got firsthand experience with some of these things.
And now you're studying the topic. Can you talk from your experience and your research
how you see digital repression changing irregular warfare?
Yeah, thanks. And Steve, I think your definition is great. And I share all of your concerns that
you've laid out. As a practitioner and someone
who's worked at the agency and now teaching our SOF operators, I take maybe just a little bit
broader look at it in terms of not just digital repression, but how authoritarians and adversaries
are using the digital domain to conduct hybrid warfare against us and or how we can help some
of our allies who are, say, periphery states to China
or Russia better protect themselves from the threats that we know are continually and actively
hitting them in the cyber domain. If you look at, for example, what the Russians have been doing to
us probably for the last decade or so, it's hard to argue that this is anything but some form of
active measures gone real cyber mode. And they've been extremely effective at a low cost, with a low resource commitment, for Putin and the Russians. And I think that, if you look at the NIC report on this, we need to be very proactive in the way that we're thinking about it, not just in a tactical, responsive way, but by incorporating these kinds of strategies into all of our campaigns and into all of what we're doing as an intel community, within DoD, and as a government writ large.
So in a nutshell, I just think that this whole digital thing has changed fundamentally, even how we think about irregular warfare.
So with that, how significant is digital repression, and how is it changing how states interact with their own populations and other populations?
Yeah, you know, I think one way to think about it is the following, which is that increasingly we're seeing societies rely on digital means to conduct all aspects of people's lives.
So, you know, individuals go online for shopping, communicate with loved ones via social media or other types of platforms.
Even at work these days, you know, whether it's conducting meetings by Zoom or using Slack as a
way to communicate with colleagues, so much of what we're doing throughout our lives is based
on sort of digital technology. So it shouldn't come as a surprise then that this also is a portal
and is a vector for different types of political control, whether we're talking about governments repressing their citizens or whether we're talking about kind of adversarial conduct between states.
The digital has become increasingly important.
And I think what's interesting is that when I wrote the book, at the moment, I felt we were sort of in a transitional period where you could still kind of separate out kind of more traditional means of oppression from the digital side. But I think increasingly those are blurred. I think as our
society itself has become blurred in terms of what is digital, what is not, and what is work,
and what isn't, and so forth, I think the same idea extends to the instruments of control and
influence that we're talking about. And to sort of say, well, this is traditional repression,
and this is digital, and the two maybe meet in a certain area. To me, that doesn't really actually
work anymore. I think it's all blended together. I think that's one thing that's worth unpacking.
I started grad school in the early nineties. And of course, that's when the wall had fallen and we'd
seen this great emergence of democracy, pretty much a big tide rolling across the globe. And I
think everyone sort of
had this expectation that if anything, that momentum would continue to see more democratic
regimes. And then along comes the internet and social media platforms. I think people initially
thought that these would be tools to help enable greater democratic discussions and governments and
all that kind of stuff. And I think the real unintended consequence that we saw
is now we've got this reversal of democracy globally and autocrats are taking over. And I
think you could pinpoint with every single one of these that the digital domain has played an
enormous part in their success. And these guys are talking a lot of the same language. They're
using a lot of the same tactics and techniques and tools.
This really constitutes a big threat to democracy globally, and I'm really worried about it.
Your responses actually transition us perfectly to the next set of questions that we want to
bring up with you. John, we'll start with you. So is digital repression a shift from previous
techniques used by authoritarian states, or do you see it as more of the same, or what's different about it?
Yeah, governments have always directed propaganda and surveillance at their populations, using information to control populations and so forth. But really, I think the digital age now has just accelerated and provided increased capabilities for governments to really
crack down on their communities, surveilling them more actively, having much better pulse on what
people are talking about and planning. So all of this has just been, in my mind, you know,
you go back to the Soviet Union and Stalin's active measures. This is no different, but it's
just really active measures on steroids because it's provided just such an enhancement to what
these states can do, both with their own populations and then externally directed at us or their other
adversaries. So yeah, this has just been, I believe, a massive shift in their ability to exert control.
You know, I think John makes an interesting point,
especially if you kind of look back to some of the kind of Cold War antecedents right here.
You know, one of the examples I use in the book is to look at the Stasi in East Germany
as a good case in point when it comes to surveillance.
So surveillance is like one of the big tools that we talk about when it comes to digital technologies and how there's
increasingly sophisticated methods to do that. But of course, surveillance wasn't something that
was invented in the 2010s. It's been a longstanding practice dating back decades, far more when it
comes to the state wanting to monitor and track what citizens are up to, what they're doing,
and to preempt potential challenges that might result. And so if you look at kind of the way traditionally surveillance might have taken place, you know,
with the Stasi, something like a million citizens or so were enlisted to physically spy on their
neighbors. And, you know, think about the cost associated with that in terms of the money and the labor that was needed, because people are watching close friends as opposed to actually doing work. I mean, the lost productivity from that is immense. So if you can free up that labor
and rely on digital tools, and now increasingly AI and machine learning, to take on some of that onus, you can actually make those same resources go further and free them up for other means as well. And so I think that's
kind of one of the ways in which you can sort of see there is very much a linkage, but there also is a transition as well and a new dimension to how these digital
tools are being used in this way. You mentioned AI and machine learning in there. In what ways have states utilized digital tools and platforms to suppress dissent and control information flow?
What are these main tools of digital repression? Well, I can offer a couple examples in the book, and I'm
sure John has some ideas on this as well. I mean, one of the classic kind of countries that people
look to is China. And in particular, there's been a lot of attention paid to Xinjiang in China and
the use of algorithmic tools as a means to subjugate a population that has periodically
challenged Beijing's rule. And I think the response from the Chinese authorities has been to implement a combination of grid-style policing, establishing police outposts every few blocks as a way to control the population,
and then using these advanced algorithmic tools as a way to further augment those efforts. So that
means everything from DNA sequencing and
taking mandatory blood samples from individuals, from Uyghurs, as a way to kind of map out who's
connected to whom. It means having facial recognition, biometric surveillance in all
the public squares. It means using social media monitoring and so forth to bring together all
these different streams of information as a way to better track and understand who people actually are
within that population, how to control them, and frankly, how to incarcerate them. So you
have a million plus Uyghurs who have been imprisoned, and that is precisely part of this
kind of AI-oriented surveillance apparatus that's been established there. I guess I would add,
you know, like stepping back 10 or 15 years ago, states had a variety of tools at their disposal,
including just shutting down denial
of service for periods of time if they were having strife or keyword search finders or
IP restrictions, any and all that kind of stuff.
But that's like so 2015 now.
I think China is kind of the model in a bad way of how things can go in a lot of these
other countries because they have developed really, really good
strategies now. And the new technologies coming online have enabled them to do this in a much
better, quicker way. For example, 10 years ago, everybody had CCTV at a lot of airports and street corners, and that's all great. But CCTV requires someone to either be directly monitoring the take or to roll the tape back to yesterday at four o'clock if you want to see what was happening.
But now they have algorithms where if I want to find Steve in China somewhere, I can load his face
up and my cameras will actively be looking for his face and give me alerts anywhere he might be in my
operating area. And I can have MSS or 2-PLA
or some kind of response on him within minutes. And this is just a game changer because you will
never have good luck evading that kind of system.
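To make the shift John is describing concrete, here is a minimal, hypothetical sketch of the watchlist-alerting pattern: a reference face is enrolled once as an embedding, and every incoming camera frame is then compared against it automatically, so no one has to sit and watch the take or roll the tape back. It uses the open-source face_recognition library purely as an illustration; the file paths, the frame directory, and the alerting step are invented for the example and do not describe any specific system discussed in the episode.

# Hypothetical sketch only: automated watchlist matching with the open-source
# face_recognition library. Paths, threshold, and the alert step are invented
# for illustration and do not describe any real deployment.
from pathlib import Path

import face_recognition

# Enroll the watchlist once: one reference photo becomes one 128-dimension embedding.
reference_image = face_recognition.load_image_file("watchlist/person_of_interest.jpg")
reference_encoding = face_recognition.face_encodings(reference_image)[0]

def frame_matches(frame_path: Path, tolerance: float = 0.6) -> bool:
    """Return True if any face in the frame is within `tolerance` of the reference."""
    frame = face_recognition.load_image_file(str(frame_path))
    for encoding in face_recognition.face_encodings(frame):
        if face_recognition.face_distance([reference_encoding], encoding)[0] <= tolerance:
            return True
    return False

# Instead of a person rolling the tape back to four o'clock yesterday, every
# incoming frame is checked automatically and only matches surface as alerts.
for frame_path in sorted(Path("camera_frames").glob("*.jpg")):
    if frame_matches(frame_path):
        print(f"ALERT: possible watchlist match in {frame_path.name}")

The point of the sketch is the cost structure rather than the specific library: human labor sits only at enrollment and at responding to alerts, which is exactly the change from manual CCTV review that John describes.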
The next question that I wanted to ask about was really the broader question of what each of you would share about the use of digital repression
by states and how
that impacts human rights and civil liberties within societies. I can simplify that one for you.
In my mind, this is the chapter Orwell forgot to write at the end of the book on state control of
people. And there's no better way to put it, because right now states have much greater capacity to track and control and monitor and harass and repress their populations than they ever
have.
And Steve alluded back to the Stasi and having to have a million-man surveillance network
of everyone informing on each other.
Well, I just need a group of people in an office with some really cool tools right now,
and I can pretty much do all of that work with a fraction of the resources, but in a much more effective way. And there's no other way to put it. I mean, this is just a massive game changer. I
think we're seeing the rollback in democracy and the related rollback in civil liberties and rights
in all these countries. Now, I fully concur with what John said. Essentially, my book is documenting what you've mentioned, which is this connection between
use of digital technologies and human rights violations that ensue in the different ways,
the myriad of ways in which these technologies are being wielded in order to undermine freedom
of expression, to undermine freedom of association when it comes to people's ability to gather, and to undermine media freedom and so forth. So every single one of these
tools has a direct connection to very specific human rights violations. In fact, I think that
this sort of issue is inseparable. With the emergence of digital repression, how has the
nature of resistance and opposition evolved? Well, I think we're seeing that playing out every day in Ukraine.
And I think what the Ukrainians have done with the digital domain to really effectively
counter pretty much everything Russia's been trying to do there, it's been a case study that
I think we're going to be learning from for a long time. What we're learning specifically in regards
to what the Ukrainians did,
not just after the Russians rolled over the border, but the years in advance to increase
their digital capabilities to counter that which they knew was going to be coming from
their experiences in Crimea and what they saw the Russians do in Georgia. And so they were very
proactive and anticipatory, and they were able to hit the Russians in a lot of neat ways that I think surprised Putin even, and he didn't see this coming.
The Ukrainians are not successfully holding off the Russians just because we gave them good weapons and they're seasoned fighters now.
But I think they really had a digital advantage, including the way they're using their drones and the way they're collecting information and all of that.
And so the other thing I would just add is that I know there are other countries on the periphery of, say, China or Russia, where we're actively teaching them resistance programming, with a heavy focus on the digital domain and how they're going to get hit
by these countries and then how they can counter them and at least mitigate some of these impacts. So to me, it's huge. I agree with that. It's really been, you
know, a really just absolutely fascinating and critical test case when it comes to how digital
technologies and the information warfare in general is affecting the battlefield. And it's
far more than just the kinetic activity that we're seeing. It extends well beyond that.
I think one of the ironies is that it was the Russians themselves who have sort of pioneered different methods when it comes to extending
warfare beyond just the battlefield into the information space. And they've sort of gotten
a bit of a taste of their own medicine because the Ukrainians have been extremely effective
in terms of putting forth their own narratives and then using that as a way to galvanize the world,
but particularly, you know, the United States and Europe to provide armaments necessary to withstand the Russian invasion.
I was able to make my way over to Kyiv a couple months ago for a brief week to participate in a strategic communications forum co-sponsored by the Ukrainian government and by the Carnegie Endowment.
And throughout the course of the two days of those deliberations, we heard from all sorts of different people, government officials, journalists, bloggers, you know, different civil society groups and others talking about different
aspects and dimensions of the information war and the ways in which Ukraine was using citizens to
fight back, to push back, journalists and videographers to showcase what was actually
happening on the battlefield, collecting digital forensics when it came to wartime abuses committed
by Russian troops and so forth. So there's many dimensions to it, but really, I think it'll take years to kind of learn and
understand exactly what this has meant, what the war has meant, and how it's changing things going
forward. Yeah, I just wanted to add one other anecdote, too. I remember early in the war,
you guys recall when the Soviets were in Afghanistan and got routed out. One of the reasons was that a lot of the mothers kept lobbying local politicians, and there started to be an upwelling of dissent from the mothers
of the dead being brought home. And at the beginning of the Ukraine thing, I remember
reading about how the Ukrainians were using facial recognition technology on dead Russian soldiers.
Then they were going onto the social media platforms to identify who they were
and who their families were. And they were the ones notifying Russian families about their dead
soldiers. When I read that early on, I was blown away at just that sophistication of what they were
doing there and the potential impact that it could have. Does the concept of resistance in the
digital age require a different mindset or skill set compared to traditional forms of resistance? Or is this just a slight evolution
of traditional irregular warfare and resistance? I mean, look, I think with digital, I think you
need to have a generation that is digitally fluent, right? You know, like people call them
digital natives. When you're dealing with more sophisticated technologies, it helps to have a
cohort who understands how it works and is able to come up with adaptations and responses as a
result. I mean, let me give you a quick example. When it comes to something like censorship or
China's Great Firewall, it's actually pretty leaky, somewhat by design. And so if you know what technologies you can use to get around it, like virtual private networks, you can still access information and news from the outside.
Sometimes certain VPNs are compromised.
You have to find other ones.
So what it takes is an ability to be technically fluent, to have a certain capacity to kind
of get around and respond as different challenges emerge.
But I would argue that that isn't all that different from the types of tactics and the adaptations that prior generations of resistance fighters or activists
had to deploy as well. It just happens to be kind of more digitally and technologically
sophisticated. So in my mind, there's a pretty clear through line from one generation to the next.
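As a companion to Steve's point that circumvention is an ongoing adaptation rather than a one-time fix, here is a minimal, hypothetical sketch of the "find another path" pattern: probe a short list of candidate proxy or VPN endpoints and report which ones can still reach an outside site. The endpoints and target URL are placeholders invented for the example; this is a generic illustration, not a description of any tool mentioned in the episode.

# Hypothetical sketch only: check which of several candidate proxies can still
# reach an outside site. Endpoints and the target URL are placeholders.
import requests

TARGET = "https://example.com"  # stand-in for an outside news site
CANDIDATE_PROXIES = [
    "http://127.0.0.1:8080",    # stand-in for a local VPN/proxy endpoint
    "socks5://127.0.0.1:9050",  # SOCKS proxies additionally require requests[socks]
]

def reachable_via(proxy_url: str, timeout: float = 10.0) -> bool:
    """Return True if TARGET is reachable through the given proxy."""
    proxies = {"http": proxy_url, "https": proxy_url}
    try:
        return requests.get(TARGET, proxies=proxies, timeout=timeout).ok
    except requests.RequestException:
        return False  # blocked, throttled, or the proxy itself is down

for proxy in CANDIDATE_PROXIES:
    status = "working" if reachable_via(proxy) else "blocked or down"
    print(f"{proxy}: {status}")

When one endpoint stops working, the probe simply reports it and the user moves on to the next, the same rotate-and-adapt behavior Steve attributes to digitally fluent resisters.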
So I've had a chance to work with operators from the IC and DOD, and there's
similar problems that I think are plaguing both sides. And I count the covert actions that the
agency might do in the IO field as a form of irregular warfare. So in my mind, some of the
problems are, one, we're always behind technologically in terms of the tools and platforms and equipment. And that's really hard to keep up when you just don't even have access to the newest and neatest things that
are out there. And the second one is really about personnel and knowledge. And I also believe that
at least in the IC side, for a long time, the reason we were so slow on the internet and social
media is that there's just this inherent bias
towards classified information. And we had all grown up with FBIS and open source, which had
some value, but that wasn't the same kind of information that we're able to get now out of
social media. And I think people sort of conflated, oh, it's just open source and it doesn't have
that much value. And we were really slow. So from my perspective, it's going to require a mindset shift.
And the third part of this is that I think both sides are plagued by people who are doing
this work who either fundamentally don't really want to be doing it or they're being put in
these jobs without the right expertise or they're being rotated out after three months
or six months,
like a lot of the SOF officers that I work for. If you're going to effectively operate against a
savvy country like Russia or China, you need to be having dwell time on that issue so that you
know it in and out. It's different. So at the agency, a lot of case officers who make their
bread and butter by being out on the street recruiting assets, many of them don't even like doing this kind of work because they're not recruiting people, which is what they were hired to do and what gets them promoted and all that. They're doing these covert actions that aren't as desirable.
What about the innovation labs that have been created to try and solve some of these issues? Do you think this is the right approach for irregular warfare practitioners, or does it require different
or other changes? So I've had a chance in my government work to work directly with In-Q-Tel
and then in my private sector work with DIU, and they're both pretty good at what they do,
actually. And they're great for addressing a lot of irregular warfare technology needs and equipment needs, and I think they're doing a pretty good job with that. But this digital domain is different. And part of why I'm not very hopeful that they're a good solution for this particular thing is, one, again, that strategy issue. What In-Q-Tel and DIU are doing is they're going out and developing or
seeding or buying capabilities that are directly meeting IC and DOD requirements. And those
requirements are being driven by specific IC or DOD mission sets that have a strategy behind them.
But with this digital stuff, there's no strategy behind any of this. So what are you going to task
them to go out and do? You know, hey, go out and get us a new tool that's blah. Well, that's great,
but what are you going to do with it at the end? You know, how is it going to be incorporated into
a bigger irregular warfare campaign that you're starting to develop against whoever or to help
develop resilience in Sweden or whatever country it is that we might be helping?
It occurs to me that one of the interesting things that the Ukraine war has brought is
that it's been kind of this real life laboratory about different ways that adaptive capacities
can be used both on the battlefield, in hybrid situations, and so forth.
And I think, you know, I know from my own personal research that there are all sorts of capabilities that I had little idea how they could actually be deployed, or to what effect, and I've actually been able to empirically observe that and learn a lot just from that alone. And as the war continues to evolve, my knowledge, my understanding of new tools and capabilities is also evolving. And I wonder, to some extent, whether that is something
that can translate to military contractors and others in terms of saying, look, we can build
out more tangible,
concrete scenarios when it comes to our planning. We have a better sense of some of the possibilities
or capabilities that might be deployed or we might need. So following on to that, are there other good examples that you know of where resistance groups have successfully evaded techno-authoritarianism?
Doesn't necessarily always have to be technology beating technology.
Sometimes very traditional organizing efforts can go a long way.
So I'll give you an example from my book.
I did a case study in Ethiopia, and I had a chance to actually talk to some different
resistance leaders against the prior government.
And one of the big issues they were dealing with was internet shutdowns.
So you had a group of people who were fighting against the current regime, and they all of a sudden were blocked out from being
able to access email, social media, to communicate with their followers. And initially, that looked
like it was going to be a pretty significant obstacle. But what ended up happening is they
had so many physical networks that were built up over decades between villages and different
individuals who frankly had actually gotten together and known each other in prison. And they were able to use these networks to smuggle footage across the border that could then get broadcast out, or to otherwise contact people through different means. And so a lot of what I kind of talked about was this adaptive capability
that sometimes, you know, the best way to counteract an adversary or foe who has significantly higher capabilities than the resistance movement is not to tackle that or challenge it head on, but to find ways to work around those advantages and to deploy other advantages that the resistance group might have other than the technological. I would just add, Steve, I think, you know, as powerful as the
digital stuff is for the authoritarian regimes, it also creates a lot
of problems for them.
If you go back to the old Soviet Union, doing your own propaganda materials and disseminating
them among the population, that was a lot harder and a lot riskier.
But with new platforms and new apps constantly hitting the scene, it's really difficult,
even for a government like China that's so sophisticated in this realm,
that a lot of times Chinese citizens are very effectively being able to use new apps. And maybe
they're doing almost like when I was in the army, the frequency hopping of radios, where they're
moving from app to app to app, and they are getting smarter. And they're learning how to
use technology in ways that make it much tougher for
the regime. And then not necessarily resistance groups, but I also was very closely tied to
tracking ISIS and other terrorist group use of the social media platforms. And it was funny
watching them because they evolved their game. And I think there's lessons in here. When we first
started getting onto them, when I was covering the foreign fighters in Europe, well, eventually Darwinism set in and the dumb ones were killed
or captured. And they learned to evolve how they were using the digital domain,
sometimes using the regular outlets for their recruiting and propaganda, but then going out
and using a lot more sophisticated, secure, encrypted types of networks for more of
their financial and operational planning and things like that. And I think the lesson in that
is that we're going to see groups like in China, where you've got very educated, very technologically
savvy people now who are going to be able to continue to challenge the regime in this way.
So I do have some optimism in this regard that not all is lost in terms of democracy and rights in China. Governments are not infallible by any means when it comes to
having these technological capabilities. In fact, one of the things I heard a lot when I was
researching the book in places like Thailand and the Philippines and other places is that there's
a lot of security theater. So governments will purchase sophisticated systems. They'll put out the idea that, hey, we have AI now and we can monitor
everything you say and do. So you better not say anything. And it has this sort of chilling effect
where citizens believe or have reason to believe that they're going to be watched. And so therefore
they start to self-censor, when in fact the truth of the matter is that governments are not always that great, and so much depends on the ability of security forces and intelligence services to act on this information. Some of those are well-organized and very effective, and many of them are not, or only have pockets of excellence but otherwise have a pretty corrupt or not that well-functioning organization. And so I think we have to just be realistic about
the fact that, yes, there is pretty scary technology that's out there, but it is humans
at the end of the day that are responsible for building the systems that will make these technologies work or not. And in many countries, that's not always the
case. Yeah. And just one more point on that, Steve, I think too, if we look at China, they had the
longest lockdowns from COVID, the economy was suffering, they had the fire where people died,
and all of a sudden you started having protests. And I'm pretty sure that social
media or some kind of outlet like that helped organize those protests. And the big surprise
for me actually was that the government didn't crack down on them harder, like in even potentially
a Tiananmen sort of way. And I was wondering why that might've been. And part of that, I believe,
is that the government now knows that this is not the early 90s.
And if they were to crack down on that, those images and what they were doing to those people would be much more easily and readily disseminated within the Chinese population and globally.
And they would probably have had a much bigger problem to address than placating the protesters and then relaxing the COVID restrictions.
But in my mind,
that was a really, really significant tell. It feels like there's still a chilling effect
with Hong Kong as far as their democracy is still having a hard time. It's gone now, right? Is that
just because of the fact that it's part of mainland China? Or do you think it has more to do with the
fact that the Chinese government hasn't fully impacted them the way they have the Uyghurs?
Well, I just think Hong Kong is such a unique case, right? Because just given the history and
everything, and I think the digital stuff has played a significant part of Beijing's ability
to really tighten things up and monitor what's happening there. But this is part of a much bigger set of policing activities, harassing business people who aren't toeing the line.
I mean, it's just such a big problem. And the other part of that, too, is that most of the people in Hong Kong who are really more Western liberal oriented have fled. The first big tranche left after the turnover, and then over the last few years you're seeing a lot of people fleeing now.
And those are the people who you hope are going to be the ones that could be that roadblock
on the autocratic turn that's happened there. But I don't see that happening.
I think Hong Kong is a very classic example of traditional coercion being used. It's a story
about a national security law that has been weaponized deliberately by the Hong Kong government,
backed up by the CCP as a way to imprison anyone who would protest.
And so it is not an example where, if digital technology didn't exist, you wouldn't see a crackdown. There is a crackdown because the Chinese authorities are using any means at
their disposal, starting with a draconian law and linked to imprisonments and public trials that are
completely rigged. And then, as John mentioned,
driving out anyone who has been a ringleader of the protests, either driving them into exile or, if they haven't had a chance to do that, just putting them into prison.
How much of the technology of digital repression is outsourced from other authoritarian regimes?
And how much is actually provided by companies coming from
Western democratic nations? It's probably a mix because if you look at the capabilities that are
out there right now, these are not covered by any government export controls. Like I said, when I got out of government, we were able to give a friendly foreign government in the Middle East a really significant tool that we had not even had when I was working at the Open Source Center. And so these are very easily available
through Western companies. You look at the Israelis and their development of Pegasus and how
the Emiratis and others have used this in a very sophisticated way. And then at the same time,
you have countries like China, you know,
20 years ago, they might have had one or two IT companies in the top 20. And now I think over half
the companies are from China, and most of them are in the IT sector. And so they don't necessarily
need Western technologies or capabilities anymore, because they're developing their own. And then
even if they are developing things that are
really, really powerful and sophisticated, I think with some of their friends and allies,
they're not going to have the same kinds of restrictions that we might impose on exporting
and sharing that kind of technology. And the other part of this, I think, is you look at Huawei and
ZTE and other Chinese firms that are going into the Middle East and Africa and other places and either like on the cheap or nearly free providing all these services and capabilities.
Well, the real crux of the matter is that this is giving them amazing access to everybody's data.
And the Chinese government, this just pains me to say, but I know for a fact that they're doing a lot better job with commercially available data than our own government.
Yeah, I would add and say similarly that I think conventional wisdom says that the proliferation
of these tools is by and large an authoritarian, a Chinese problem. And I very much agree with
John that it's a much more complicated question and that certainly the Chinese are active players right now.
And that's a new thing.
And sometimes they're able to extend the selling and subsequent purchasing of their equipment through subsidies and so forth.
But there are lots of other companies involved in all aspects of this from the United States and other OECD countries to the Israelis and others.
And it really depends on what you're looking at.
But if you look at spyware, for example, I had a working paper that came out recently about spyware. You know,
Israelis dominate the market. Actually, the Chinese and Russians are far behind when it
comes to, you know, the actual selling of commercially available spyware. And so the
answer kind of depends. And even when you look at some of these integrated systems, like these
integrated safe city models and so forth, it can end up being pretty complicated in terms of
actually which countries
are supplying what technologies for this. But I think the bottom line is to look at the end user. Look at the governments themselves who are acquiring these technologies: what are they
doing with it? What kind of controls and safeguards do they have, if any? And I think when it gets
complicated is when you get weaker democracies. So look at a country like India, which is, you know, the world's largest democracy, but with sharp backsliding when it comes to its governance, and increasing concern about censorship and surveillance abuses that are
occurring at the behest of national authorities and regional authorities as well. And so that's
where you kind of get in this like tough, muddy area where, you know, on the one hand, you know,
it essentially doesn't quite fit the narrative of democracy versus authoritarians when it's
essentially illiberal actors, people with their own agendas, some who are anti-democratic,
others who are populist, who are using these technologies in ways that extend their political
objectives. Steve, I wonder if you had thoughts on this same kind of issue with Brazil. I know
that China has been exporting specifically facial recognition technologies and other AI tools to
Brazilian police. Just wondering if you had any thoughts about what the effects of that might be.
Brazil is interesting in terms of the question of what their police are procuring in terms of
facial recognition. I've anecdotally heard and seen some of that. I mean, to me, I would ask
the kind of bigger question when you look at Brazil is, you know, where have been the kind of biggest issues that we've seen democratically?
And from what I've seen, police abuse is part of what we've been concerned about. But frankly,
we've been concerned about the prior president, Bolsonaro, and his use of disinformation and
populist narratives as a way to harass opponents, spread falsehood and lies, and essentially
reinforce his rule. And so there, I would sort of say, well,
what technology are we most concerned about? And frankly, it's misinformation spread over WhatsApp,
disinformation spread over Facebook, false videos that are spread over YouTube and so forth. These
are all American platforms, by the way. And so I think we have to be very specific about what
actually we're concerned about when it comes to the use of digital repression techniques.
Yeah, so really getting some kind of framework around what we're talking about so that it can be actionably addressed. Yeah, that's right. Like breaking it down, looking at country
context and saying like, what is it? Like, let's do a quick political analysis. Let's do a quick
breakdown. We're worried about Brazil's drift into backsliding and deterioration of governance.
Well, what's behind that? Is it a digital problem? Is it something else? You know, is it societal inequalities? And is that being exploited
by a populist leader with dictatorial tendencies? Is it some other kind of issue? And I think that
can help us figure it out. I mean, sometimes it is more straightforward. You know, in certain
situations, when you have Chinese technologies, like in Uganda, that were acquired by the leader
of the country, President Museveni, before the last general election as a way to surveil opponents and then crack down on them. Sure, that is much more of a straight line, but that doesn't always happen. In many situations, it's far more nuanced.
One thing that comes with techno-authoritarianism is the benefits, from a messaging and information warfare perspective, for the state doing the repression. Do you think that the West needs to do a better job of calling
this out on a global stage? And would this require changes to our own practices of data privacy?
And what would this look like? Part of the problem is that on general policy questions, when many people look at how the US sets up its own rules, they find them lacking. So data privacy is a great case in point. You know, I think everyone
recognizes that it's high time that we have a kind of unified national data privacy framework
to guide what is and isn't allowed when it comes to the collection of individual data. And frankly,
you know, when we talk about banning TikTok, that to me, and to most experts, seems pretty secondary to the idea of first setting a data privacy framework, figuring out what the rules of the road are, then figuring out how different platforms are potentially exploiting those rules, and using that as a common basis.
And so instead, what we have is a lot of politicization of these technologies.
I don't think that does anyone any favors.
And I think that both makes things inconsistent from a substantive perspective when it comes to how U.S. laws apply on these questions.
And I think it also sends a poor message globally when it comes to the U.S. setting a standard for
ensuring that regulations at least try to keep up to some extent with innovation. And so I think
there is a lot that we need to do and ought to do when it comes to getting our own digital house
in order, and then using that as a means to kind of broadcast and advance positive, democratic, human rights-respecting messages
that will also provide a good deal of benefit when it comes to counteracting authoritarian
states. So there's a lot of, I think, very positive things that can emanate from that,
but it starts from getting our own house in order. I agree with you 100%, Steve, but I have
zero confidence this is going to happen because
coming up with new regulations and laws and whatever requires politicians that actually
know something about this.
And I just ask you to queue up the Zuckerberg hearings with Congress and the kinds of questions
that he was getting.
And that gives me zero confidence.
Or if you hear about the discussions about TikTok and banning it here or there, well, I'm sorry, but most of these discussions are framed around kind of idiotic assumptions. TikTok is just one platform, and I'm sure the Chinese can get access to Facebook and Twitter and name your social media platform easier than the U.S. government can now, because after, of course, the Snowden leaks, companies like Twitter said, we're no longer giving our data to the government. But I'm pretty sure the Chinese are finding cutouts or ways that they're
exploiting all this other data that U.S. citizens are unwittingly giving away when they get to the end of every app's terms and say, I concur, because they're getting it for free, giving away their rights to that company to sell that data, you know, the ad ID stuff, to any kind of company that can even turn it into a tool to track your movements over a 90-day period and figure out where you work and where you live and all of that. Well, these are such massive privacy
issues. And I don't know how we even in our government right now wrap our hands around that
because it's a complicated issue that requires
people who actually need to know something about it. And we don't have that right now,
at least from what I've seen. Yeah, that's right. I mean, it immediately calls to mind the data
broker industry. Talk about a vulnerability that Chinese and other adversaries can exploit.
They will sell to anyone at any time with no threshold in terms of any kind of limitations.
All you need to do is come up with the money for that. There's the vulnerability. I mean,
there's so many ways that third-party brokers and others are trafficking in personally
identifiable information with zero accountability from the government.
So what role can or should technology companies play in mitigating the negative impacts of digital
repression and in promoting online freedom? I don't think we can assume or even really hope
that these social media companies are going to be able to self-police themselves. I think we've
seen some efforts both in regards to terrorist groups' use of their platforms. We've seen it with a lot of
the weirdo groups out there that are doing conspiracy theories, and we've seen it all over
the board. Or even like trying to block out Russian bots or Chinese bots. Periodically,
you'll see these companies announce that they've closed down 20,000 bots or something like that.
And my impression is, big deal. How long does it take them to kick off
20,000 more that it's going to take you another three to six months to figure out and then have
them canceled? They've already done their damage. And so I don't think we can expect companies to
do this effectively. They're obviously financially driven and there's huge revenues to be made in
China and elsewhere. Are you really going to take on the CCP if it's going to come down to your bottom line?
Some companies might do that, but I don't think that many will.
And what we tend to see is some kind of small concession being made.
But at the end of the day, the companies end up kowtowing and toeing the line with the regime.
I think it's just such a massive problem that's, if anything, getting worse with
all the AI capabilities out there. And we're just plodding along, hoping to find a solution, but I don't see it right now. Yeah, I agree. It's a huge problem. Where to start? The newest
thing that people are talking about these days is generative AI. There were a number of hearings
that took place recently with Sam Altman, the head of OpenAI, which makes ChatGPT. They were better than the sort of embarrassing hearings John rightly pointed out, the Mark Zuckerberg, what-is-Facebook-and-how-do-they-make-their-money type of hearings. And so there is a learning process going on, albeit a slow one, when it comes to our policymakers. And I think there are
some clear pathways of things that need to be done when it comes to large language models.
And I think this is where hopefully we can learn some of the lessons of the past in terms of falling down when new disruptive technologies come into the public. And there is a moment where
you can actually do something, whether it's setting up an AI regulator or putting together
some kind of oversight body, or at least doing something to otherwise offset some of the
manifold harms that will result from ChatGPT and other large language models as they kind of permeate and ripple through society.
So I'm hoping that maybe we can see a little more progress than we have.
But John is right that the track record is poor.
So how should irregular warfare practitioners adapt their approaches and strategies to
effectively counter state-sponsored digital repression and help resistance groups? I think this is part and parcel: irregular warfare
has to be inclusive of the digital domain from the get-go. Any campaign planning we're doing,
any kind of resistance training we're doing for allies, that has to be part and parcel of it.
And it's not something you just layer on at the end and say, oh yeah, we need to think about our IO capabilities when we're going into this.
And that's, I think, happening. I mean, you even see it in our department at NPS, where there's a
whole curriculum on IO and our students are coming through there and they're learning what the
Chinese and the Russians are doing. And they're also getting courses on irregular warfare and resistance. And a lot of this is getting blended
now. And I think this is 100% essential because it gets at that education part. And most of our
students tend to be at the O-3, O-4 level, and they're going to be going back to their commands
and they're going to be much better suited, I think, to be incorporating some of this in the initial campaign designing
that they need to do.
But it's tricky because it's going to require, I think, a longer term approach to make sure
that we've got a force that's educated, we've got IC officers who are educated, and that
we're able to get them the right tools and capabilities in a timely manner.
And that's just a whole other basket of problems that for the U.S. government,
it's really, really complicated.
And you see fits and starts of progress with that.
But at the end of the day, it's still USG acquisition.
And we're not known for being real speedy and quick.
I'll just leave it at that.
I'll answer the question by broadening it out slightly in the sense that I will respond
to what can democracies do to help groups on the ground resist, successfully resist
and counter state-sponsored digital repression.
And my answer to that would be three things.
One, I think democracies can help build norms of conduct in terms of the types of behavior
governments ought to exercise when using digital technology.
So just as certain types of conduct, even if still used, are frowned upon, whether it's
mass imprisonments of individuals or torture, the same thing sort of ought to apply in the
digital realm when it comes to tactics ranging from mass surveillance to Great Firewall-like censorship to internet shutdowns.
If you can build norms of conduct and consensus, I think that'll help quite a bit.
I think second is working directly with groups on the ground and adapting to local contexts.
So those resisting repression in Iran have a certain set of issues
to deal with. Those resisting repression in Hong Kong or in mainland China have a different set
of capacities. And those, likewise, who are pushing back against Moscow authorities have
other capabilities they require. And finding means and the ability to help provide technological and
other solutions for these resistance actors, I think is essential.
And then finally, I think, as we were just talking about with companies, companies don't
get a pass.
And we can put pressure on companies as well, both through formal and more informal mechanisms
to try to do the right thing when it comes to stopping them from selling powerful capabilities
to autocratic regimes, and when it comes to tamping down on propaganda that is undermining democracies as well as enhancing authoritarian practices.
But ensuring that companies are living up to their responsibilities and commitments to a larger
extent, I think is also essential. I think those are three pillars that could do a lot of good
when it comes to countering state-sponsored digital repression.
John, Steve, thank you so much for joining us on the Irregular Warfare podcast.
Yeah, thanks for having me. It was a pleasure talking to John and talking to the two of you as well.
Yeah, thanks, guys. This was fun for me.
Thank you again for joining us for episode 84 of the Irregular Warfare podcast.
We release a new episode every two weeks. In the next episode, I talk to John Nagl and David Kilcullen for a 20-year retrospective on irregular warfare and COIN with my co-host, Louis.
After that, Matt and Laura talk with Dr. Oona Hathaway and Congressman Tom Campbell
about congressional oversight. Be sure to subscribe to the Irregular Warfare podcast so you don't miss an episode.
The podcast is a product of the Irregular Warfare Initiative. We are a team of all volunteer
practitioners and researchers dedicated to bridging the gap between scholars and practitioners
to support the community of irregular warfare professionals. You can follow and engage with us on Facebook, Twitter, Instagram, YouTube, or LinkedIn.
You can also subscribe to our monthly e-newsletter
for access to our content and upcoming community events.
The newsletter sign-up is found at irregularwarfare.org.
If you enjoyed today's episode,
please leave a comment and positive rating on Apple Podcasts
or wherever you listen to the Irregular Warfare podcast. It really helps expose the show to new listeners. And one last note: all views expressed in this episode are those of the participants and do not represent those of Princeton, West Point, or any agency of the U.S. government. Thanks again, and we'll see you next time.