Irregular Warfare Podcast - Dynamite to Drones: The Diffusion of Lethal Technology to Terrorists and Insurgents
Episode Date: April 8, 2022
From dynamite in the early twentieth century to drones, bioweapons, and private-sector satellite constellations today, lethal technologies are increasingly available to nonstate actors and individuals.... At a time when states are focused on competition and potential conflict between great powers, the decentralization of today’s low-end technologies could equip nonstate actors, private companies, and terrorists with unprecedented irregular and asymmetric capabilities. In this episode, Professor Audrey Kurth Cronin and Major General Patrick B. Roberson join to discuss the history of technological innovation, examples of current and burgeoning technologies that will impact future warfare, and how governments can (and sometimes cannot) regulate the development and distribution of potentially dangerous technologies to malign actors.
Intro music: "Unsilenced" by Ketsa
Outro music: "Launch" by Ketsa
CC BY-NC-ND 4.0
Transcript
The entire capability of a population, if it's willing to resist, everybody can be a
resistance fighter now.
If you have a phone, you can be a resistance fighter.
You don't have to be Anonymous.
You don't even have to be a hacker.
You can just have a phone and you can be part of the resistance.
I think this technology is changing a lot of the way that we should be thinking about war in the future.
Policymakers are not coming up with enough creative ideas for watching what's happening on the low end of technologies.
And thinking about the combinations of technologies, as we've been discussing,
we need to get Congress much more up to speed on what the
downsides of various technologies are. Some of that is already happening,
but we're really at the very beginning.
Welcome to Episode 50 of the Irregular Warfare Podcast. I am your host, Kyle Atwell,
and I'm excited to introduce as my co-host, the newest member of the Irregular Warfare Initiative team, Benjamin Jebb.
In today's episode, we ask how the increased availability of lethal technologies down to individuals and non-state actors impacts national security.
At a time when states are focused on great power competition, our guests argue that the decentralization of lethal technologies allows
non-state actors to increasingly challenge the international system. New technologies can be
good for society, but they have a dark side. In particular, they often enable non-state actors
to pose irregular threats to nation states. Our guests discuss the history of this technological
change and its implications for future warfare. Professor Audrey
Cronin is Distinguished Professor at American University, Washington, D.C. She is the author
of the book Power to the People: How Open Technological Innovation Is Arming Tomorrow's Terrorists, which is the focus of today's conversation. In addition to multiple academic
accomplishments, Professor Cronin has served in the Office of the Secretary of Defense for Policy. Major General Patrick Roberson leads the training of US Army special operations forces and has deployed multiple times in support of operations including Enduring Freedom and Inherent Resolve. You are listening to the Irregular Warfare Podcast,
a joint production of the Princeton Empirical Studies of Conflict Project and the Modern War
Institute at West Point, dedicated to bridging the gap between scholars and practitioners to
support the community of irregular warfare professionals. Here's our conversation with
Major General Patrick Roberson and Professor Audrey Cronin.
Professor Audrey Cronin, Major General Patrick Roberson, welcome to the Irregular Warfare podcast.
Thank you so much, Kyle.
It's a pleasure to be here.
Yes, absolutely, Kyle.
I appreciate being invited.
Thank you.
Well, our motivating question for today is how has warfare changed as a result of more
and more individuals having access to lethal technologies?
Or in other words, it is essentially easier for non-state actors like terrorists and insurgents to get their hands on powerful lethal technologies.
Audrey, you wrote a book on this topic.
I'll start by asking what motivated you to write this book?
Well, I was watching all of the attention, especially in the United States, shifting to the centralization of power and to great power conflict, to focusing on peer conflict, which is a very important element of what faces the United States.
But I was also worried about the very important aspect of decentralization of power.
And it isn't as if those two things are mutually exclusive.
They're actually quite intersecting. And if you don't understand both of them as part of the
holistic nature of what we're facing in conflict, I think you end up with a very stovepiped view
of what's going on. So the reason I wrote the book was to focus on the aspect that very few
people are still focusing on, which is the decentralization
of power, not just to your traditional non-state actors like terrorists and insurgents, but also
decentralization of power to the private sector versus what we used to be used to in the 20th
century, which was emphasis on the public sector. And Professor Cronin, a lot of the
decentralization you focused on is the decentralization and spread of lethal technologies, correct?
You have a lot of cheap, easy to use, simple technologies, and you have a huge number of people who are using them.
That's a kind of a swarming that can be quite effective even against far superior and more powerful technologies.
So just trying to look at both sides.
Yeah, I think that the drone, in the context of just unmanned aerial surveillance platforms, is an interesting evolutionary piece, because when I
first started fighting in Iraq in 2003, we had very little unmanned aerial surveillance.
And then later on, it started to increase and increase. And as you were going through these
series of wars and deployments as we're going through it, you thought like, we are the ones
that own this technology. Nobody else has this technology. And then when I got to go back and fight against ISIS, you see, okay, they're learning from us,
and they've taken something that was very expensive for us to make and manufacture,
created it cheaply on the civilian side, and then figured out how to use it in war,
whether they armed it or used it as a surveillance device. And now you see our most high-end UAS with the capability to fire
some type of weapon system off of it being produced in Poland, being produced in Turkey
for export for, I think, a fraction of the price it would probably cost us to make something like
that and used, I would say, to great effect. So those are some fascinating examples of how
contemporary non-state actors innovate. For Professor Cronin, in your book,
you discuss two very different types of innovation. You've got closed and open innovation.
Just to frame for the listener, could you delve into those a little and maybe provide some
historical examples of how closed or open technological innovation impacted the national
security environment? Closed technological innovation is what we mainly had during the 20th century. So
you had the building of major combat systems and major capabilities that was heavily controlled
by security clearances, under government auspices, oftentimes in secret. The most iconic image of
this is, you know, working in the Manhattan Project and building nuclear weapons, but also things like
building ultimately the F-22, the Arleigh Burke destroyer, the Trident II SLBM, increasing
capabilities that were developed in certain channels and controlled by the
government. That is what I would call closed technological innovation. But what we have now
is open technological innovation, which is driven by commercial processes. In about 1993, there was a deliberate decision to release quite a number of our digital capabilities that had been developed and to make them publicly accessible. NASA and the US Air Force drove the development of microprocessors.
Google built their search engine with National Science Foundation grants. All of these things
became available and were building upon a deep foundation of government-run, very expensively
developed digital technology. In fact, virtually all of the components of our smartphones came from
digital technologies that the US government developed, and in some cases the U.K. developed in the 60s, 70s, and 80s.
So in about 1993, there was a deliberate decision made that these would be made more accessible to the public.
What you had was innovation that occurred because companies were gaining profit, because individuals were able to tinker with them, because there was a desire to make technologies accessible to a broader number of people.
And for the most part, that's been a good thing. Certainly the development of access to the Internet, the information that we're receiving from the Internet has entirely changed the ability to learn more about what's going on
in the rest of the world. That's all great. But the other side of it was that in an open
technological revolution or in a period where you've got that context, military innovation is
totally different. You can't use the same models that we developed so rigorously, particularly in
academe, but also in the military and in R&D, about how exactly the weapons acquisition process worked. Those models don't work when the context has
changed, when you have an open technological innovation situation. So instead of building
weapons that are driven by the national interest or by what a government wants to accomplish or by
politics, oftentimes
you have weapons and capabilities that are driven mainly by profit, accessibility, high desire to
make them cheap, and to be able to sell large numbers internationally. So it's a different
context. And that is changing conflict, not necessarily removing the impact of major weapons. It's not either or,
it's just changing who can show up on the battlefield and actually have an impact.
And that's important, I think. It's not just which side has the better technology, but also
who can access the technology and perhaps have either an impact on the battlefield or even a
political impact, which as we know from Clausewitz, the political objective is more important than the military objective in all wars.
So the key insight, if I understand correctly, is that this has made lethal technologies more
and more accessible to non-state actors, essentially. Some of them might be positive,
but some of them might be very negative because they can use those weapons against the state
at the end of the day. Yeah, or just use them in ways that are unpredicted. And remember
that when I'm talking about non-state actors, I'm not just talking about dual use. So like
military and civilian, I'm talking about a bigger range of people who have access. In some cases,
civilian individuals or companies are actually ahead of the military in that small piece of the technology, and the military is behind. So you have to use a kind of more granular approach in talking about the actors that are using these technologies. It's not just groups like Anonymous; they could be mercenaries, or there's a
whole muddy group of different kinds of actors.
And we'll throw them under the name non-state actors, but we're not just talking about terrorist
insurgents anymore.
You know, Professor Cronin is spot on.
I would like to add a few things to this and say rates of innovation matter a lot also.
It's not just like what you're getting or what's available.
It's what can the adversary or what can you innovate at a rapid speed.
I think when you enter into a lot of these conflicts and wars, wars are engines of innovation.
And I think that matters a lot.
Like the idea that an enemy could take what you have and say, you know what, I can use
that in a better, more creative and interesting way than even you're using it.
I thought ISIS was a very good example of this because they were a non-state actor that wanted
to be a state actor. So they would actually take everything that we had and figure out,
how do I manufacture the things that the Iraqi army or the Syrian army has in a weapons
manufacturing facility that used to be an auto parts manufacturing plant?
And how do I modify something like this and make it more useful to me?
All the way from, can I make something like a rocket launcher from scratch,
where I couldn't get that imported before?
Their rate of innovation was, in the beginning, I thought, extremely fast.
Maybe even faster than ours.
It took us a few moments to adjust to that.
Major General Roberson, I'd like to just follow up with that point quickly. Could you provide
an example of a time you saw an organization or non-state actor innovating
quickly on the battlefield? Absolutely. If I could go back and just think about what I've seen
on this idea of innovation over time, I would say that in the beginning of the 2003 Iraq War,
the insurgency was using a lot
of material that they had captured, like old artillery shells, recoilless rifles. They were
using a lot of that material. Then they evolved to the point where they'd run out of that material.
But in the meantime, they'd figured out, okay, I can use fertilizer to make weapons. I can do EFPs.
I can do all kinds of other different kinds of things. And as I was watching these adversaries, I could see them go through like different kinds
of rocketry that they had manufactured to innovate.
And I would say one of the things that always struck me is the things that didn't work.
Many times they would get rid of them rather quickly.
And I thought that they had used chemical weapons sometimes.
Al Qaeda had used chemical weapons.
ISIS had a chemical weapons program.
I think these people thought that this was going to bring them kind of the respect that they deserved. But I also
think that they came around to believing that these were not projects that were really worth
their time. And at the end, I think they felt like they weren't getting the bang for the buck
out of that. So I don't want to say it's a failed innovation on their part, but they were
experimenting with it, which I thought was very interesting. This ability of non-state actors to
quickly innovate and access technologies, does that give them a competitive advantage in conflicts?
Can they impose costs on standing militaries at kind of a disproportionate rate? I remember during
the Iraq war, the cost of an insurgent to make an IED was significantly lower than the cost of the
MRAP that was carrying the American soldiers in it. Does that give insurgents and non-state actors a competitive advantage?
Yeah, I absolutely believe that it does. And since we're talking scale, one of the things that I
thought ISIS did extremely well, particularly in the fighting leading up to Mosul and before,
was they had a new technology that they were using, the DJI Phantoms. They had combined that with a human
suicide bomber and a vehicle-borne IED. And they were using this technology combined with a human
in a way that like, okay, we had not seen that before. They were using these drones to guide
these suicide bombers in against the Iraqi and Syrian army. And I would say they were able to do it at a scale that we had
never seen before in the suicide bombing arena. You would see a few suicide bombings maybe a day
during the 2003 to 2010 piece. When you went to fight ISIS, they were sending 10 of these VBIEDs
against you a day or against the Iraqi army. It was on a scale that was unprecedented with that technology that they had innovated and brought together to use against us and the Iraqi
army. Wow, that's very interesting. And I do think that this tendency to combine a lot of different
technologies, General Roberson, is one of the things that makes innovators, insurgents, terrorists so
effective because they combine
things in new ways that are much harder when you're in a bureaucratic structure and you've
got stovepipes and different sort of lines of innovation. And going back to that, using it in
a new way, I would say the idea of using these VBIEDs against the Counter Terrorism Service,
against the Iraqi army, it terrified them also. The idea that they had a technology that was
hard to stop in the sense that they were guiding these basically almost like cruise missiles
against us, combined with the fact that they had a person that was willing to die on large scale
for their project. So again, I think that was devastating against our partner, the Iraqi army.
Now, eventually we were able to come in and kind of clean up the battlefield after a while, but did it give them a competitive advantage? Yes, I think it gave them a pretty good
competitive advantage over the Syrian military or the Iraqi army and also against other terrorist
organizations because they were competing against other formations of insurgents and terrorists also.
Something I've thought about as the Russians have staged a fairly conventional invasion of Ukraine,
and Ukrainians have had some success with key pieces of technology of destroying Russian helicopters, Russian tanks,
is if modern technology, the kind of ability to access some of this technology, favors the defense, essentially.
Does it favor the insurgent in these fights?
And is technology kind of maybe one of the key reasons that counterinsurgents have struggled over the past couple decades?
Yeah, I'll jump on this one because I think it's interesting because you kind of mentioned the
anti-tank guided missile piece. And I was an O-6, colonel-level commander, in Afghanistan in 2013.
And one of the things that we feared the most was just an RPG attack against some of our
vehicles. And RPGs were not new technology, but they're still very effective against the vehicles
that we were traveling in. And I fast forward myself to Syria in 2017 and 2018, and our Turkish
NATO allies were always talking to us about, do not ever give your partners anti-tank guided missiles like TOWs or Javelins
because they knew from ISIS and other formations
that these things were devastating to armor formations
and they lost a lot of tanks.
So they were on the learning end in a way of this,
I don't want to call it a new technology,
but it's evolving and it's getting much better
on the anti-tank guided missile piece,
top-down attack, combining drones with anti-tank guided missiles. But yes, it looks to me like, with the weapons as we've
evolved them on both the open side and the closed side, an individual soldier or person has much
more killing power than he previously had, including killing power against a tank or armored formation
that he would never have had, I would say, previous to this.
Russia has been using non-state actors like the Wagner Group, mercenaries, in a very targeted way and in some ways in very leveraged and effective ways. Other non-state actors that have had
enormous impact upon what's happening in Ukraine include people like Elon Musk, who gave the
Ukrainians access to Starlink and there are 2,000 satellites out there. It's
hard to take down 2,000 satellites if you're Russia. So this is another example of a cheap
technology, provided by a non-state actor, in this case a very powerful and wealthy and smart businessman,
that is having an enormous impact in making Ukraine much more resilient
in its communications. So we're seeing a lot of
non-state actors who are showing the capabilities of digital technologies in new ways, particularly
on the side of the Ukrainians. I would like to ask you, sir, though, to someone like me,
you know, just an observer, it seems that the lesson with respect to Russia is their inept
use of ground-based conventional force. It's really been quite surprising, hasn't it been?
I believe what they thought was they were going to have the kind of success in the Ukraine that they had in the Crimea,
they had in Georgia, that they had perhaps in Syria.
And I think they looked at what we had done in Desert Storm and what we had done in 2003 against the Iraqi army
and thought, mistakenly, that they could imitate that
and that they were up against a soft target.
And a lot of the great things
that the US government had done
to bolster the resistance level of Ukrainians,
I think that he had discounted.
I mean, because we're talking about like,
what deters a country?
Do large weapon systems deter a country?
Basically, does the fighting power of a country itself, would that deter somebody? Would someone walk into a country like the Ukraine
knowing that they have such a resistance capability? Two months ago, maybe they would
think twice. Now, because again, it kind of goes back to this idea of technology,
as you brought up, combined with the will to resist, or an individual person, you can leverage,
I would say, the entire capability of a population if it's willing to resist. Everybody can be a resistance fighter now.
If you have a phone, you can be a resistance fighter. You don't have to be Anonymous.
You don't even have to be a hacker. You can just have a phone and you can be part of the resistance.
You can even be in a foreign country and be leveraged. I just read there's 500,000 people
that have basically enlisted in the Ukraine's
cyber defense force to attack the Russians. So yeah, I think this technology is changing
a lot of the way that we should be thinking about war in the future.
That's interesting about the deterrence because in the past, if you wanted to help Ukraine deter
a Russian invasion, you might think, oh, we need to give them high-end capabilities like
tanks, or we need to give them an Article 5 NATO kind of commitment or fighter
aircraft. But it might change the calculus for future states to invade countries at much lower
thresholds of technology, given that they can have a devastating asymmetric impact.
That's exactly what I'm saying. I think getting to Professor Cronin's book,
the idea that she's talking about non-state actors,
I would be talking about just an individual. An individual can have a lot more power.
Kind of goes to the dynamite question. Previous to dynamite, it was a lot harder to have a person
have this kind of explosive power. That power is exponential in the sense of like a cell phone
or an anti-tank guided missile or anything like this. Individuals have a lot more power,
both in killing power and just in information power.
Professor Cronin, I'm fascinated by this idea you bring out in your book about how
open technologies can often intersect in ways that are mutually beneficial.
More specifically, you discuss how at the end of the 19th century, a global anarchist movement
was able to capitalize on a sensationalist media
environment and distribute easily disseminated build-your-own-bomb pamphlets alongside commercially
available dynamite, and all this kind of coalesced to inspire fear in the public imagination.
Could you expand on that topic? And I guess, do you see any corollaries from that era with kind
of modern warfare right now?
Yes, I think that the broader political context in which they were operating is very similar to ours because they had tremendous open innovation, as we've already discussed, including using
dynamite. They had a new form of communications, which was mass market newspapers that had not
appeared until the end of the 19th century. And those two things acted upon each other so that those who engaged in dynamitings got huge publicity.
And there was inspiration of other people, people that didn't know each other.
There is an argument, made mainly by academics about al-Qaeda, that they were the first group whose members did not have direct contact with each other, that they simply inspired other people to follow them, and ISIS as well. I disagree, because
the late 19th century and how the anarchist wave, which was the first wave of modern terrorism,
how that wave unfolded was all about exploiting mass market newspapers who would then have these
sensationalized stories about how dynamitings had
killed all kinds of people. And oftentimes they were wrong and the numbers, the casualties were
higher and people were both horrified and fascinated by it. And so the anarchists gained
more followers and there were more attacks and little by little anarchism, which began in Europe,
spread to every continent. That first wave of modern terrorism was extraordinarily violent, and we don't even think about it
today, but it spread to every continent except Antarctica, and people were copying what they
saw and read about through the newspaper.
The other thing that was important, though, which I should add, is that there were manuals
about how to build dynamite bombs. And they became very popularly accessible too,
because the technology to print small pamphlets became much cheaper.
And people could buy these five cent pamphlets to learn about how exactly to do it.
Certainly a newspaper is not like the internet,
but that element, that ability to learn as well through
pamphlets that were distributed and encouraged by many terrorist groups throughout the world,
that was also very important. So I see that parallel with how we've had a tremendous wave of
terrorism within the last 20 years. I don't think that that's likely to decrease.
Yeah, I'll add on to that. I talked about evolution and the rate of innovation
that organizations have. I thought when we were fighting ISIS that they had evolved to take on the information environment and the internet in a new kind of way. I don't know that al-Qaeda had done quite as good a job, but when ISIS stepped in, they were doing things where you thought, okay, that's very clever to do that. And they were doing it for a variety of reasons. I mentioned the idea of competing with other groups. ISIS had al-Nusra
Front to compete with. It had old al-Qaeda to compete with. It was trying to set itself apart,
and it was able to use the internet and social media and its magazines to do these kinds of
things. In a way, propaganda of the deed was one of the ways that they would do it. They had a
marketing campaign of violence, extreme violence, to bolster their own troops. But also, I would say, they had
a gigantic recruiting campaign. They had gone global with their entire platform and were bringing
in recruits from all over the world. They weren't just advertising a military entity. They were
advertising they were a state, so come live with me. And they owned territory also, so that was
something that was a little bit different, where they were able to, I think, come out of the shell and say, hey, this is us. Come join us. See what we do. That was a big evolution in how they used the Internet and social media.
What struck me at the time was that ISIS was willing to be outrageously brutal. And they were trying to do something different. They were trying to use that sensationalized violence to gather people to them.
And that makes them more similar to what was happening in the 19th century. They were not
necessarily trying to mobilize their followers. They were trying to mobilize people who wanted
to have agency and who wanted to get out there and fight with them.
So up to this point, we've discussed open technology.
And from what I understand, this sort of open technological diffusion sometimes escapes
government control.
Professor Cronin, in your book, you discuss the issues that government regulators face
when trying to stop the spread of commercially available explosives.
One example that easily comes to mind is Alfred Nobel, who not only
invented dynamite, but then found clever ways to increase production and distribute it globally, which is both good and bad, but it confounded government efforts in the US
and abroad to regulate the explosives market. Can you discuss a little bit of the challenges
governments face today when regulating the spread of lethal open technologies?
Alfred Nobel had his greatest success
when he could maintain control over the patent for dynamite.
And he was extremely smart about how to do that.
He was successful in some places,
but unsuccessful in other places.
But what he was really brilliant at
was working with local regulatory regimes.
And he actually was begging people sometimes, for example in the UK, for better regulations so that he could sell more dynamite there, because there had been some very
high profile accidents that had killed a lot of people. And he wanted to be able to convince his
buyers that dynamite was safe to transport and use. And so he begged the UK for a regulatory system. And that system ended up being run by former military officers who had in many cases served in earlier wars.
And then also academics who were experts on high explosives, they worked together,
the military and the academics, in order to come up with a reasonable regulatory system that allowed the UK to use dynamite very effectively. And Alfred Nobel got extremely
rich from what he was selling both to the UK and all of its colonies. So I think the message there
is that when we look at major technology companies today, and also at the types of technologies that
are diffusing among all of the actors we've been discussing,
to see this as a situation where we can't have any regulations or any guidelines that allow us to use those technologies in the most effective ways that protect the public interest,
I think that's wrong. Because if you look through history, many people have gotten much richer as a
result of good regulations or good
guidelines on lethal technologies like dynamite in Alfred Nobel's case.
Sure. I mean, Professor Cronin, you bring up some good points about the trade-offs between
too much and too little regulation when it comes to new tech, especially in the national security
realm. But could you give an example of what the government can do today to regulate modern technology companies?
What I think we can do is educate commercial actors in the implications of what they're
doing with respect to war and peace.
You know, the perfect example right now is Meta, because they've been struggling with
the fact that their moderation of what appears on, for example, Facebook, but also WhatsApp and Instagram, what appears on those
platforms. If they use their moderation policies according to how they're written, they're not
allowed to say anything in favor of Ukraine and against the Russians. And so they've made an
exception to allow Ukrainians to call for blood from the Russians. They believed that they could
be completely neutral in war and peace. And I think that they've been mugged by reality when it comes to a war in which they're supportive of one side and not the other.
So I think actually that we can work with organizations like Meta to convince them that
what they do affects the outcomes in ways that they haven't previously admitted and perhaps help
them build better policies that support national interests, our national interests.
Just how accessible is significant technology today?
Are there some technologies you've seen, General Roberson, where it has been surprising that non-state actors have been able to get access to them?
Absolutely. Yeah, I think about this technology a little bit too, the proliferation.
Particularly, I think more about drones, but I'm sure there's other things that are out there. A good lesson, I think, is the Houthis
and their ability to take a drone, modify it, and then fly it across three countries and basically
land it as a suicide bomb wherever they want it. I assume this will proliferate. I can't see why
a non-state actor, terrorist organization, or anybody else wouldn't use that technology, much like dynamite. It doesn't seem to be that difficult to do. The idea can proliferate on the internet. I think the drone piece is one that we have to be careful about. Those are not very hard to weaponize.
Swarming technology is another area where it wouldn't take much. I think the other piece to this whole thing that wouldn't take much to figure out how to do is the idea of facial recognition, human recognition, this idea that I can use technology to positively identify you or seek you out in a crowd. I think that's something states are working on,
but I assume that is not hard to export to the individual level, particularly corporations,
if they wanted it.
I think much of the ISR that we've used or the ability to look at something is also available
in the commercial sector, and individuals are able to leverage this also.
I see this technology, again, it goes both ways.
Things that were used in war can be used in peacetime against a population.
So I think these are just things that we have to be aware of
that are out there. Yes, I couldn't agree more, particularly this example of facial recognition
technology. And getting back to the earlier point, we were talking about combining different
technologies. So if you used FRT and a quadcopter that was loaded with a small explosive projectile
and then had a limited amount of autonomy, perhaps. I don't want to go
into too many details, but there are a lot of ways that each of these technologies can be
combined and used by individual actors, both with respect to terrorists and insurgents,
but also criminals. I think that much like drone technology, the genie's out of the bottle. We just
have to figure out how we're going to regulate it and control it, much like dynamite, as you point out. The thing that's harder in our case now is that dynamite
at least was one thing. So it could be regulated as a very powerful and easily leveraged new high
explosive technology that people could stick in their pockets, but you could actually deal with
it as a single thing. Whereas what we're talking about, General Roberson, is
these things are on separate trajectories of development. And so getting those who are
building FRT regulations to cooperate with those who are building drones, each of these different
parts of our commercial processes do not naturally work well together. And the federal government is
very slow at regulating most things. So it's a bigger challenge, I think.
So we've all been discussing the role that drones play in modern warfare,
but if we just pause for a second and kind of scan the horizon,
are there other open technologies that you two think we should be wary of?
You know, the things that immediately come to mind are machine learning
and artificial intelligence, biotechnology, 3D printing,
additive manufacturing, just to name a few. Do you think that these items will have a decisive
and major role to play in the not too distant future? Well, yes. 3D printing is not a new
technology. Additive manufacturing is very old, actually. But what's really new is the ability
to share digital CAD files. And also 3D printers themselves are getting
more and more advanced. And the kinds of materials that can be used are getting easier to buy,
they're getting cheaper. So 3D printers are very useful already for things like ghost guns and,
you know, some of the untraceable weapons that are being used. So that's something that I think
is quite concerning. Certainly the ability to integrate
different systems through simple AI, that's something I see a little bit further down the
road. The various ways of combining the technologies, I think, are where the future lies.
And those are two that I would mention. Yeah, I couldn't agree more. I think it's the combining
piece, the idea of using AI to make
whatever you have faster or autonomous in a way. I mean, the reason we talk about drones, I think
right now, most drones are controlled to some degree by a human. You could pre-program them to go
wherever you want. But once you have better machine learning or better AI, you can use a lot
of these things autonomously. And then I think you've turned the kill chain up a few degrees.
That's something a state can do or a non-state actor or an individual also. And I think that's
what you have to be careful of. Just combine that with exactly what Professor Cronin talked about,
the facial recognition software, the autonomy of some type of device to act on its own.
Yes. And we probably won't go off to talk about this very much, but the other big area that I'm really concerned about is synthetic biology, biotech that can be creating new molecules
or replicating old molecules, the ability to gain information about the DNA sequencing
of difficult things like smallpox.
These things are widely available, and you're going to be able to buy tabletop synthetic
printers soon.
Some of them are already available.
I'm really glad you brought that up because that's one of the technologies that stood out to me.
When you talk about the idea that the United States and other great powers
have become very comfortable with the idea that they have a kind of preponderance
of ability to impose lethal force on others,
what we're talking about is how technology has changed that.
Take CRISPR, clustered regularly interspaced short palindromic repeats. I won't pretend like
I had that memorized. I had to pull it up on Google to remember the acronym. But the way it's
been kind of instructed to me is that in the near future, essentially at a home lab, an individual
in a house will be able to potentially create lethal biological weapons that could have massive
global impacts.
And that's kind of something that, as was said, the genie's out of the bottle on that.
That is exactly what I'm talking about. And CRISPR is only the beginning. It's in some ways the most
basic of the tools that are available. There are lots of other comparable gene editing tools now.
And then there's the insider threat, which is quite worrisome. We've had a lot of experience in the past, historically, with insider threats being people
who are, for one reason or another, disaffected and transfer that technology into other uses
in civilian life, either criminal or terrorist, or they transfer them to terrorists and criminal
groups.
But then also, after the insider threat comes just ordinary people who are trying to learn
exactly how to use DNA sequencing, and it's getting easier and easier to do so.
Absolutely. I mean, the fact that they came up with a vaccine for the coronavirus in a record amount of time with a new type of vaccine, pretty amazing.
So it just kind of demonstrates this idea of the compression of technology and time, I think, and space.
So, yeah, we're going to see that.
We are. And if I could just add, though: some of it is so exciting and possibly going to solve really horrible diseases, cancer perhaps eventually, cystic fibrosis; we're much
closer on sickle cell anemia. There are lots of exciting developments in this area, but
unfortunately it has its dark side.
Absolutely. Just like dynamite, I just keep going back to dynamite. It's got a good side and a bad side.
Exactly.
General Roberson, you said something in the pre-call that I think is worth bringing back up, which is our ability to kind of respond to innovations by insurgents and terrorists that we've worked against.
Have we been pretty good at keeping pace on our end with innovations when insurgents started using new technologies
we didn't anticipate like quadcopters and stuff like that?
When I had first gone over to fight against ISIS,
I was familiar with the idea of using unmanned aerial systems,
but as I mentioned earlier,
only really from a perspective of a high-end piece.
We had just started to get a lot of our smaller unmanned aerial systems.
But what I saw with ISIS was this idea that they had taken a commercial off-the-shelf platform and they'd been able to both weaponize it and use it in ways that we hadn't thought of.
I already mentioned the idea of using a quadcopter as a guide for a vehicle-borne suicide bomber.
But some of the other stuff that we saw was this idea of using one of these quadcopters for forward observing, for mortar fire or artillery fire.
And that was something that when we saw it as a SOF force, we were like, okay, we hadn't thought about using that in that way, or on that small of a scale. Like, we could really take this and every entity on the battlefield could be using this
for forward observing, we were able to take that and use it in a way that the enemy was using it
against us. Because from his videos, ISIS videos, you could see, okay, I see exactly how he's using
these things; we hadn't thought about that. Let's implement what he's doing and use it against him.
And then, as far as rate of innovation goes, we were able to, I think,
take that and out-innovate him. He had slowed down, but we basically thought of the idea like
we could use these things on a scale, this UAS, to sense everything on a battlefield.
We can use it to see, we can use it to sense the electromagnetic spectrum, we can take this and scale it down to a team or a SEAL platoon or a MARSOC unit of action, and we can use some of
the tricks that they taught us against them at a scale that they had not thought about.
In this conversation, we've talked about kind of the democratization of technology,
of lethal technologies to more and more non-state actors and individuals. And we've talked about
the role of technology in modern warfare. I'd like to ask both of you, and we can start with
you, Professor Cronin, what are the implications for policymakers and practitioners from this
conversation? Well, I would say that policymakers need to pay attention to the full range of
technologies and the full range of actors that are likely to use them.
I see a lot of focus that is being placed upon the competition with China over cutting-edge
artificial intelligence uses, for example, very important focus. But I think that the actors that
are most advanced right now in developing artificial intelligence are actors like Google.
And to the extent that we have different efforts that are underway, I think it actually undermines what
the United States is trying to accomplish. That is not to say that we're going to be able to
work hand in glove all the time with private companies, but to the extent that we cooperate,
I think that ultimately we're going to be able to shape new technologies in ways that
better serve the public interest and hopefully as well, ultimately the United States interest.
That's a difficult challenge and we're not paying nearly enough attention to it.
The other thing is that policymakers are not coming up with enough creative ideas for watching
what's happening on the low end of technologies, and thinking about the combinations
of technologies as we've been discussing. We need to get Congress much more up to speed on
what the downsides of various technologies are. Some of that is already happening,
but we're really at the very beginning. A lot of my students, frankly, are going on to work
as staffers and also at DOD and in other places where they're trying to make a
difference. Emerging technologies right now are looked at strictly from the commercial perspective.
I think we have to open our eyes more to the downsides of these commercial technologies so
as to be able to achieve their greatest potential without ending up in major disasters. What I most
worry about is that we're going to have an example like
the one that we were talking about before, where you have individual actors who are using facial
recognition technologies and explosives and drones and autonomy. We're going to have a disastrous
experience like that, and then we're going to react in a stupid way. So we have to think in
advance about how it is that we're going to both regulate
those technologies, track those technologies and private actors, and make smart policy in
the aftermath of when an inevitable disaster happens. I would offer for policymakers just
the idea of constant innovation on all fronts. And we talked about dual use earlier, but I think the idea of
investing in R&D, whether it's in a commercial government partnership or just on a government
side, I think is invaluable. The world is not going to slow down on the technological innovation
front. And I think to maintain your competitive advantage as the leader of the free world,
you need to invest in that side of your
portfolio. Let me ask you, General Roberson, and we can continue talking about this as well, but
you're kind of in a unique position in that you are training future special operations forces on
how to engage in the modern battlefield. When you look at technology's role in the modern
battlefield, this kind of diffusion of lethal technologies to lower and lower levels, what kind of implications do you have or recommendations do you have to the
practitioners out there, either from the tactical up to the strategic level on how to engage in
these environments? Yeah, I think as a practitioner or as advice to practitioners, what I tell people
and what I try to instill in the people that we train is this idea of kind of keeping an open
mind.
You know, we want to be critical thinkers and problem solvers. And I think a lot of that
revolves around the fact that what we train people is what they will do in combat, right? And if you train a
person very doctrinally, like this is the way you handle this problem. This is the technology that
you use to solve this problem. When that person is in combat, that is the technology he will go
to and he won't think about like, okay, let me step back a little bit. I've never seen this before.
I've never seen someone use that technology against me in this way. How would I look at
that and say, how do I defend myself against that? And then how do I use that against my adversary
and do it quick? When I talk about rate of innovation, part of it is like, how quickly
do you understand your current environment? How quickly can you understand that the situation is not like you thought it was going to be, and adjust? Usually,
it's not the person maybe who's best prepared at the beginning of a war. It's the person that can
adjust the quickest to the current situation. So we're trying to instill that in the people that
we're training here. That's the advice that I would give any of my fellow soldiers, airmen, Marines out there.
Let me also add one thing that speaks to the question of how to be well prepared for what's coming on the battlefield.
I do believe, in all modesty, that there is value in having a lot of education, in being omnivorous in what our warfighters read,
and in being exposed to a lot of different ideas. And potentially also,
I would even think it would be a good idea to have those who are in military careers be able to get out of the career and back in to even serve in a startup or to do work for a small
company that's building a technology. These things would actually serve our military.
We still have a military career system
that was very similar to what we had in the 19th century.
And it's been wonderful.
But I do think a little more flexibility
in how we train our officers and also our enlisted,
being a little bit more interested
in letting them have different kinds of experiences,
not necessarily always having to have
that straight career track. I think that would actually serve us in being a more agile military as well.
I couldn't agree more. We're trying to push the envelope with getting our non-commissioned
officers into graduate programs. We do believe that education is a hedge against uncertainty,
and the world that we are in is a very uncertain world, technologically, geopolitically, and in every other way. So yes, we are on board with that, Professor Cronin. Thank you.
Well, we agree 100% then.
We're in violent agreement about that.
Excellent. Excellent.
Is that difficult to implement and practice, General Roberson? I'm thinking about the demands of a career timeline that can often seem very regimented and then somebody wants to go off the beaten path. Do you think that we as a military can adapt realistically? I know we aspire
to, but do you think that those challenges can be overcome of kind of, I hate to use the term,
but a cookie cutter kind of career path? Well, Professor Cronin mentioned the idea of a startup
or being entrepreneurial. I look at Army SOF, and SOF in general, as a very entrepreneurial startup
type of entity within the military. We're usually trying out lots of different things. It could be
a new career path. It could be a new piece of equipment. It could be a different way of fighting.
So I think about the things that we're doing right now in this regard, like going to school,
the different career paths, where we have a public-private partnership to send a person
to a marketing agency to work at or to send a person to Silicon Valley to do
an internship. I think these are things that we're doing that maybe right now they're kind
of on the cutting edge, but I assume that they will be mainstream in the near future.
Major General Patrick Roberson, Professor Audrey Cronin, unfortunately, we're out of time. We
really appreciate you joining us today. It has been a great conversation on irregular warfare.
Kyle, Ben, especially Professor Cronin, thank you for letting me be a part. Wonderful book,
wonderful discussion. I look forward to more in the future. Thank you.
Well, I want to thank you because I learned a lot from our conversation and I really enjoyed it,
General Roberson.
I feel like we could talk another three hours and go way beyond what I wrote about in my book.
And I'd learn a lot from you, sir. So thank you. I really appreciate it.
Thank you again for joining us for Episode 50 of the Irregular Warfare Podcast.
We release a new episode every two weeks.
In our next episode, Shauna and a guest host, Audrey Alexander from the Combating Terrorism
Center, speak with Professor Dan Byman of Georgetown University and Nick Rasmussen,
the former director of the National Counterterrorism Center, about online extremism.
Following that, Laura and I will speak with Greg Poling of the Asia Maritime Transparency
Initiative at CSIS on maritime irregular warfare in the Indo-Pacific.
Be sure to subscribe to the Irregular Warfare podcast so you do not miss an episode.
The podcast is a product of the Irregular Warfare Initiative.
We are a team of volunteer practitioners and researchers dedicated to bridging the gap between scholars and practitioners to support the community
of irregular warfare professionals. You can follow and engage with us on Facebook, Twitter,
Instagram, YouTube, or LinkedIn. Subscribe to our monthly newsletter for access to our written
content, upcoming community events, and other resources. If you enjoyed today's episode,
please leave a comment and positive rating on Apple Podcasts or wherever you listen to the Irregular Warfare podcast.
And one last note, what you hear in this episode are the views of the participants and do not represent those of Princeton, West Point, or any agency of the U.S. government.
Thank you again, and we will see you next time.