Your Undivided Attention - The Dark Side Of Decentralization — with Audrey Kurth Cronin
Episode Date: March 10, 2022

Is decentralization inherently a good thing? These days, there's a lot of talk about decentralization. Decentralized social media platforms can allow us to own our own data. Decentralized cryptocurrencies can enable bank-free financial transactions. Decentralized 3D printing can allow us to fabricate anything we want.

But if the world lives on Bitcoin, we may not be able to sanction nation states like Russia when they invade sovereign nations. If 3D printing is decentralized, anyone can print their own weapons at home. Decentralization takes on new meaning when we're talking about decentralizing the capacity for catastrophic destruction.

This week on Your Undivided Attention, we explore the history of decentralized weaponry, how social media is effectively a new decentralized weapon, and how to wisely navigate these threats. Guiding us through this exploration is Audrey Kurth Cronin, one of the world's leading experts in security and terrorism. Audrey is a Distinguished Professor of International Security at American University, and the author of several books, most recently Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists.

Clarification: in the episode, Tristan refers to a video of Daniel Schmachtenberger's as "The Psychological Pitfalls of Working on Existential Risk." The correct name of the video is "Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign."

RECOMMENDED MEDIA

Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists
Audrey Kurth Cronin's latest book, which analyzes emerging technologies and devises a new framework for analyzing 21st century military innovation

Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign
Daniel Schmachtenberger's talk discussing the psychological pitfalls of working on existential risks and civilization redesign

Policy Reforms Toolkit
The Center for Humane Technology's toolkit for developing policies to protect the conditions that democracy needs to thrive: a comprehensively educated public and a citizenry that can check the power of market forces and bind predatory behavior

RECOMMENDED YUA EPISODES

23 – Digital Democracy is Within Reach with Audrey Tang: https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach

28 – Two Million Years in Two Hours: A Conversation with Yuval Noah Harari: https://www.humanetech.com/podcast/28-two-million-years-in-two-hours-a-conversation-with-yuval-noah-harari

45 – Is World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-here
Transcript
Is decentralization inherently a good thing?
It's a big trend now.
People are talking about Web 3 or having decentralized social media platforms
where you own your own data,
decentralized cryptocurrencies like Bitcoin,
or 3D printing, where people can fabricate anything they want at home.
But if the world lived on Bitcoin,
then you could no longer sanction countries like Russia
when they invade sovereign nations.
And if you decentralize 3D printing,
then anyone can print their own weapons at home.
Decentralization takes on new meaning when we're talking about decentralizing the
capacity for catastrophic destruction. I'm Tristan Harris, and this is Your Undivided
Attention, the podcast from the Center for Humane Technology. Today on the show, we're going to
explore the decentralization of dangerous technologies throughout history and how social media
might be a new decentralized weapon. Here to guide us in that exploration,
is Audrey Kurth Cronin, one of the world's leading experts on security and terrorism.
Audrey is the distinguished professor of international security at American University
and the author of several books. Most recently, Power to the People. How open technological
innovation is arming tomorrow's terrorists.
Audrey Kurth Cronin, thank you so much for coming on Your Undivided Attention.
It's an honor to be here, Tristan.
So you have written this terrific book called Power to the People, which is kind of a history of how new technology got distributed into the hands of more and more people, and the lessons we can draw from history about where we might have created or democratized technologies that we thought were beneficial or benevolent,
and later found out might be dangerous.
As we get into that, I think our listeners are obviously aware of how we talk about social media and its impacts on society,
but you're going to provide kind of a much broader view.
I just wanted to ask you as a starting point, what inspired you to write this book?
Well, Tristan, I wanted to write about how there's a decentralization of power
that is not just a matter of the heavy focus on counterterrorism that we've had in the United States for the last 20 years,
but it's also a broader phenomenon that reflects developments in technology,
not just social media, but also developments in other digitally-enabled technologies.
So what I was trying to do was trace how individuals and small groups have used new technologies
that were intended for good, how they've diffused, and then how oftentimes they're used
for very negative reasons in very dangerous ways, violent ways.
I wanted to trace that process in a much more rigorous way rather than just assume
that everybody understands exactly how we ended up with terrorists, insurgents, and other types
of violent small groups.
The other thing I wanted to do was to provide an alternative to the huge focus on states
and how they're using various types of technology, especially AI, but everything from drones
also to the communication suites, the surveillance suites that are becoming so attractive
to autocracies.
There's such a focus on states that I think this centralization of power is never put in juxtaposition to the decentralization of power that's also going on.
And I think the two things are intersecting.
So I was trying to focus on the decentralization of power to make a bigger point about where we are overall in the world.
So I think a huge parallel between your work and ours is that it can often sound really depressing.
We might talk about the history of weapons or biotechnology or terrorism
or synthetic biology. I mean, this can become a very dark conversation. So just to maybe say
up front that, you know, however dark this conversation may get, I think we both share a deep
motivation for getting to a positive future and figuring out what are the criteria it would
take to get there. Yes. And I also think that there are some practical things that we can do.
When we fully understand the dangers, we can mitigate them. Right. So why don't we do the fun thing
and start talking about some of those dangers,
but maybe do so through a lens of history.
In your book, Power to the People,
you go through the history of new technologies
as they emerged and how we thought about them.
I think the trend that you're tracking throughout your book
is a history of where individual people, through tinkering, created powerful new technologies.
And I think Nobel's story is actually really fascinating.
I'd love for you to talk a little bit more about it
because there's a link between the invention of dynamite
and later the Nobel Peace Prize.
But I think there are just fascinating aspects of how he had to come up with this technology
that at the time he thought was going to be incredibly beneficial.
And I'd love for people to kind of situate themselves there
because I think we find ourselves in a similar situation today
where we have thought of social media as a totally beneficial thing.
And I think later we can discover some parallels in how things turned dangerous.
Exactly.
So Alfred Nobel was trying to develop something that would overcome the
disadvantages of gunpowder.
Before the development of dynamite, you had to use gunpowder to build any kind of
infrastructure, like to get through a mountain.
You would use little plugs of gunpowder that you would stick into a hole, and they
might move the work ahead by, you know, six inches a day.
And people died, all those workers; there were accidents where the gunpowder would
come blowing up in their faces.
So therefore, you know, you had to have people who were using their life energy in order
to try to move those huge boulders and those rocks afterwards.
You had to use pickaxes.
I mean, I don't want to belabor the point,
but it was an extremely hard way to build anything.
And Alfred Nobel came from a family
that had just declared bankruptcy at the time that he was born.
So he was born into desperate poverty,
living in Sweden.
And his father also worked for the Russian Tsar,
building sea mines and various other kinds of military weaponry.
Through his hard work in their backyard shed,
Nobel figured out a way to put nitroglycerin, this unstable chemical, into a stabilizing medium.
It made the nitroglycerin stable enough that you could stick it in a little tube and then put a fuse on it and have this powerful high explosive that could now pulverize rocks.
And that's where we get the iconic image of dynamite, these tubes with the little fuse at the end.
So everyone's familiar with that and has seen the Wile E. Coyote cartoons, you know, the way
that this was depicted. The reason why Alfred Nobel endowed the Peace Prize was that he had this
horrifying realization that what he had invented was potentially very destabilizing for the world.
Now, he died before the First World War broke out, but his very close friend was the most prominent
Austrian peace activist, Countess Bertha von Suttner. And he tried to support her work,
and then ultimately he left all this money to endow the Nobel Prizes
and particularly the Nobel Peace Prize.
And that's not by accident.
I think he had a deep sense of guilt.
That's actually fascinating.
What do you think had him wake up to that?
And that's before he actually saw the catastrophic use in two major world wars.
There's actually a parallel to this in the modern-day tech industry,
where founders receive this story of themselves: they see all these people benefiting,
they have this enormous scale, wealth accrues to them.
They're the guy or the girl that invented
some powerful world-changing technology, and that can't not go to your head.
There's a positive sense of affirmation.
And generally speaking, when there's negative consequences,
people who are in that position tend to want to suppress that or look away from that
because, of course, there's going to be some negative consequences.
That's unavoidable, and we can say that openly.
But then there's this really interesting thing where you're speaking about a kind of a reckoning,
a moment of waking up to the downsides being really catastrophic.
Do you know what was his process?
He saw large numbers of casualties, many of whom were innocent civilians, particularly as the decades went by, and leaders had more and more protection.
And he began to feel deep angst about it. He didn't talk about it publicly, but I believe that is why he endowed the Nobel Prizes.
He certainly wrote about it because he couldn't face the horror of the downsides of his own invention.
He thought perhaps his factories would convince people of how horrible a future war would be,
and that would be a deterrent to ever having that kind of a war.
But Alfred Nobel watched all of that unfold, and he was horrified.
I think people just don't realize just how accessible dynamite was
and just how much even the manufacturers of dynamite were clamoring to be regulated so that we could get out of this.
You write in your book:
"The technological enthusiasm and hobbyism of today
is strikingly similar to that at the turn of the 19th century,
and it's had a comparable degree of willful blindness about the risks."
In 1903, the managers of a big New York dynamite manufacturing company
begged publicly in the New York Times for new laws to abide by.
They said, quote,
it's one of the easiest things in the world to buy dynamite
enough in this city to blow up half of lower Broadway.
A total stranger could go into a dynamite powder company in the city
and buy all the dynamite he had money to pay for,
and not a question would be asked
as to what use the explosives were to be put.
I have often talked to other powder men
about selling explosives to everyone willing to buy,
whether he'd be able to give a satisfactory account of himself or not,
but the law is at fault, not the powderman.
Give us a law which we all must obey,
and we shall be only too willing to follow it.
And in reading those words,
I often think that this is what I wish,
you know, that heads of TikTok and Twitter and Facebook
would just say, because they are handing out memetic dynamite. They are handing out division dynamite,
dynamite that basically divides society daily. And they should be clamoring to be regulated so that we can
all change these business models. That's exactly right. And in fact, the people that were most
worried about it were the ones they called the powder men, the people who actually sold dynamite
in their little shops, because they were afraid that they would be responsible for helping to
lead to the deaths of a lot of innocent people. I see those parallels with what you're saying about
today's major tech companies, because with that kind of power, you also have tremendous
responsibility, and I don't see much of that responsibility being shouldered.
Well, completely. And in this case, the purpose of that technolistan, you know, the big five tech
companies grouped together as one giant, you know, super nation state, is to maximize shareholder
value. And so many of our current, you know, dangerous weapons are tied with the narrative we have
about nuclear weapons, where they're obviously so horrible and horrific that we would never use
them. It really makes me wonder, can you imagine a world where Zuckerberg endowed the Zuckerberg
Prize? And the Zuckerberg Prize worked to basically give away 99% of his wealth, which, if I'm not
mistaken, he's committed to doing anyway, but do that immediately to kind of rebuild the social
fabric that has been grossly eroded through the, you know, inadvertent consequence of the
business models of the thing that he created. I think that a Zuckerberg or any of the leaders of
our technolistan (that's new to me, I like that), any of those leaders, I still have some hope
that they're going to find some conscience. I want to zoom out for a moment, because if we do
broad situation awareness of where we are in the year 2022. We have an acceleration of these
new capacities, these technological capacities we've never had before. I can buy drones from
Amazon. I can download swarming algorithms off of GitHub and use those drones in various ways.
I can combine those drones with facial recognition. I mean, you can combine all these technologies
in ways that we haven't even thought about, and then deploy them.
And that's kind of what I think brings us to this conversation today: a shared recognition
that technology is moving faster than our society's ability to appraise it.
And as we invent all these new technologies and we do it at increasing speed and anyone can
combine them, I think the premise of your book is this kind of question:
how do we manage this accelerating deployment of technologies which may not, again, just like
dynamite, look obviously harmful or dangerous, but can certainly, and very obviously,
be repurposed to be very dangerous? That is the meta problem. And we've actually
described this in our work as the situation we find ourselves in is kind of like a bowling alley.
And we've got these two gutters on both sides. One gutter we call catastrophes.
Catastrophes are the decentralized capacity for anyone to cause exponential damage. I mean,
by the way, a meme could be a catastrophe. Take the fact that on TikTok, someone could post, as they did
recently, a suggestion that December 17th would be national shoot-up-your-school day.
So let's say I'm Russia, right?
And I want to spread that meme, that this could be national shoot-up your school day.
It's just a rumor.
But I can post that on TikTok and it has a certain enragement, incitement, inflammatory capacity
that all these kids are going to say, is this real?
Should I not go to school?
And this is exactly what happened.
My understanding is that actually no shooting did happen, but if I'm Russia, I win in both cases.
Either a shooting does happen, and I've created stochastic terrorism, or nothing happens and I've still created mass panic.
So to make this example concrete in the two gutters analogy: the ability for smaller and smaller actors to deploy an exponential consequence, whether that is, you know, millions and millions of people being aware of national shoot-up-your-school day, even though that doesn't exist, or the ability for anyone to build a drone and point it at, you know, sensitive targets, or for anybody to build, you know, viruses in their basement. These are all catastrophes. That's one of the bad scenarios
we don't want to happen. In the other gutter, we have dystopias. Dystopias are surveillance states
that are basically monitoring the use of all of these dangerous technologies. Because, of course,
you're going to need to have regulations. You're going to need to say, well, maybe we should
make sure people don't post these rumors about national shoot-up your school day. Or we should make
sure that we regulate who can get a gene compiler on their desktop computer or who can access
CRISPR. Or maybe we should regulate who can buy drones. But again, that
leaves you more towards an authoritarian, draconian, dystopic society where everything is monitored
and surveilled, and that's more of the China model. So each of those gutters is getting bigger,
I think it's important to say. On the catastrophe side, we have more and more technologies
that make up decentralized capacities to create a catastrophe or chaos. And on the other side,
governments that are able to, with surveillance at scale, take up and monitor more and more space.
And so we've articulated on this podcast in the past
that we're looking to sort of find this thin little runway in the bowling alley:
a digital open society that recognizes the acceleration of these
decentralized tech capacities that are being built into society
and put into more and more hands.
And on the other side, making sure we don't create a kind of closed or authoritarian
or surveillance state.
And that is kind of one of the master problem statements that we all have to answer.
And I know that what brings you and me both to this conversation is not that we are naively optimistic,
but the premise of it is we've got to find the answer to that question.
So I'm just curious to first lay that out there and let you respond to it,
and then we can dig into whatever relevant stories from your book might be able to shed some light.
Sure.
Well, I completely agree with that way of laying out the two gutters.
And one of the things I was very worried about in writing this book,
and actually before I began to write it, is this:
I don't believe in being an alarmist.
I've been studying counterterrorism for decades,
and I've always tried to find the way out
or the solution to the problem or the more strategic perspective.
And yet there's the chaotic side,
the gutter that is all about how these technologies are used,
and clusters of technologies in particular.
So how do you talk about that
without having everybody go immediately to the other gutter,
which is about putting in place a surveillance society?
you know, I don't want to be justifying here moving from one gutter to the other gutter.
That's not what the story is.
It is about how to find the middle way.
The Romans would call it the via media.
This is not a new problem.
We need to find that moderation, that new way of governing.
And there's this intersection between mobilization through social media and other forms of communication,
increased reach, where we can talk about quadcopters and other types of projections of force,
and then finally systems integration where you can use AI and other kinds of tools
to give that kind of power, unprecedented power, to small groups.
Let's actually dig into that because I think that might sound like abstract topics for folks.
I'd love to just break down exactly what you mean.
So could you explain what you mean by mobilization and reach and integration?
Sure.
Well, mobilization was one of the things that caused the nation state to come into being.
I mean, you had with the French Revolution the levée en masse,
which was basically conscription,
which enabled Napoleon's armies
to field huge numbers of people on the battlefield.
And that was the beginning of conscription
and the extremely powerful numbers
that were in the professional armies
that went to war with each other
in the First and Second World War.
So mobilization was at the very heart of power for states.
What I'm arguing now is that mobilization
is no longer under the control of states or armies.
It's now under the control of individuals or sometimes nefarious actors.
Sometimes we can mobilize for good things.
If it's a cause that you support, you're bringing attention to abuse of power, police killings.
There are lots of ways that mobilization is bringing positive effects, but it's also bringing people to the battlefield.
Not just ISIS.
ISIS is the example that everyone knows best.
That was the most dramatic example.
But we also see mobilization of people to carry out violence individually.
So you have the Christchurch shootings where that perpetrator was mobilized by what he had read on the internet.
And then he used live streaming to mobilize others to follow him.
And you can see a contagion effect of many other people copying what people who have sent their messages out do.
And we have an increase in violence,
an increase in anger and increase in the ideas that oftentimes lead to that violence.
That's what I mean by mobilization.
And social media, of course, is probably the key factor in all of that.
So what I hear you saying is, I could have a state that conscripts an army,
and that means I can move 100,000 people from this territory to that territory, and I'm Napoleon.
That's previous era.
New era is I'm an ISIS terrorist in Afghanistan or Iraq,
and I broadcast on Telegram channels, and I can get 5,000 people to move from Europe to
Syria or something like that through
propaganda. So that's that first piece
of mobilization. So then the second piece of reach?
Reach is about the ability
to project lethal
force. So you can attach
an explosive to a quadcopter
and send it over and
into the governor's compound
in Kunduz in Afghanistan, which happened
about a year and a half ago.
You can use explosives to have
a kind of a leveraged effect. Remember
that having an effect
is not just having large
numbers of people on a battlefield. It's also giving power to people who can have a political
effect that's very targeted, and then they spread that political effect through the first
factor, which is mobilization. So there's the ability to use reach through facial recognition technology,
through algorithms, through AI, through capabilities that previously might only have been usable
by an army. That's what we're seeing. And we're still fairly early in that.
That's the key thing that I got when reading this part of your book, which is, yeah,
I mean, obviously, an army could move projectiles from one place to another or have a missile that can shoot thousands of kilometers or something like that.
But increasingly, we're decentralizing those capacities.
So more and more people have access to be able to, with a drone, move something through time and space.
And then the last thing you were talking about was autonomy and integration with direction?
Yes. Well, as is well known, it's very difficult to integrate all of these technologies.
And it used to be that you had to have a National Command Center in order to do
that. But now, because of the ability to download algorithms, the ability to integrate through
AI techniques, you can actually have very complicated systems that are working well with each
other. You don't have to rely on advanced, highly trained human beings and the strong structures
that states had before now. And now you've got, you know, the Houthis in Yemen, and you've got lots
of non-state groups that are increasingly powerful, and it's the leverage between these two.
It's not that they can go toe to toe with the U.S. military, but they don't have to.
What's better is to operate under the radar below the level of physical armed response
and have a political impact that hollows things out from within.
Yeah, completely. I mean, 9/11 killed, what, 3,000-something people on that day,
but then that event was used as propaganda to recruit many more people,
to spread that message and target it into other groups that could be radicalized,
and to recruit more people into bin Laden's, you know, Al-Qaeda and ISIS networks and so on.
So it's not just the number of people that are killed.
It's the ability to enact power, to use asymmetries of power in a new way.
Just to provide another parallel to the framing you've brought up.
And we've had a previous guest on our podcast, Daniel Schmachtenberger,
who's talked about how the situation that we're in
is that we have the power of gods,
but unless we have the love, prudence, and wisdom of gods,
that power that we're wielding is dangerous.
And nowhere is it written that you can democratize
these kinds of godlike powers into everyone's hands
and have our species not self-terminate in some way.
And I say this again, not to be an alarmist,
but just to frame, we have to have some vision
about how do we wield these more decentralized catastrophic capacities?
And what I loved about your story with dynamite in your book
is we actually have an example of something that was seen as initially positive, beneficial,
and then we slowly realized it was dangerous,
and then we actually had a response.
We had a regulatory response.
We had a media response.
And I just wanted to first frame all that for you and see if you can react to it.
And then maybe we can talk about, I think, the optimistic case of, you know,
how did we take this thing that was everywhere,
dynamite available to everyone, and get to a world where we started wielding it
with the love, prudence, and wisdom of gods.
Maybe not all the way, but certainly better than it was before.
I love that phrase,
the love, prudence, and wisdom of gods. Yes. Well, the response to dynamite does tell us a little bit about
where we could find a little bit of that wisdom. The Europeans relied heavily on regulation,
as Europeans still tend to do, and they passed various laws through all the capitals in
Europe, and ultimately they tamped down the wave of dynamitings that had been very, very violent
in the 19th century, and by the time the turn of the century came, things had gotten better
in Europe. But in the United States, we were way behind, and in our usual way, we focused not on
regulation, but there was a tendency to pass laws against immigrants, because the belief was that
it must be the immigrants who are causing all the bad things to happen, even though the truth of
the matter was that those people who were carrying out terrorist attacks actually were living
in the United States and were long-term residents or even citizens of the United States when they
did that. So we started with immigration laws and then major police crackdowns. And ultimately,
one of the key elements in the answer was that the railroads stepped in, because they got tired
of having their railroad cars blow up when there was no regulation of the dynamite that was
being transported by them. This is actually so fascinating as a parallel because what I hear you saying
is, okay, so we've got dynamite and we realize it's a problem. Actually, Europe realizes it's a problem first.
This may be paralleling some of our history. And then in the U.S., we kind of misdiagnosed the problem
and we turned it into xenophobia when, in fact, it was more about how to regulate the technology.
And then what I just heard you say is how business stepped in: the railroads,
which were the infrastructure on top of which dynamite was transported, were suffering too many accidents.
And I just want to anchor that for listeners because I think, you know, if you just make that comparison,
it's possible that we could be living in a moment just like that,
where we actually need some new institutions,
but we don't have them yet,
and dynamite is a good example of having made a transition
where we did what we needed to do.
Yes, I believe that very strongly.
And, you know, I used to work on Capitol Hill,
and I'm accustomed to advising members of Congress.
And one of the things that I hear from Silicon Valley friends
and just people who come and testify before Congress
is a lot of frustration about the level of ignorance
and some might even say they're concerned
that it's incompetence,
an inability to regulate technology companies
in any effective way.
And I understand where that frustration comes from,
for many different reasons.
Some people point to the generational differences,
some people point to political interests.
There are lots and lots of reasons,
but they definitely don't understand technology well enough.
But at the same time, those who understand the technology have some responsibility to teach them, to mitigate the downsides of their own technologies, to build in protections at the development phase of new technologies.
I don't see Congress and our tech companies as being at odds. If we have any hope in the future of building effective government institutions, we need to be able to have both of them being honest about what the downsides are and being
smart about them. Right now, I would say, you know, Zuckerberg has a fiduciary duty such that whenever there
is research showing that, let's say, they cause genocides in certain countries or something like that,
it'd be much easier for them to shut down those research departments and not look. And also,
when they're competing game theoretically against other platforms, and if they're the platform that
tries to play nice and do research about where their platforms might be causing polarization or genocide
or something like that, they're just going to lose
to the platforms that don't do that.
And so this is a classic multipolar trap,
a tragedy-of-the-commons situation,
where we need some third actor, like government,
to basically step in and create a requirement,
binding by law, that everyone has to do that.
And we could easily imagine a world
in which the Apple App Store and Google Play
just basically require that apps that are of a certain size,
that basically become the primary communication
or information basis for a democracy,
have to do that research.
I mean, that's doing it through the Play Store or App Store model.
You could obviously do it through government as well.
And there could be an externalities fund, you know,
where when you start a company,
you have to allocate a portion of your stock, essentially,
to forecasting possible ways that things could go wrong,
and you have to actually allocate resources into that.
And also a cleanup fund:
knowing that there are going to be problems,
you have to allocate a portion of your stock to do that.
So that now there's a skin-in-the-game process.
There's an actual stakeholder
in your cap table as a company,
resources that are devoted
to anticipating negative consequences
and also ameliorating them when they show up
as opposed to if I did that
I would be increasing my costs as a company
and if I'm Facebook and I do that
and TikTok doesn't do that, then I'm just losing.
These are the kinds of things that could happen
that are not about speech
but about design choices that are unsafe or safe.
Yes, and I think that
we have to get to the point
where tech companies understand their responsibilities
and their dedication to the public interest
becomes more important than their dedication to profit.
And I hate to say that because it sounds very idealistic,
but I'm not sure how else we can get out of this situation.
In your book, you actually do talk about
what happened with the end of dynamitings in the U.S.
and what coincided with that.
And I think maybe one thing we haven't really talked about
is the relationship between political and social upheaval, political discontent,
social discontent, and the use of these technologies
in more dangerous ways, because I think, not to be more dystopic, but we're in a similar
moment where there's a lot of political and social upheaval. And, you know, in that
time, in the early industrial era, there was exploitation of a labor class and a major transition from
agrarian to industrial societies. In that transition, there's more social upheaval, and that
social upheaval can translate into uses of some of those technologies. I think that might have a
parallel lesson for where we are today. Do you want to speak to any of that? Yes, it does.
There was a tremendous amount of worker discontent. There was a crushing of the unions, particularly
in the United States. There was a huge increase in inequality towards the end of the 19th century.
There was a huge impact upon individual lives, where people had to move into factories and then
were highly abused or overworked. There are parallels to how our workers are having to adjust to
the impact of digital technologies today. And there's this opportunity to use a lot of accessible
technologies in the same way that there was the opportunity to use dynamite. But ultimately,
you had, particularly in the United States, you had FDR come in, and first you had to go through
the Depression, of course, but hopefully we don't have one of those coming. But ultimately,
you had a rebalancing of what the society valued and also what it looked like and how
stable it could be domestically. And you had the civil rights era. These things followed on
from the abuses of the latter part of the 19th century. And I think we may be going through a
similar period. But my hope would be that we would get to some of the solutions without some of
the violence that occurred then. That's my hope too. And in that spirit, we're going to take a quick
interlude with our executive producer, Stephanie Lepp, and then go back to my conversation with
Audrey.
Hey, Stephanie.
Hi, Tristan.
So you have been talking with Audrey about this phenomenon of, you know, the decentralization
of the capacity for catastrophic destruction.
And so first and foremost, just to make it super concrete for our listeners, what kinds
of, let's say, tools are becoming available to civilians that can be used for catastrophic
destruction.
Yeah, and just to ground that, they don't all have to be catastrophic in and of themselves,
but their use could lead to catastrophe.
Like, if you think about TikTok, you don't think about a meme going viral in less than
an hour to 100 million people.
That doesn't sound like something that's catastrophic
or dangerous. But if I'm Russia and I want to, the day before I invade Ukraine, create a meme that goes
viral saying that, you know, nuclear weapons are about to go off in your hometown. And of course
that's going to go viral. I now have the perfect means to do that. And whether it's amplifying the
trucker convoy in Western countries at the time that I'm invading Ukraine or I want to amplify
the National Shooter Day meme, it's never been easier to do that.
So that's a kind of contagious communications. That's a new capacity that's in anyone's
hands. Another example is hacking tools. It used to be that you had to be an expert in cyber hacking
to hack into infrastructure. But thanks to the fact that the NSA's hacking tools were leaked,
now many different actors, even just hackers in their basement, can go into online
forums and download these hacking tools. It used to be really hard to build a drone. I mean,
think about the early 2000s, when the U.S. military
had these drone weapons that it could use.
Now anyone can buy a DJI Phantom drone off of Amazon
and get swarming algorithms from MIT off of GitHub for free
and then start combining those things.
Or even facial recognition.
You know, it used to be something that was expensive
that only a handful of AI departments
in maybe governments or universities had access to.
And now facial recognition technology is available to everyone.
What's really worrying is the way that you can combine
these different technologies together.
You know, you could combine drones
and facial recognition and explosives
and now you have a pretty dangerous weapon.
So we're just scratching the surface
on the trend line,
which is that there's going to be increasing technology
democratized into more and more people's hands.
We, you know, the public,
likes to focus on the positive aspects of that.
Well, isn't it great that now anybody can get a drone
and use it for planting trees at scale,
because you can have drones and AI
that's scanning the ground to plant trees faster and faster.
But the problem is really looking at this negative dark side:
frankly, the more anybody has access to these tools,
the more anyone can cause real danger.
Yeah, well, let's talk about that, the danger, the dark side explicitly.
So this decentralization of the capacity for catastrophic destruction is dark.
And I think one thing that our listeners might be wondering is,
how do you hold it? I mean, and specifically for you, Tristan, I mean, how do you hold this?
How do you, even if we just think about it in terms of how do you sleep, you know, with this awareness?
To be honest, it's not always easy. You know, I remember reading Audrey's book late at night
before doing the interview with her, and it's not good bedtime reading.
It's just like, you know, listening to Paul Hawken, who is one of the authors of Drawdown,
the book about climate change.
And he has a rule that says
don't read climate news after 4 p.m.
I just want to say to listeners
that this material isn't easy, right?
It's very challenging to wake up
and look at this stuff.
In fact, I remember meeting some people
in Washington, D.C.,
who knew some of the clinical psychologists
who work with the Joint Chiefs of Staff.
And these are the people who know about
all of the things that are going wrong in the world.
And it used to be that only a handful
of people at the tops of institutions
had to hold the weight
of some of these ominous futures.
But in a world of decentralized catastrophe,
now more of us are aware of them
and more of us are holding them as individuals.
You know, we're not military leaders.
We're just regular civilians, parents, teachers,
listeners of this podcast.
One thing I might recommend is
Daniel Schmachtenberger actually has a great video online
called "The Psychological Pitfalls
of Working on Existential Risk."
And I think it's a good video on what attracts people.
Like, where are there problematic or perverse ways that we get attracted to this material?
You know, for people who are attracted to catastrophes or thinking about this stuff all day,
I don't think it's a good thing for our health.
This stuff is real.
It's not hypothetical.
And the reason that I think we want to raise more awareness about it, the premise, is that we want to actually mobilize our institutions and mobilize companies to act
and to help protect against these things.
And there are ways of doing that.
I believe, and I think Audrey talks about this in her book,
that drones are starting to be required to have software
that basically says you can't fly near these sensitive areas.
So just when the drones start moving in that direction,
they just don't fly.
Now, again, there's going to be more ways
that people hack around some of these limitations,
but we need to start thinking about protecting
against these decentralized capacities.
So there is actually an existentially hopeful
implication of decentralized catastrophe. And that is that we can't leave anyone behind, right?
We increasingly cannot alienate anyone because within the context of decentralized catastrophic
capacity, their alienation can become an existential threat. And so, yeah, I would love for you
to talk about that and what it means for our national security strategy.
Yeah, well, as you said, in a world where anyone can take their pain, their psychological pain and suffering, and translate it into catastrophes that affect the whole world or the whole planet, the way out of that is we have to care about everyone.
We have to have no shut-ins. We have to have people caring for each other.
We have to have a system, an economy, a political system, in which people feel agency, a positive-sum world.
And this is actually why Daniel Schmachtenberger in his work
talks about how we cannot play a win-lose game.
We have to create a new win-win game
that creates omni-win-win outcomes
for as many people as possible.
Can we build systems in which everybody is winning,
has dignity, has basic respect, has love, has compassion?
Well, and not leaving anyone behind
doesn't mean no consequences, right?
The way that I think about it is we don't have to choose
between consequences and compassion, right? Just because we can't alienate anyone doesn't mean
they shouldn't face the consequences appropriate to their actions, be those consequences
jail time or whatever they might be. I mean, people should endure the consequences appropriate
to their actions and also have the opportunity to learn and change and grow, you know, for their
sake and all of ours. And I guess what this implies to me is a reorientation
of our national security strategy or our defense strategy. You know, if the power of gods without
the prudence of gods leads to self-termination and we have the power of gods, then, you know,
the Department of Defense should really be focused on helping us develop the prudence of gods.
You know, otherwise, if you don't do that, again, you're left with swinging the other way on the
pendulum towards creating some kind of master totalitarian surveillance state in which every action on
every computer is monitored. If it's that desktop gene compiler that lets you build the next pandemic virus,
you know, from your computer, then you have to have ubiquitous surveillance on everyone's
machines. So we have to choose. Like, what does it look like? And I think a society that's in
between is a society of sousveillance instead of surveillance. Surveillance is watching from above.
Sousveillance is watching from within communities and from the side. And the way that we existed
in tribes is we had people caring for each other. So in a tribe of only 150 people, you would know
when someone looks like they're feeling a little bit estranged from their community
or they're not doing very well.
I think each of us walking out of this episode can say,
what does it look like to care for the people that are in our own lives
who are not doing so well?
In the spirit of optimism,
I also thought it would be great for you to cover
what happened in the way that the media covered dynamite.
The media actually played a distinct role in how we got past
the contagious spread of the idea that people should be using dynamite to create bombings everywhere.
The media had to go through a process of really upgrading and coming up with a standard for media
professional ethics. And a lot of that had to do with social contagion theory. So can you talk a little bit
about the media's upgrade process? I think that's going to be key for how we deal with
more of these decentralized dangerous capacities in more and more hands, which we're going to need
to be responsible about communicating about if and when they're used. Yes, I'd be happy to. The media
at the turn of the century were making enormous amounts of money.
So you had the Pulitzer papers, the Hearst papers.
You had papers that sold for five cents each that made more money than they'd ever seen before in all of the history of newspapers.
So there was this huge drive.
It was a machine: whenever there was an attack, a dynamiting attack, to have explosive coverage, pun intended,
to have extremely sensationalized coverage of that attack,
and then to sell newspapers on the corner by the millions.
And it was a great business model.
And many of the great media empires were built that way.
But ultimately they came to realize that they had a responsibility
for the fact that these attacks were spreading.
And that responsibility ultimately led the better papers,
not all papers, but papers like the New York Times
to develop editorial standards.
And the professionalization of the media is dated to that period.
They began to realize that they were part of the problem.
And they began to institute much stricter rules about exactly what would be reported, how, what the facts were.
There were a number of different editorial standards that were put in place.
And this helped to end the wave of dynamitings internationally.
I think what you just said is so crucial because I agree.
I actually don't think it would have been possible to end the wave of dynamitings if you had a perpetually
viral, sensationalized news media that profited directly from trying to get as many people
to be afraid of it every time it happened.
You know, social contagion theory has had, you know, a really prominent role in our history
of violence.
In 1970, researchers noticed, and this is from your book, an
increase in violent crime following well-publicized assassinations, such as the 1963
killing of John F. Kennedy, which led to more murders in the United States.
Other research found that widely publicized bombings and kidnappings inspired copycat attacks.
And careful analysis of aircraft hijackings between 1968 and 1972
also demonstrated a clear relationship between successful aircraft hijackings and subsequent attempts in the United States.
In 1972, the total number of aircraft hijacking attempts of foreign-boarded planes peaked at more than 60,
and researchers found that the best predictor for a hijacking was a prior hijacking that was well-publicized and broadly
perceived to be successful.
So this is really critical stuff when you understand how powerful and contagious
certain ideas are when you can discover, say, a new way to sow panic.
You know, we mentioned TikTok before.
Another example with TikTok is this Devious Licks challenge, which was basically a TikTok meme,
a video where people actually showed how they destroyed their high school bathroom.
So it's literally, you walk into the bathroom and there's a video of someone who put
pee in the soap dispenser, and they flushed, you know, McDonald's Big Macs down the toilet,
and they just made a wreck of the bathroom. And this is awful. But TikTok is making this stuff
go viral. I mean, that's what the business model is, is let's make this stuff go viral. But I think
of this as: we've already laid out the dynamite lines. We've already laid out the fuse,
so that any time someone wants to launch a dangerous meme like this, we make these things go
as contagious as possible. And that's why I think social contagion theory is so fundamental
to how we're going to have to deal with these problems.
Now, the question I have, and I'm really posing it, I think, fundamentally,
as the situation we find ourselves in, and why I find social media to be so dangerous and catastrophic,
is that now, instead of having a small number of newspapers who could adopt that
responsibility framework, that could be institutional, that people had to have certain
degrees or certain education and certain training, ethics training, to become part of those
institutions, whether it was, you know, major newspapers like the New York Times or, you know,
Hearst or whatever it was, now three billion people are individual newspapers,
or you have 15-year-olds who are walking MTV stations, where you have influencers who are
20 years old, who have 10 million, 20 million followers and are telling people what they should
think about COVID, that to me is the structural problem that's going to prevent a media
responsibility framework from taking place, if we have this decentralized broadcasting world
of social media and we don't come up with some
responsibility framework where, as you gain more power, you gain more responsibility. Like, if you have less than a million people that you're reaching, you might have a different responsibility framework than if you're reaching more. But we now have individuals, many individuals, who are reaching more people than the newspapers in the 1800s reached.
I think that's a wonderful idea. If you could have required ethics training and required professionalism training after you have a certain number of followers, I think that would be a great idea.
I think that when you view it that way more systemically, it's like we've given out these godlike powers
to everyone without pairing them with that responsibility. And, you know, in our country, we know that
we have to pair rights with responsibilities. That is the key framework. It's rights decoupled from
responsibility that I think is the philosophical error that we have made in decentralizing these
capacities. We have to recognize, again, the twin gutters of, on the one hand, you take your
hand off the steering wheel and you get decentralized catastrophes and chaos everywhere because you're
not putting any controls on it. There are no licenses for anything. On the other side, you get the
dystopias of overreach and power and licensing for everything and dystopian abuse by
governments. And we don't want either of those outcomes. And I just want to say that I really appreciate
how illuminating your book and your work have been at, I think, framing this for people and giving
them a much deeper historical perspective for why these issues are so urgent. And I just want to
thank you so much for coming on the podcast. Thank you, Tristan. I've really enjoyed our conversation.
Audrey Kurth Cronin is a global expert in security and terrorism.
She's served in the Office of the Secretary of Defense for Policy
and is a life member of the Council on Foreign Relations.
And today, Audrey is a distinguished professor of international security
at American University in Washington, D.C.,
and the author of several books,
most recently, the subject of our conversation today,
Power to the People:
How Open Technological Innovation is Arming Tomorrow's Terrorists.
Your Undivided Attention is produced by the Center for Humane Technology,
a nonprofit organization working to catalyze a humane
future. Our executive producer is Stephanie Lepp. Our senior producer is Julia Scott. Engineering
on this episode by Jeff Sudakin. Original music and sound design by Ryan and Hays Holladay. And a special
thanks to the whole Center for Humane Technology team for making this podcast possible. You can find
show notes, transcripts, and much more at HumaneTech.com. A very special thanks goes to our
generous lead supporters, including the Omidyar Network, Craig Newmark Philanthropies, and the Evolve
Foundation, among many others. And if you made it all the way here, let me just give one more
thank you to you for giving us your undivided attention.
