Angry Planet - What 'War by Other Means' Means Now
Episode Date: October 15, 2018
Taylor Swift and Islamic State are in a battle for our hearts, minds and eyeballs. Russia wants your vote, or for you not to vote at all. And if you think the amount of false information out there online is dangerous now, just wait. Artificial intelligence is about to make fake news virtually indistinguishable from the real thing. Peter W. Singer, author of the new book "LikeWar: The Weaponization of Social Media," takes us beyond the troll farms and into some even creepier territory.
Support this show: http://supporter.acast.com/warcollege.
Transcript
Love this podcast?
Support this show through the ACAST supporter feature.
It's up to you how much you give, and there's no regular commitment.
Just click the link in the show description to support now.
You would speak to anyone in the military and they would say,
all the tactical victories we achieve on the battlefield won't matter if we don't get the messaging campaign right.
And very soon in the discussion, I realized they don't get it.
You're listening to War College,
a weekly podcast that brings you the stories from behind the front lines.
Here are your hosts, Matthew Gault and Jason Fields.
Hello and welcome to War College. I'm Matthew Gault.
And I'm Jason Fields.
Social media has changed everything, including war.
Groups such as the Islamic State and white nationalists now have access to an unprecedented propaganda machine.
Terrorists live stream their attacks and use Twitter to hunt down victims.
Disinformation is stickier and more viral than ever.
Every day and every way, the future looks less like the Jetsons and more like Black Mirror.
Peter Singer is here to talk about that with us.
He's a military strategist at the New America think tank and a consultant at the Pentagon.
He's also the co-author of the new book LikeWar: The Weaponization of Social Media,
which you can pick up at likewarbook.com.
Peter, thank you so much for joining us.
Thank you for having me.
So my first question is, what did the Islamic State learn from Taylor Swift?
It's one of the strange juxtapositions that you get in this space. Essentially, what LikeWar is about is how news, politics, and war changed the internet for the rest of us, but also how the internet reshaped news, politics, and war.
And if you think of it, you know, cyber war is the hacking of networks.
What we call like war is the hacking of people on the networks by driving ideas viral through a mix of likes and lies and the network's own algorithms.
And so you get these strange outcomes where you have wildly different groups and individuals with wildly different real world goals.
But they are all using the same tactics online to achieve those goals.
And so in the book, there are many different fun characters and scary characters and scenes, one of which looks at the back and forth between Junaid Hussain, a failed British rapper turned ISIS's top online recruiter, and Taylor Swift, and how both of them are using a very similar tactic of winning the online war to achieve their goals. For Hussain, it's to get people to join ISIS, and he's also, in essence, basically helping to coordinate terrorist attacks abroad. And for Taylor Swift, it's to win fans, to drive album sales. But they're both engaging in one of the ways of winning online, which seems like a strange contradiction but works, and isn't really a contradiction: planned authenticity.
How do you plan authenticity?
But what they both do is they use the Internet, and kind of what makes it such a revolutionary technology, and social media in particular: it's simultaneously global and personal.
So both of them reach down to individually engage people.
Taylor Swift does what's known as Taylurking.
She came up with the name herself.
She will reach down, jump into conversations that her fans are having, and engage with them just like any other friend would. It is as if she is their friend, and she really is their friend. She'll console someone who didn't pass their driver's test, and in turn she'll celebrate with them when they pass their driver's test. She'll talk to them about their breakups and the like, but she's doing it in a manner that's, you know, direct and one-on-one, but also in full knowledge that the world is watching. And for Hussain, it was the very same thing. He would engage with people one-on-one.
He'd do it with this kind of hip-hop patter because being real online is so crucial to success.
You know, he's saying things like, you know, you can stay at home and play Call of Duty, or you can join the real Call of Duty with us here in Syria. But again, he's doing it in the knowledge that
the world is watching and that he's winning more recruits that way. It's, again, it's a very
strange world, and yet that's where we're at right now.
The idea that authenticity is important is interesting to me, because a lot of people, when they think about online interactions, think of them as inauthentic, right? Can you speak to that
a little bit? So I'm going to combine again two things that seem to have nothing to do with
each other, and they almost shouldn't have anything to do with each other. But welcome to the
strange world we're in. It kind of shows the fun of the book. I'm going to link Aristotle and Speidi, Spencer Pratt and Heidi Montag, a reality show phenomenon.
So going back to Aristotle and his writing about the very first democracies
and what we now think of as politics, the Politica, he basically describes how you have
this challenge of, again, a seeming contradiction.
It's government of the people.
And yet there is a specific class of people that have decided that they are better than the people, that they should be the leaders of the people. So they have to simultaneously show that they are better, but also that they are of the people. And they have long tried to kind of balance this by being authentic, but in kind of fake ways. And then the technology
weaves into that. So you get the early newspapers being used to sell the story of the politician
who, you know, was born in a log cabin. Then you get television. And essentially, with television, you know, one can make the argument that the diner industry of New Hampshire and Iowa is almost exclusively dependent on it, on, you know, politicians coming in and trying to show that they're authentic by meeting with people in a diner.
And then you get social media.
And what social media has added is this element of being real, but also performative at the same time.
And the ultimate example of that would be Donald Trump, who was like any other politician in that he was seeking out votes, but he did it in a way by showing his authenticity
online. Even among his opponents, people who, you know, supported another Republican in the
nomination or during the campaign against Clinton, one positive attribute they would say about
him in all the polling was: it's real. He's authentic. It really is him behind @realDonaldTrump. And that authenticity helped him win out. As a comparison, Hillary Clinton's online presence wasn't viewed as real, in many ways because it was not.
She had as many as 11 different people writing her tweets for her.
Okay, so how does this link to Spencer Pratt and reality stars?
Among the people we interviewed for the book, again, they ranged from General Michael Flynn to terrorist group recruiters,
to the godfather of the Internet itself.
We interviewed Spencer Pratt and Heidi, his wife, who essentially talked us through how you manipulate the media and the broader public to get what you want, by figuring out sort of the key aspects of achieving this goal.
And one in particular was a narrative.
But to this idea of authenticity, we talked to them about, you know, what's the difference between now and what you were doing back in the days of The Hills. And most people don't remember, but he was actually the producer of the show that first introduced the Kardashians to the world.
And so, you know, what's different?
And she waxed almost philosophic where she said the difference with social media is that everyone is a reality star now and they're just as fake as we were.
Does that mean that world governments and militaries should learn from people like Spencer Pratt?
Are there lessons for them in Instagram influencers and YouTubers?
Scarily enough, yes, because we had this fascinating interaction. We're meeting with Spencer and his wife, and they're walking us through the different
ways you achieve your goals online. It's a very different set of goals. Their goal was to become
famous, to make money. But soon after, we met with the head of the U.S. government's counter-ISIS
messaging campaign, which, again, you would speak to anyone in the military and they would say,
all the tactical victories we achieve on the battlefield won't matter if we don't get the messaging
campaign right. And very soon in the discussion, I realized they don't get it. They don't get how
this space works. They don't get how to influence using these tools for kind of making a military
comparison. They don't understand the tactics. They don't understand the terrain. They don't
understand the strategy. And they certainly don't have a doctrine for success. And I quickly realized that the campaign was going to fail. And it did. But again, you can see
this learning back and forth, because as Clausewitz would explain to us, war always involves two sides, and just when you think you're ahead, the other side's going to learn and react. And the same phenomenon has happened even within the U.S. military, where it was getting its clock cleaned a couple of years ago in this space, and it's now copycatting not just what online influencers and celebrities are doing, it's copycatting what the Russian disinformation warriors were doing. So you can see, for example, contracts being announced by the Defense Department for one person to control multiple social media accounts simultaneously. It's basically the same model of what the Russian sock puppets were doing targeting the 2016 election; we're just now thinking about its battlefield effect in the Middle East. And again, one of the places that we visited to learn about this and interview people, there's a scene in the book at Fort Polk, which holds a real hallowed place in military history. It was where, back in the interwar years right before World War II, in what are known as the Louisiana Maneuvers, the U.S. Army figured out how to shift from a world of
horses to a world of mechanization and wireless communication. And since then, Fort Polk has been
where the army essentially tests and trains for the real world wars.
It was where they planned for the tank battles against the Red Army, and then the actual ones that were fought against the Iraqis in the Persian Gulf War.
After 9/11, it was turned into a series of villages with an opposing force that acts like insurgents.
And then you have people playing civilians.
And they've recently updated it again: in this fake war, they've layered a fake internet over the top of it to simulate the real wars on the ground and on the internet.
And so, again, there's just this learning back and forth that's utterly fascinating.
And there's a lot of lessons wrapped up within that, not just for the military and how militaries are learning from each other, the IDF, the British Army's 77th Brigade, et cetera, but also this learning between fields.
So, for example, some of the things that the military is doing with its war gaming, actually the tech companies ought to be doing to similarly red team their own platforms for how they're going to be weaponized rather than just dumping them out in the world and being surprised when they're used in bad ways.
So you talked about the various groups and nations that are trying to fight in this space.
And it sounded like the U.S. is behind?
Is it behind Russia and China still?
Or have we caught up at all, thanks to Fort Polk and whatever else we're doing?
The sad truth of it is, while the United States is the literal home of the Internet itself,
we are now the nation that other nations point to as the lesson of don't let that happen to you.
And when I say other nations, it's everything from their politics to their defense plans: when you see, you know, the discussion, whether it's in Sweden or France or whatnot, we are definitively not the winner in this space.
We are the loser.
And there are a lot of lessons to be learned, not just what we can do better ourselves,
but also learning from what other nations are doing to protect themselves.
And here again, it's not about shifting to, you know, a Russian disinformation model or the Chinese
system of web control, which, you know, is, I mean, the system that's being created there is
utterly fascinating, but it's also beyond Orwell's greatest dreams as a measure of societal
control, you know, giving citizens a single score that reflects their societal trustworthiness.
It's not about following those models. Actually, some of the best nations at this are democratic nations. They, not coincidentally, are the ones along Russia's borders, because they were the ones first targeted by this, and they're the ones that figured out that the threats to them are not merely a tank, but this like-war side of things. So we look at what's gone on, be it in Estonia or Norway, and how they figured out how to protect themselves. And what's important is that there are definitely military changes that need to happen, but there's also a whole-of-society approach. I think
there's a great parallel with cybersecurity, where, you know, about 10 or 15 years ago, we started to realize that there's a new array of threats, and they're threats that have to be reacted to at the governmental level, and not just within the military but across agencies, but also there's a role for private business, and it touches us each as individuals. And it's the same phenomenon on the other side here with like war, the influence operations, whether they're being pushed by foreign governments or drug cartels, or by companies that want to get you to buy a hamburger, or a politician that wants you to vote, whatever.
The point is, they hit at each of those levels.
And there is a role for each of us to get smart on it and learn how to navigate it, just like we had to learn to stop buying things from Nigerian crown princes.
But there's also a role for government as well.
And unfortunately, in both spaces, in America we're behind the curve. And that's one of the hopeful outcomes of the book: to not only point the way on what government and the military can do, but also what each of us individually can do as we navigate this world of likes and lies.
What does Norway do?
Oh, gosh. It's a wide array of things. But let's not just focus on Norway; let's look sort of across the set. There are efforts to identify incoming information warfare attacks, almost akin to the announcements of
severe weather or disease outbreaks. They engage in digital literacy campaigns, essentially
to help teach people how to better understand and deal with disinformation and, you know,
"fake news," in quotation marks.
You have a particularly strange outcome of this, not in Norway, but in Ukraine.
The U.S. government has paid for a program to help Ukrainian kids better identify online disinformation.
So we're doing that for Ukrainian kids, but not for our kids.
You also have, again in a parallel to cybersecurity, ways for corporations who are competitors in business to work together in this space. So on your question of Norway, the media companies there are definitively, you know, competing. They each want to make the most money. They
kind of want to put their competitors out of business, but they actually cooperate in fact-checking.
When there's some kind of misinformation that's going viral, they work together to identify
it and ensure that it doesn't spread across their network. So there's a whole array of things that can be
done. None of them is a silver bullet solution. Many of them actually have an American pedigree to
them. So if you're thinking of a whole-of-government approach, during the Cold War we had something
in the U.S. called the Active Measures Working Group. It was a cross-agency group that brought
together spies, diplomats, communicators, educators. And basically, just like the social media version we talked about today, back then they identified KGB active measures: KGB campaigns to plant false stories and have a real-world effect with them.
So they identified them and said, what can we do to counter them?
The difference is, first, back then the battleground was the third world.
And second, there was not the ability to drive it viral so rapidly and so widespread.
So back then, one of the campaigns that they countered was something called Project Infection.
It was a KGB effort to spread the story that the U.S. military had created AIDS.
It was part of a larger campaign to kind of harm American reputation, but particularly to try
and sabotage the Los Angeles Olympics.
And of course, that still lives on on the Internet today. But what was interesting is it took them roughly four years to plant that story and get it spread into media, first in the third world, and then using far-right and far-left media in the United States.
Again, another kind of clear parallel to today where you see Russian disinformation.
It's bipartisan and going after both extremes.
But the point is it took them four years to reach a limited number of people.
By contrast, you know, a single planted story can reach across the world in a matter of minutes and reach tens of millions in a matter of minutes.
And that's the difference.
And yet we don't have anything equivalent to the organization back then that fought that effort.
You make it sound like the U.S. is a little screwed.
We're not done. The story's not yet complete.
And so, again, there's a lot of scary things that go on here.
But every part of the story, every character, every example, we try and show both the good and bad illustration of it.
So, you know, if you want to talk about the phenomena of crowdsourcing, crowdsourcing has been used to aid online harassment campaigns.
It's also been used to identify war crimes and bring them to light.
Again, just every example, there's two sides to it, which is always the case with a new technology.
The problem, when things go bad, is that there's usually a combination of arrogance and ignorance.
Arrogance in terms of, you know, this couldn't happen to us or this couldn't happen on my platform.
Or the arrogance of, what's good for my company must inherently be good for the world.
We can see these things kind of play out.
You know, "this couldn't happen to us": the very same things that hit Ukraine a couple of years earlier, we should have been wised up to, and then we're stunned when it hits our own election. Or, you know, "this couldn't happen on my network": many of the companies, you know, turned a blind eye to the use of their networks for sort of these bad causes. So you've got that arrogance side of things. And then ignorance circles back to what we were talking about before, of people basically not understanding the new rules of the game. And those people might be, you know, the U.S. military officers we were talking about, or the politicians who are going up against these efforts, or you and I, who are the targets of these efforts. If we don't understand the new rules of the game,
of course we're going to be taken advantage of. And hopefully we have kind of gotten that dose of
reality that we shouldn't be so arrogant and then we've got to, you know, get our act together
and then take better actions.
Did you happen to read that Data & Society report on the reactionary influencers on YouTube?
Yeah, yeah. I mean,
one of the things that's fascinating about this space is that there is so much data out there,
in part because of one of these new rules, everything is out in the open. So it is wonderful
for those of us that want to study it. There's just this wealth of data, but it also has created this strange kind of, I don't know how to put it, impact, where we've been trained up in a world where we think the most important bits of information are somehow secret.
It's somehow not there for us. And if it's unveiled, you know, kind of Watergate style, that means it must be true.
And yet when it's all out in the open, we tend to discount it or not give it as much value.
And again, you can see that in everything from the study that you mentioned to the discourse over Russian disinformation actions, where, you know, it was out in the open. It was on Facebook. They were buying advertisements, but we kind of discount it as having the same impact as if we had had a signals intercept that revealed it. It's just, again, one of these ways the world's changed that we have to
readjust. Right. It used to be much easier to quarantine what I'll call bad ideas without, you know,
getting into why those ideas are bad. But you can't do that so much anymore. And it feels like
people are having a hard time winning those arguments in that space now?
What you've put your finger on is that it's akin to viral disease outbreaks, where, because of the connections that we've built, whether it's online or if you're thinking about air travel, you know, they can spread more rapidly than ever before. And many of the old firebreaks that you
would have had in society are not there or they've been kind of surpassed or worked around.
And, you know, the misinformation side would be the way that, again, for both better and for worse, the gatekeepers of, for example, media have been displaced.
One of the fascinating ways this has played out is that everyone from Donald Trump to professional athletes to terrorist groups, I've heard individuals say this, they've all used a similar phrasing where they've said, you know, the great thing about Twitter or Facebook, the great thing about social media is, and they keep saying it, it's like owning your own newspaper, right? And so there's this kind of, you know, they love it because you get to be your own editor, you get to push out your own information,
you work around the gatekeepers, but of course there's a downside to that. And it can be used and
abused. The other parallel to this, and what that study relates to, is that in both real-world disease but also online, everyone is not equal in the spread of information or of disease. There are what are known as super spreaders, and they are the people that really drive something viral. And that is true if you look at disease: you know, there's a great example, like in South Korea when they had a flu outbreak, there was one guy who was responsible for this overwhelming number of the cases. You know, he was like orders of magnitude more than anyone else. And on the internet and on social media, it's the same phenomenon. Both on the open internet, but even within the Chinese system, there are still roughly 100 accounts that drive most of the online conversation there.
What we have to do, again, just like in public health, is you want to rebuild some of the firebreaks, and that helps you slow down the spread. But also, when a super spreader has become toxic, we have to ensure that we're trying to limit their impact.
And for some, this is part of the debate over de-platforming.
When they clearly cross the line and break the rules, they do not have an inherent right to be on these private networks.
They don't have a right to free speech.
They don't have a right to toxic speech on a private network, particularly when it relates to calls for violence or the like.
And then you have the types that don't cross that line, but we have to make sure their own history follows them.
So we use the example of the people behind Pizzagate.
If they have a proven history of being viral super spreaders of conspiracy theory and proven falsehoods,
we should not treat them as credible actors moving forward.
And I say we, it's not just you and I, it's the media, it's government.
And until we change these perverse incentive systems, where, for example, some of the key people behind Pizzagate not only gained more online followers, but also received all sorts of rewards, everything from jobs to book deals to invitations to the White House.
Until we change the perverse incentive systems, of course the bad side's going to keep winning.
So what particular dangers worry you, and what do you think the future holds?
We have seen so much change play out, again, not just online, but in its effect on the real world, on our politics, on our news, on our wars.
And yet, only about half the world is online.
Half the world is still to come.
And that half that will be coming online will be mostly in places that don't have the institutional history that we have, for example, in the U.S.
I mean, think about all the challenges that we've faced in social media and its impact, again, on news and
politics and the like. And we have multiple centuries of experience of democracy. We have a
fairly functional government, fairly good economy. So you have this, I think, challenge of the
disruptive effects. We're only now scratching the surface of it in terms of the geographic impact.
And then the second, back to this idea of war: this is the first generation. The tools that are being used are almost like the biplanes in World War I, where, you know, air warfare is going to take off, but you're going to look back at these biplanes as really antiquated. And it's the same thing when you think about the tactics and technologies that have been so impactful already; they're not all that complicated. So take bots, for example, you know, basically machines that are simulating as if they're a person. They're fairly simple right now; there are certain details they have that show it's not a person behind them. And yet, at scale, they've been massively important.
In the Brexit campaign, one third of the online conversation was generated by false voices. And, you know, online conversation is where not just individual users are taking their views and news from. It's where journalists are deciding what the trends are, what to report or not, who to interview or not. So one third of it, in this incredibly close election, was influenced by bots. And moving forward, it's obviously the exact same percentage in the recent Mexican election.
We're seeing its effect in the Brazilian election.
We saw it in the U.S. election.
And of course, these bots now infused with AI
are going to become much more difficult
to figure out whether they're real or fake.
It's the same thing when you think about
this idea of fake news and alternative facts.
Well, as we move into a world of what are known as deep fakes,
which is, again, infusing AI into information,
to mesh the real and the false, a video of a speech that a politician never gave, it has two effects.
One, it will be increasingly hard for people to figure out when something is real or not, not just in the kind of silly sense of saying, oh, well, that's an alternative fact.
It literally will be quite difficult.
And then in turn, when real things are done, one of the ways that people will try and defend them is to say, oh,
well, that's fake.
We already see little examples of this where, you know, for example, Donald Trump and the Access Hollywood tape, you know, of him talking about how he would grab women by a certain part of their body: he went from admitting that was him to now saying, oh, no, no, maybe that was doctored.
And again, this is without AI infused.
It's going to get much more complicated moving forward.
And this moves us into a world that is, appropriately enough, science fiction sounding.
It's science fiction sounding appropriately enough
because the very first social network
was a bunch of scientists on ARPANET,
the proto version of the internet,
not talking about science,
but talking about science fiction.
That's the first time we get a social network.
And now we have the science fiction like outcome
on the new version of the internet
where it's like two AIs battling back and forth
with the humans caught in the middle,
which is basically what the Terminator franchise turned out to be.
Peter Singer, thank you so much for scaring the crap out of us today.
Give the name of your book one more time so people can scare themselves.
And I hope, you know, it's scary, but it's also fun.
There are so many fun, cool, good stories about it.
It's this twin side.
So hopefully you will be both scared but also highly entertained.
The book is called Like War.
That's one word.
And it's available at, you know, all the online booksellers, as well as at likewarbook.com.
Thanks for listening to this week's show.
If you enjoyed it, let the world know by leaving us a review on iTunes.
It helps others to find the show, or at least that's what they tell us.
We're putting transcripts of most shows online at warcollegepodcast.com, and you can reach us on
Twitter.
We're at war underscore college.
And on Facebook, facebook.com slash warcollege podcast.
We'd love to hear from you, so hit us up.
War College is me, Jason Fields, and Matthew Gault.
We will be back next week.
