CyberWire Daily - Peter W. Singer author of LikeWar [Special Editions]
Episode Date: November 30, 2019. In this CyberWire special edition, an extended version of our conversation from earlier this year with Peter W. Singer. We spoke not long after the publication of his book, LikeWar: The Weaponization of Social Media. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the CyberWire Network, powered by N2K.
Calling all sellers.
Salesforce is hiring account executives to join us on the cutting edge of technology.
Here, innovation isn't a buzzword.
It's a way of life.
You'll be solving customer challenges faster with agents, winning with purpose,
and showing the world what AI was meant to be. Let's create the agent-first future together.
Head to salesforce.com slash careers to learn more.
Hello, everyone. I'm Dave Bittner. Thanks for joining us in this CyberWire special edition, an extended version of my conversation from earlier this year with Peter W. Singer.
We spoke not long after the publication of his book, LikeWar: The Weaponization of Social Media.
Transat presents a couple trying to beat the winter blues.
We could try hot yoga.
Too sweaty.
We could go skating.
Too icy.
We could book a vacation.
Like somewhere hot.
Yeah, with pools.
And a spa.
And endless snacks.
Yes!
Yes!
Yes!
With savings of up to 40% on Transat South packages, it's easy to say so long to winter.
Visit Transat.com or contact your Marlin travel professional for details. Conditions apply. Air Transat. Travel moves us.
Cyber threats are evolving every second, and staying ahead is more than just a challenge.
It's a necessity. That's why we're thrilled to partner with ThreatLocker,
a cybersecurity solution trusted by businesses worldwide. ThreatLocker is a full suite of solutions designed to give you total control,
stopping unauthorized applications, securing sensitive data, and ensuring your organization
runs smoothly and securely. Visit ThreatLocker.com today to see how a default deny approach can keep
your company safe and compliant.
So we started this project almost five years ago, and there was a series of seemingly kind of new breakpoints.
But actually now, in retrospect, they signified a new normal.
And they were everything from, for example, you had the first what was called Twitter war
that played out where Israel and Hamas had one of their sort of regular conflicts.
And there was a series of days of airstrikes and the like, and it kind of ended
inconclusively on the ground.
But alongside it, for the first time, you had these online, what we now call battles, but basically debates going back and forth, and what stood out was that the vast majority of the messages claiming what was happening on the ground, who was in the right and wrong, were being pushed by
people physically outside the region. And what was even more notable than the fact that, you know,
you could, for example, weigh in on this conflict, even though you might be, you know, checking
Twitter on the subway on the way to work, is that actually the ebb and flow of the conflict had real-world consequences.
They later found that essentially whichever side was winning, so to speak, in the trends online, it shaped both the pace and location of the airstrikes by over 50%. What was essentially happening is that the
Israeli generals and politicians were watching the maps, but also watching their Twitter feed,
which now, of course, seems normal. Another example about five years back was we had a
group of terrorists seize a shopping mall in Kenya, and the government tried to shut down communication and reporting about what was happening.
And the result was that the terrorists who were on social media became the primary source for the world on their act of terrorism.
So actually we fed into the very goal of terrorism, which is, you know, to drive the message and to drive fear viral.
But what was, again, interesting is the terrorists realized that because they own the narrative, they also didn't have to tell the truth online.
You know, again, sort of a seemingly obvious realization.
But, you know, this is where we're at. And then finally, you had a policy change in the U.S.
military, which allowed deploying service members to Afghanistan to use Facebook and Twitter.
And so for the first time, you had people on the battlefield able to friend their enemy. And in turn, their enemy, the Taliban, could not just friend and
stalk and track and communicate with them, but could equally reach out and connect to, you know,
everything from family members, friends, journalists back home, you name it. And so you had this kind
of connection point. So all of these things were a spark for us to start the book project.
And then we started to explore essentially how social
media was being used in war zones around the world. But very quickly, that widened. If you're
looking at, for instance, Iraq and Syria, the rise of ISIS becomes a story of terrorism. If you're
looking at terrorism, you have a cross with things like the drug war in Mexico, and we started to
look at how drug cartels were using it. Then we began to look at, hold it, Chicago gangs. If you're looking at how it was used in
places like Russia and Ukraine, it very quickly moved into American domestic politics. And so the
project was essentially trying to explore just what's going on here in this new form of online conflict that, as we talk about it, is not about hacking of computers on the network, sort of the classic definition of cyber war, but rather hacking the people on social networks by driving ideas viral, what we call a like war.
Yeah, it's fascinating. And one of the things you dig into is the effect
that crowdsourcing has had on politics and war and reporting. And it strikes me that, you know,
in some ways, crowdsourcing sort of short-circuits what would have previously been gatekeepers when it
came to providing information.
Absolutely. And one of the interesting things is that when we cut across all of these
different actors and, you know, we have cases that looked at everything from, you know, Donald
Trump's very first tweet to then how the Donald Trump campaign utilized social media to ISIS, to
celebrities like Taylor Swift, to corporations, to athletes, to teenagers,
criminal groups, you name it. What cuts across this is, just as you hit it, this notion of the
traditional breakdown of the gatekeepers. Even the very meaning of the word media comes from
the Latin, the middle. The media was the group that was in the middle that would collect the news and then distill it and share it with the rest of the world.
It was the gatekeeper.
And again, this meaning dates back centuries. What we have now is essentially we can all be collectors of information and we can be individual sharers of information.
And it breaks down that notion of something in the middle; we can all own our own newspapers, the way they've all described it.
That they don't have to go through this media in the middle to push out their message. And again, all of this, like every other technology, like the meaning of the word
hacking, it can be for good and bad. We profile, for example, groups that are individuals that
sort of illustrate this. And I think a fascinating example is a set of little girls. One was a little girl in Pennsylvania whose
small-town newspaper goes out of business because of what social media has done to the media business.
And so she launches her own online newspaper. Now, the first story, it's about the birth of
her baby brother. It doesn't go all that viral. But later on, she begins to report on
a variety of things in her small town, including she's the first one to break the news of the first
murder in her small town in over a generation. And it's a sort of wonderful story of empowerment
and news spreading. Actually, she also reports on corruption in the city government, and government officials start to meet in a bar to avoid her, and she's under the age of 10.
But then you have the flip side of this.
It's a great story.
You have the flip side, though, of a little girl named Janna Jihad, who, equally under the age of ten, decides to become an online reporter. But she grows up in the
Palestinian territories. And her life has been shaped by loss. Two family members have been
killed in violence there. And so she actively seeks out battles, not just to report them, but, as she describes it, her camera is her gun.
She sees herself as a new kind of information warrior. And so it shows you kind of this duality of the gathering of information,
but also in its spread, it can be weaponized.
And we see just that.
I mean, obviously, front and center has been Russian operatives in our recent elections, and continuing, and the rise of the trolls. It seems like this is a new type of warfare that doesn't require a huge investment to get great returns.
It's an absolutely essential point in two different ways
that you hit on there. One is the essentially low barriers to entry of this space and the
proliferation of the tactics. And then the second is to understand trolling and its impact, not just on the internet, but moving beyond to, for instance,
our politics. So on the first part, we, again, explored all of these different kinds of actors.
We gathered massive quantitative data, but also did interviews of people who range from, for instance, the literal godfather
of the internet itself, to recruiters for extremist groups, to tech company executives,
to celebrities, to members of the US military, both active and retired, including one retired
three-star who's become quite controversial, General Michael Flynn,
you name it, all of these different things. And what we found was a consistent set of rules
across these actors, a consistent set of tactics. You know, what we call like war, this hacking of
people on the networks, it actually follows a particular set of tactics,
a particular set of rules, regardless if you are Taylor Swift or Junaid Hussein, who is ISIS's top
recruiter. And one of the things is that groups are learning these set of tactics, and then they're
mimicking each other, and they're spreading across. So one of the more amusing examples of this,
but kind of scary examples of this, is that the very same tactics that were used by Russian information
warriors, everywhere from targeting Ukrainian military units
to political campaigns, be it Brexit, the US election, or the upcoming European elections, have been mimicked
by actors that range from a collective of Lady Gaga fans, who consciously copied the Russian
tactics to try and sabotage the prospects of rival movies when her movie A Star Is Born came out. A set of her fans organized
to create sock puppet accounts and false stories and false reviews of the rival movies. Most
recently, there's been the announcement of an app for $29 where you can plant stories in someone's Facebook feed to try and manipulate them to
some end. So for instance, the company that advertises it talks about how you can manipulate
your spouse by, for example, planting different kinds of suggestive imagery. They'll get over 200 messages and stories that will be seen in their feed,
toward whatever way you want to manipulate them. And it's for everything from interpersonal relations,
you want them to buy a puppy, to, a different example, trying to get a girlfriend or
boyfriend by planting all sorts of different stories about maybe they should marry you,
all these sorts of things. It's allowing a kind of targeting that wasn't possible before.
So these tactics, they're proliferating. And that's why we're going to
see more and more of them hitting, again, everything from elections, and not just at
the national level. We're seeing these tactics swinging down to even the local level, but also all sorts of
other enterprises. The second thing that you hit is trolling. And again, the pervasiveness of
trolling that's out there. And there's a core lesson of trolling that too many of us aren't
aware of. The maxim is basically: attack, and then play the victim. And again, you can see this in everything from the very origin of internet trolling.
We go back and tell the story of the literal first internet trolls and where the terminology comes from to how it's popped up today into American politics and how it's almost become the essence of American
politics today. Attack, but then play the victim.
I want to dig into the platforms themselves.
They say that, well, there's no way that we can, at the scale we're running, there's no way that
we can filter messages and you wouldn't even want us to anyway. We want to, you know, we want free speech and we
want everyone to be able to share their ideas. A couple of things come to mind with me. First of
all, is there a government role to play in terms of regulating what can and can't go on these
platforms or guidelines and so forth? But also, it strikes me that really what they're saying
when they say we can't filter this stuff is: we can't filter this stuff and still bring you this product for free.
We can't filter this stuff and still operate at the profit margins with which we operate.
Does that make any sense?
Yeah, and it's interesting because what we explore in the book is this evolution.
So there's always been a bit of a contradiction in what the companies say. Even before it's we can't,
it's that we shouldn't.
That's not our job.
We don't want to be, as, for example, Mark Zuckerberg has put it, the arbiters of truth.
Or, for example, you go back to the very origin of YouTube. YouTube is created as a way, in essence, around censorship. Its origin, one of the idea sparks for it, is the infamous
Super Bowl nip slip, the Janet Jackson episode where, you know, something happens, but it's very difficult to see it.
Media won't show it afterwards, but people want to find a way to find it: a video sharing service.
And that's one of the origins of it. But then, as we explore, the reality is they've always been in this game of deciding what is allowed or not from the very starting point.
It may have begun as a way around censorship, but very quickly, for instance, YouTube gets
pulled into everything from, well, we need to censor for intellectual property theft violations
to child pornography. Over time, that initial intervention expands. So it's, well, you can
say and do everything except, and initially it's child porn. Well, we can all agree that's bad, right?
Then, post-9/11, it becomes beheading videos; it becomes extremist imagery of violence. But then you get issues of, well,
what about messages that don't show violence
but inspire violence? For example, al-Awlaki
was a cleric who inspired a series
of suicide bombings.
At the time, it's allowed on YouTube.
In fact, the algorithms are quite helpful
and actually will steer you to more of his sermons.
If you're interested in this experience,
here's more of them.
And so then we move to, okay,
well, if it pushes violence,
and then we get Charlottesville.
Ooh, now it got kind of complicated
because, well, if we could all agree
that Al-Qaeda and ISIS were bad, well, what about neo-Nazis? Hold it. That starts to cross into the far right. But hold it, I thought these were, quote, very fine people.
Then move forward to the 2018 election: if you are pushing false information about the voting,
where you can vote or who is allowed to vote, we'll kick you off. So the companies, for instance, have decided it is okay for you to push false messages about political policy, but it is not okay for you to push false messages about your political rights.
And again, the point is, what I'm getting at is they've always been at this game.
It's essentially been them deciding when to intervene or not. And going back to the very first issues around child pornography to internet bullying, the company's decision has been a reluctant one
because every minute and every dollar that you spend on content moderation is a minute and a dollar not spent on your bottom line, not spent on growing the business.
So it's not a job that they want to be in.
It's not what they set out to do, and it doesn't make them money.
They've been dragged to that task by a combined pressure of their own customers saying, you know what, we don't want to see this in the space that we're in, and the fear of government intervention, of politics crossing
over into this space. So they've always been at this. They'll always struggle with it. There's
another consistency, again, going back to, for instance, the very early days of AOL, to today's discussions at Facebook and Google with AI:
they face a policy, a political problem, and they always try and find a technology solution to it,
which, rinse, wash, repeat, creates a new set of political problems. And we see that consistency playing out again here with content moderation
where they're never going to be able to police all of it
not just because of the scale, but
we have a quote in the book from a leading engineer at one
of these companies who says, roughly the quote is, yeah, AI could do it
if we could just figure out the First Amendment issues. And you're like, dude, you're always going to have this.
That's the very point of it. But the final issue to what you raise is, but hold it,
isn't there a role for government? There is. And in fact, there has always been this threat
of intervention. And what's playing out is, essentially, where you are physically located in the world, your government is intervening in different ways. In the U.S., there is intervention, but mostly we defer to the marketplace and liability.
That's very different than in the democracies of Western Europe, where we see more government
intervention on what's allowed or not, more fines towards the companies, etc. To if you live in
Saudi Arabia, there's a different kind of internet intervention. You might be put in jail
for what you personally post. If you are in Russia, there's government intervention. You
might be put in jail or you might just happen to fall down an elevator shaft. If you live in China, there is control over what is literally technologically
allowed on the internet. There's web filtering. But then we also have the move towards a different
form of web policing where it uses scoring and incentive systems. We explore what's called the
social credit system, where essentially you get a single score, almost like a credit score, but it's not a financial credit score.
It's your score in terms of your, quote, trustworthiness in the eyes of the government, and your online behavior determines that. It shapes everything from micro-rewards, free charging for your phone at coffee shops, to negative
punishments. For example, if you don't have a high enough score, you might not be allowed to
fly on an airplane or get a bed on an overnight train, or you might not be allowed to interview
for a certain job. It's actually being woven back into internet dating profiles.
So if your score is too low in the eyes of the government, you won't get a good date
and you might not get married.
Yeah, I remember someone putting out there on Twitter saying that, you know, if you want
to get the Nazis off your Twitter feed, tell Twitter that you reside in Germany.
And magically and mystically, they just disappear.
Because of the regulations in Germany against messages that have to do with Nazi ideals.
Yeah, that's this notion of the right to forget.
And again, notice the word right. What our rights are is highly disputed. And the notions of what our rights are change over time and where, you know, you live. And again, even as we explore in the book, technology has always had that kind of impact. And it's really fascinating: you know, when did we first start to talk about the freedom of the press? Well, first you needed the printing press to happen, and then you needed the rise of the press, which was a profession of people who figured out a second way to make money: not just publishing books, but publishing the first newspapers, the first newsletters.
So again, these notions are constantly affected by it. And it's a technology that combines both the way the telegraph allowed connection from a distance and the way the radio or TV allowed broadcast.
So what's different about social media is that it allows one to reach many instantaneously, but it also simultaneously allows one to reach one.
And the actors that have figured this out, this combination, they're the ones that have been
winning, so to speak. And again, we look at these examples, everything from Trump to Taylor Swift
to ISIS. They're the ones that have figured out the change that this has all brought.
You know, there's no shortage of, you know, breathless reporting and headlines that
these networks are going to be the end of us. It's going to lead to the downfall of democracy
and, you know, the way we communicate and our freedoms are at risk. Do you think that there's something to that? I guess what I'm
getting at is how accurate do you think those warnings are? How concerned should we be as we
head forward?
It's a technology that can be used for massive good and massive evil. Guess what? Like every other technology in the past.
So if you think of, for instance, the radio: Goebbels, the top propagandist of the Nazi party, talking about its rise, said, and the rough quote was, we couldn't have done it without the radio.
Of course, the radio also allowed FDR's famous fireside chats that mobilized the free world against the Nazis. The radio also created new forms of shared entertainment.
So we've been through these kind of, you know, sea changes before. What we need
to recognize is social media is on that level. And we've seen it empower new actors who've used it
for evil and for good. A couple of things, though, that are important about that. The first is, I think right now we feel so negative about it largely because of how positive we felt about it just a couple of years ago. Think back to the Arab Spring and, oh, social media has a, quote, liberating power, and, you know, dictatorships are on their way out. To, you know, Facebook has a motto that it's pushing out that back then is meant as a positive; now it feels kind of creepy, where they're pushing, quote, the more we connect, the better it gets. I think about, you know, how that sounds now. We've seen the good and the bad of it, but you had this kind of crazy level of techno-optimism, and now we're feeling sort of the second side of it. The other aspect is that, essentially, part of why it feels so bad is that we've not understood these new
rules of the game. And so, you know, essentially the bad actors, whether it's, you know, Russian
disinformation warriors to trolls and conspiracy theorists, they've been the ones that have
understood these rules. And so they've been manipulating their way into a level of success
that they wouldn't have otherwise achieved.
And so it's up to us to learn these new rules to be able to push back against it.
And that's what the book project was about, is trying to help us all understand what are these rules of the game.
There's a second thing that links to something of concern to everyone who's listening to this podcast, which is essentially understanding the second
side of online conflict. About
a generation back, we started to recognize that
the internet, that our online connections allow
not only this wonderful new economy, but allow new types of
threats. And we began to organize and train around that.
And that's true whether you're talking about the U.S. military to corporations,
all the way down to the individual.
So we're certainly not where we need to be.
But basically, we've spent a generation building up everything from the military
getting Cyber Command, to kids at schools being taught about passwords, to corporations creating departments and getting CISOs, you name it.
Well, the same thing now is needed when it comes to the like war side of things, the ability to manipulate online trends and create new realities from them. And that needs to be
recognized, whether you're talking about the military. You know, for instance, we're just
finishing up a piece that's looking at NATO and how, you know, NATO spent all this time worrying
about hacking of infrastructure. And it turns out that the greater threat to it was these viral
disinformation campaigns. To corporations: we've seen corporations be hammered by this, you know,
companies that range from Toyota to Nike have all faced greater challenges from online
disinformation campaigns than they have from, you know,
the classic someone cracking into email and stealing it. And so we need to recognize this.
And then in turn, just like what happened in cybersecurity, build out everything from
new organizations, new training, all the way down to our individual digital literacy.
Digital literacy is not just about two-factor authentication right now. It's also understanding how you might be manipulated on Twitter or Facebook.
That's Peter W. Singer. The book is LikeWar: The Weaponization of Social Media, which he authored along with his co-author, Emerson Brooking. For everyone here at the CyberWire, I'm Dave Bittner.
Thanks for listening.