Angry Planet - What We Learned From an Attack on Facebook
Episode Date: August 3, 2018. Bonus Episode: Facebook recently revealed that trolls had been at it again: creating pages intended to drive Americans ever further apart, turning the volume of online political discourse up to 11. All signs point to Russia as the instigator. Graham Brookie of the Atlantic Council's Digital Forensic Research Lab joins us to explain what the perpetrators revealed about themselves and what we have to look forward to in the midterms. You can listen to War College on iTunes, Stitcher, Google Play or follow our RSS directly. Our website is warcollegepodcast.com. You can reach us on our Facebook page: https://www.facebook.com/warcollegepodcast/; and on Twitter: @War_College. Support this show http://supporter.acast.com/warcollege. Hosted on Acast. See acast.com/privacy for more information.
Transcript
Love this podcast?
Support this show through the ACAST supporter feature.
It's up to you how much you give, and there's no regular commitment.
Just click the link in the show description to support now.
It costs a lot less to make a bunch of internet content than it does to put a lot of tanks in Syria.
You're listening to War College, a weekly podcast that brings you the stories from behind the front lines.
Here are your hosts.
Matthew Gault and Jason Fields.
Hello and welcome to War College. I'm Matthew Gault.
And I'm Jason Fields.
Facebook. Most of us use it even if we hate it.
On July 31st, Facebook deactivated eight pages, 17 accounts, and seven Instagram accounts
it deemed inauthentic. Headlines rushed to point to this as further proof that Russia
was attempting to interfere with the American electoral process in the run-up to the midterms.
The truth is something more complicated. Here to help us get to that truth is Graham Brookie.
24 hours before it pulled the pages and accounts,
Facebook alerted the Atlantic Council's Digital Forensic Research Lab.
Brookie is the director and managing editor of the Digital Forensic Research Lab.
He's also a former White House staffer who sat on the National Security Council
and advised the Department of Homeland Security.
Graham, thank you so much for joining us.
Thanks for having me, guys.
Okay, so you and your team have probably been over these pages more than anyone else.
What is the nature of the, you know, what kind of politics were they trying to
influence? What's your take on it?
So the interesting thing about the pages that Facebook took action on yesterday, and to be
clear, so Facebook took action against a total of 32 assets, including a number of accounts,
a number of Instagram pages, and then a number of Facebook pages. And the difference between
Facebook accounts and Facebook pages is a pretty big one in the sense that Facebook accounts are
individual users, whereas Facebook pages are places where communities are built and come together
to communicate and coordinate about whatever that specific page's interest is.
And so the thing that we were alerted to at the DFR Lab of the Atlantic Council was
eight specific pages that Facebook took action against.
And among those eight pages, what we saw was a very specific tactic of looking at
specific demographics with one overarching goal that was abundantly clear.
And that is that they sought to promote divisions and set Americans
against each other. So rather than, you know, bringing us closer together at a time of polarized
politics, these, the content, the language, the kind of coordination was to drive us further apart
than closer together. What we saw on these pages is behavior, tactics, language, and content that
were in some instances very similar to accounts that were run by the Russian troll farm, the infamous
St. Petersburg Troll Factory from a period of around 2014 through 2017. Okay. And are these
specific pages, do you think, did the Russians have a hand in it? One of the hardest things about
identifying disinformation or identifying influence operations is attribution.
And that is to say it's extremely hard to say with 100% confidence that it was Russia behind it.
And frankly, Americans do a pretty good job of generating a lot of hyper-partisan content on our own.
So that's point one.
But point two is the, again, the approach, the tactics, the language, the content on these pages that Facebook took action against this week,
correlated extremely highly with the approach, tactics, and language that were being put out by the IRA,
or the Internet Research Agency, between 2014 and 2017. So is that a way of saying we think it was
Russia? I think that the behavior correlates pretty highly. Is that a way of saying we know it's
Russia? Not quite yet. What betrayed it as possibly a Russian disinformation
campaign? What are the similarities there?
Facebook identified it as inauthentic behavior.
And frankly, at the DFR Lab, what we do is open source research.
And so we wouldn't have any way to verify that independently, mostly because we don't have
access to all the data and the platform that Facebook is running.
But what we can say or see, using the kind of open source methods that the Digital Forensic
Research Lab is known for and that we
use every single day, is that a number of the language patterns, a lot of the tactics and behavior
were the same. And so let me start with language patterns. There were a lot of mistranslations that are not
necessarily familiar to any amount of slang anywhere in the United States, whether that's a southern
dialect or whether that's a northeastern dialect. There were very specific grammar errors that
correlate most highly with Slavic languages, of which the largest is
obviously Russian. So that is typically a telltale sign. And what we're talking about here is
grammatical mistakes in articles specifically. The "a" and "the," those articles in front of words,
are constantly mistranslated. And so if that's the telltale sign that's holding democracy
together, that's probably pretty flimsy.
But some of the other telltale signs include behavior in terms of audience growth and a focus
on really, really hyperpartisan content or topics: migration, racial issues in the United
States right now, gun control, things at the top of the news cycle.
And again, this content is designed to insert itself into very real conversations that we're
having and then drive both sides further apart.
What is interesting about the pages that were taken down this week is the degree of sophistication that they used.
So they were looking at very, very, very specific demographics.
The biggest one, the biggest page that was taken down was a page called Resisters, which emphasized the sisters in Resisters.
It was a page that was mostly focused on feminist content.
There was another one called Aztlan Warriors, which is a reference to Aztec warriors.
And that page focused on the Latin American demographic.
There was one called Native Progressives that was focused on the Native American community in the United States.
And there are a few others that were focused on the African American community in the United States.
And the focus, at least on the pages that were taken down this week, was activism on the left of the American political spectrum.
When we're talking about activism, I was wondering, does this translate into physical action?
Are people going out on the street to protest because of some of these pages?
Absolutely.
And that's another degree of sophistication that the IRA, at least, in the 2016 elections during those influence operations,
tried to do a little bit with a few protests that were based on racial issues in Texas,
and a few other instances.
But what we saw with the pages that were taken down this week was that there was a key focus,
at least tactically, on growing as much of an online audience as possible with really kind
of engaging or emotive content.
And then trying to translate that audience once it was at critical mass to action in the
real world.
And so the kind of pithy term that we would use is URL to IRL,
and that's action in real life.
The Resisters page specifically partnered with a few very real organic groups, activist groups in the United States
for a counter protest that was scheduled for August 10, which is I think what, frankly,
what I would hope was one of the forcing functions for a company like Facebook to take action.
And so that counter protest was against Unite the Right,
which is a group that coordinated protests that turned violent last August
in Charlottesville, something that I think that this audience will absolutely recall.
And so one of these assessed pages that was taken down was actually coordinating a real counter-protest
against that event specifically here in Washington, D.C.
And that's a pattern of behavior that's extremely significant.
Another example of this, Jason, happened in Houston in 2016, when IRA-linked
Facebook accounts promoted both a protest and a counter-protest around Muslim
Americans in the Houston area. Let me ask you this, Graham. Is there anything materially different?
I think we've talked about it a little bit, but is there anything materially different this time
around than from what we've seen in previous disinformation campaigns? Well, I think that
it's getting harder to detect. So, for instance, in this scenario,
based on what Facebook said publicly, what we can deduce is that there wasn't, for instance,
one user or one kind of puppet master that was running 60 of these pages, right?
It's less automated in a lot of ways and more kind of narrowly tailored to insert themselves
into very polarized conversations that we're having in the United States.
So that's something that kind of sets it apart.
And then I think that the focus on real action inside of the United States.
So again, what we were talking about in terms of translating online communities into protest communities on the street, that is also a degree of sophistication that should be pretty frightening.
And that is to say it gets a lot harder to separate out the critical mass of political activism in the United States, which is to say, political
activism is a cornerstone of democracy. Any healthy democracy has to have a high degree of
political activism. Political activism is a good thing.
But I think that what these influence operations are designed to do is to insert into that very
real political activism, that very real political sentiment, and then make it a little bit more
polarized to, again, drive us further apart than closer together. And so
the translation from just, you know, online communities that share content that makes us really angry
into actual behavior change, whether that's, you know, deciding to take a
Friday afternoon and go join a really big protest or whether that's at the actual ballot box,
when you actually are casting ballots, that's really sophisticated and a pretty scary
prospect. You can have outsized influence with an influence operation by
poisoning the well of a much larger thing. And that's what it's designed to do. So how do we know
if we're being influenced by, in Facebook's words, inauthentic actors? What can we do to make sure
that we're not listening to Russian trolls instead of our fellow Americans, alleged Russian trolls?
That's a really big challenge for, I think, every single citizen right now, right? Because if you
think about the definitions of success of an influence operation, take the case this week. I mean,
In one scenario, the influence operation goes undetected, and it leads to a very real counter-protest in D.C.
That has the potential for violence, frankly, if we're judging on what happened with similar protests last year.
So that's one scenario, and that's a pretty successful scenario if you're trying to sow discord within our politics.
And then in the other scenario, on the flip side, if they are detected, then they've sown doubt in the conversation
that you and I are having right now and that Americans across the country are having around the dinner table, right?
They've cast a cloud over our entire discourse to make everybody a little bit more suspicious and a little less trusting in our own political discourse, to make us question whether or not everything is a Russian bot, which, by the way, not everything is a Russian bot.
And that's something that I think we all have a responsibility for, whether it's these big groups like government,
and politicians or the tech and social media companies or frankly traditional media,
but most of all citizens, most of all real people.
And so how can real people engage online in a way that is additive rather than maybe
being manipulated by alleged Russian operatives?
I would say have a healthy degree of skepticism on how you're coordinating, right?
If you wouldn't – I'm from Colorado, so I think about camping.
I wouldn't go camping or into the middle of the woods with a group of people that I didn't know and trust.
And so why would I not apply that same standard to the people that I would affiliate with in political activism,
where we're making a very public movement or a very public statement?
And so I think that there's a level of due diligence and just good citizenship that we can all engage in a little bit more
to be a little bit more resilient against something like Russian influence operations, or
alleged Russian influence operations.
Do you think that people maybe a little bit older than us don't take the internet as
seriously and don't see it as more than just kind of a toy?
That's a, that feels like a loaded question to be asking a millennial.
No, I think, well, let me answer that question with another question:
did you try turning off your device before asking the IT question, right?
I'm trying to think back to the conversation that I had with my grandparents when they were trying to decide whether or not they were going to get onto Facebook.
I think that's partly true.
One of the things that, at least in the wider conversation on how to create more digital resilience, we talk about a lot, is media literacy.
And I think that's kind of a crutch for a lot of different things.
For me, it includes four things, which is, yes, media literacy.
and that means making people, including older people, more prepared or more informed in how they
consume news or how they consume information. So that's point one. And then point two is a category
that I would call digital literacy. And digital literacy is making sure that people are more prepared
and know exactly how to use devices or how to navigate platforms like Facebook or Twitter or Google.
And I think that that specifically probably applies to older folks a little bit more, although I will say my grandmother is better at Facebook than I am.
And then there's this third bucket that I would call cyber hygiene.
And that's making sure that people are more aware of their data footprint and what their kind of online profile is.
So like whether or not they're using multifactor authentication, whether they know
how their data is tracked on social media platforms like Facebook.
And then I think that the fourth bucket is probably the most important,
which is at heart just basic civics,
like how to participate in a democracy in an additive rather than negative way.
Those four things kind of, at least for me,
are four major pieces in how we solve something
or how we create more digital resilience against something like alleged
Russian influence operations. And, frankly, how we can make democracy stronger.
Well, these pages existed just a few months before the midterm elections.
Do you have any idea, any thoughts about what it is people might be seeing as the midterms come closer?
Yeah, the way that we would, at least at the DFR Lab, scale research with a team that
doesn't have the reach across politics at an industrial scale is to look at politics in large races
across the country in three broad categories. And the first is extremely polarizing issues.
And we've talked a little bit about that earlier, but that's migration, that's various protests,
that's racial issues, gun control, polarized issues that we know influence operations have kind of
tried to insert themselves into and influence in the past to, again, drive us further apart rather than
closer together. That's one of the most direct ways where influence, or chaos, can be
sown within our discourse. So that's bucket one. And then bucket two would be key swing states
where you could have outsized impact on the balance of power in the United States. And so,
So, you know, if you're sitting in the seat of someone who would sow disinformation and
chaos, you'd be well placed to target a few very specific places with limited resources, right?
And then the third, and this is kind of a further off bucket, but there were no candidates
mentioned in the content that was shared on the pages that Facebook took down, except for one.
And it's not a candidate in the 2018 midterms.
He's a candidate in 2020, and that's Donald Trump.
And so what we would predict or deduce from that is that the third kind of category of disinformation
around the 2018 midterms will be figures who bad actors would assume have a future, right?
So they're going to be A/B testing narratives against people that they would assume will be running in 2020
or will have an outsized platform in the future of U.S. politics.
And that's just kind of investing in, if you're a bad actor,
that's a category that I would say is investing in the future of sowing disinformation.
And the one example in that is that the narratives against Hillary Clinton
that were most effective in 2016, whether that's a number of the leaks and things like that,
weren't necessarily started in 2016, right?
They were seeded well before, and a nation like Russia had been engaged in them for a long time, because Hillary Clinton had a long public life.
So those are the three broad categories that at least our team is thinking through.
And I'm sure that there are more.
I'm sure that we will be unpleasantly surprised by some content, and some content will be kind of what is expected or what we would assume it will be.
But that's kind of how we're looking at it.
That's how we're scaling and scoping research.
You make it sound as if it's less that these groups are out-and-out creating falsehoods
and more that they are enabling people's worst impulses.
Yeah, I think it's a mixture of both.
I think that it's kind of this process where they're probably throwing as much content up against the wall to see what sticks.
And some of that is content that is outright falsehoods.
Some of it is content that is designed to
insert itself into conversations and make it a little bit more okay to kind of go to the extreme
end of that conversation, what I would call lowering the barrier cost of polarization.
And some of it is a mixture of both of those things, right?
Like outright falsehoods that lead to increased polarization.
So you've had a longer time to look at all these pages than anyone else has.
And I'm wondering, did you see anywhere any active calls to violence?
Not that I've seen, although our team is still looking at the pages that, or looking at the content that we were able to look at and grab before action was taken against those pages.
And so what I would say is that we had an initial take, an initial assessment of a few kind of overarching categories of things that we saw.
And we're kind of parsing, we're taking a step back to parse through and try to get a
higher-confidence assessment of those conclusions. But were there calls for violence explicitly,
as in, you know, on Thursday we're going to take it to the streets with pitchforks and guns?
Not that I've seen, but I wouldn't rule that out as we're trying to go through
more of the pages. Another interesting aspect of this story, something that Facebook said during
its press conference, is that an IRA-linked account, this Russian troll farm out of St. Petersburg,
did briefly take control of one of the disabled pages for seven minutes.
What do we know about that?
Is that a slip up or is it, you know, a deliberate tell?
Do we know anything?
I think that it's a very interesting data point, right?
That's not something that my team would have much insight into, but, you know, taking
that data point at face value, it is a significant tell in that you know that the
IRA, or the infamous St. Petersburg troll factory, was at least paying attention to what these
pages were doing if they took control of it for any period of time. So whether that was some
operative that fell asleep at the keyboard and accidentally took control of something they weren't
supposed to for a few minutes, or whether it was a slip-up that showed their hand, what it does tell us
is that they were paying attention. Is there any harm to Russia in
actually having people in America understand what they're doing?
I mean, does it just sow more confusion?
I think it does.
I think that there aren't many costs for a country like Russia to sow disinformation.
And by that, if you look at the cost-benefit analysis of an influence operation,
and then you look at Russia's geopolitical goals, whether it's putting the U.S. and allied countries on a back foot in a conflict like Syria, whether it's legitimizing their illegal annexation of Crimea, whether it's splitting NATO allies further apart so that you can deal with them individually.
If you're making a cost-benefit analysis on how to do that, then it costs a lot less to make a bunch of Internet content.
than it does to put a lot of tanks in Syria.
You can have an outsized impact geopolitically
with sowing a little bit of chaos in U.S. elections,
in U.S. politics, in politics across Europe,
politics across democracies, frankly.
And that's something that should be an asymmetric threat
that we need to guard against.
And I don't think that there are many costs
for a country like Russia in doing that.
You know, what is the consequence?
They get named and shamed in the international forum.
That hasn't had much of a deterrent factor for a country like Russia in the recent past.
So why would it on an issue like disinformation?
And that's a key question geopolitically that I think governments more so than companies
are going to have to be responsible for and something that we don't have a very good answer for yet.
What do you think of the way that this story has been reported?
Generally speaking, the challenge of disinformation is a collective challenge.
And so this will be a circuitous way of answering that specifically.
But, I mean, you look at a challenge like disinformation, and there are three broad groups that are involved.
One is government.
The second is tech, private sector, social media companies, whatever you want to call them.
And then there's traditional media.
And in any given case, for instance, the 2016 case with the Podesta leaks, where there was this kind of drip, drip, drip coverage and content went from social media platforms to the front page of all the major dailies inside the United States, whether it's the New York Times or Washington Post or whatever, you know, those three groups are equal parts of the challenge.
If we were to take action on that and it was only one of those groups that was acting in a silo, then we might as well pack up
and go home, because we're not going to solve the collective challenge.
So in the case this week, did media cover the story in an appropriate way?
I would leave that to your readers.
But at the same time, I think that the top lines of this story are, first, that a major social media company took proactive action against an influence operation.
And the second top line is, you know, a major social media company is working with outside partners to solve a
really hard challenge, whether that's with the U.S. Congress or with U.S. law enforcement, whether it's
with third-party kind of independent analysis like that at the DFR Lab. So I think that kind of
information sharing is really, really good. And then there's this kind of other storyline that came
out, which is Facebook takes action against the activist community. And I guess my thoughts on that
are disinformation is designed to. So yes, the overall, the overall, the vast majority of content
within these pages was probably very real and organic. But at the same time, if it's based on
the foundation of a Russian influence operation, then that's probably not a good thing.
That's probably not a good foundation for political activism in the United States.
So I think that that's a much harder conversation that we need to have within
the activist community and within our political discourse.
But I think in terms of the media coverage around this story this week, it's been
objective and fact-based, which is one of the foundations of democracy.
So A-plus to everybody, to you guys and to others.
Thank you.
Graham Brookie, thank you so much for joining us and explaining the story to us.
Thanks for having me.
Thanks for listening to this bonus episode of War College.
In addition to the normal ways you can get in touch with us, on Twitter at war underscore college, and on Facebook at facebook.com slash warcollegepodcast.
We have a new website and we've put up the transcripts of past shows on it.
That's www.warcollegepodcast.com.
So check it out.
War College is me, Jason Fields, and Matthew Gault.
We will be back next week.
