The Dispatch Podcast - Narrative Laundering
Episode Date: October 16, 2020

How do journalists and tech platforms determine what information is verifiable online? How can news consumers determine which media outlets to trust when the line between partisan bias and disinformation becomes hazier and hazier? On today's episode, David and Sarah are joined by Renée DiResta, a technical research manager at the Stanford Internet Observatory and a writer at Wired and the Atlantic, for a conversation about disinformation online. "Anybody with a laptop can make themselves look like a media organization, can use a variety of social media marketing techniques to grow an audience, and then can push out whatever they want to say to that audience," DiResta warns. Where do we go from here? Tune in to learn about journalistic ethics surrounding the New York Post's Hunter Biden story and what to expect from disinformation actors this election cycle.

Show Notes:
-"Emails reveal how Hunter Biden tried to cash in big on behalf of family with Chinese firm" by Emma-Jo Morris and Gabriel Fonrouge in the New York Post
-"The Conspiracies Are Coming From Inside the House" by Renée DiResta
Transcript
Welcome to another special Friday Dispatch Podcast. I'm your host, Sarah Isgur, and we've got David French subbing in for Steve Hayes as my partner today. This podcast is brought to you by The Dispatch. Visit thedispatch.com to see our full slate of newsletters and podcasts. Very excited. We're joined today by Renée DiResta. She is the technical research manager at the Stanford Internet Observatory, a cross-disciplinary program of research, teaching, and policy engagement
for the study of abuse in current information technologies.
She investigates the spread of malign narratives across social networks
and assists policymakers in devising responses to the problem.
Who better to talk to this week than Renée DiResta?
She also writes plenty in the Atlantic and Wired.
You can find her stuff there.
Let's dive right in.
Joining us now, Renée DiResta. We are thrilled to have you, given this week.
I just want to start off right away with how you think the tech companies handled the Hunter
Biden story as an expert in disinformation.
Yeah, I mean, it's a really, really challenging case.
And this shows us that, even with policies laid down, so much of the implementation, the actual execution of the policy and also how the policy is communicated, are critical areas that could stand to be improved, as we saw yesterday. But also how politicized the implementation of the policy is.
So when there is a scenario for which there is a policy, the response by, you know, the particular
partisan side that feels itself, you know, to have been weakened in some way or impacted,
negatively impacted by the policy, it becomes a sort of second order story. Tell us what
happened yesterday. So there was a story that the New York Post broke in which an individual
obtained a laptop through sort of spurious circumstances and, you know, there were several
kind of areas related to the story itself that made it sound sketchy, made it sound like
the provenance of the information may not have been what the post, you know, what the
post reported. And so the challenge there for the tech companies, everybody is concerned
about a hack and leak. Everybody is concerned about a redux of Guccifer 2.0 and the DNC hacks
in which not a sufficient amount of attention was paid necessarily to the provenance of
the information, how it came out. One of the things that we've seen with the GRU, with Russian
military intelligence, in its hack-and-leak operations, is that it often includes either forged documents
or manipulated documents kind of inside the body of a real hack, right? So there's a real need to
verify that information to make sure that what has been put out is authenticated in some way.
And it wasn't clear from the story that any of that had been done. And so what the platforms did
was they had come up with these policies that said that in a case of a hack and leak or dumped
materials, they didn't want their platforms to be used to achieve the kind of instantaneous
virality in which false or manipulated or misleading documents go viral or to have a situation
where dumping things on Twitter is in any way incentivized because that leads to other problems
with doxing and, you know, a range of other kind of issues. So what they created were policies
that said, okay, if this happens, we're going to throttle it.
We're going to have our fact checkers and authenticators, you know, people who can look at the story and try to get to the bottom of what's happened before it spreads, to kind of add some friction to the virality cycle.
So that's the ideal policy, I think, where what they're basically saying is this is an
important story, and rather than having the wild falsehood or, you know, misleading documents
go viral, let's take a second.
and try to investigate what's happening here
and then, you know, decide what comes next.
David, why don't we get you in?
Yeah.
So, as somebody who spent most of my career as a lawyer and is relatively new to the world of analysis and journalism, I found out almost as soon as I jumped ship from the legal world that you become a rumor magnet when you're in journalism, that people are always peddling salacious, unverified allegations to you.
And in this case, you know, my response has always been: well, what's the evidence? What's the evidence? So put on your hat, let's imagine for a minute that the New York Post gets this hard drive, and it comes from these sort of nefarious sources: from a computer repair shop in Delaware, through Rudy Giuliani's lawyer, through Rudy, to the New York Post. And they call you and they say, what should we do now?
What would be the best practices if you obtain what purports to be a hard drive possessed or
owned by an important political figure? I know Hunter Biden's not running, but he's got information certainly relevant to American politics.
What would be best practices? What should the New York Post do in that circumstance?
Well, I'm not a journalist either. So the kind of process of journalistic...
Well, how about this? If they said to you, not what should we do as journalists, how could we
authenticate this? We don't want to be a vehicle for Russian misinformation. How can we
authenticate this as valid? What should we be looking for to determine whether or not this is
part of a disinformation campaign? I think in that case, it's working with forensic researchers,
working with cybersecurity experts and people who have gone through these troves of documents
previously, reaching out to the relevant government agencies potentially as well who are examining
these things. I think it's also taking that critical view and saying this is an area where
people have actually been expecting to see a leak specifically related to Burisma. That's one of
the things that's so interesting about this is that a few months back there was a story that
broke saying that the Russians had hacked Burisma. I think that even in that particular story,
the attribution was a little hazy. Nobody knew exactly to what extent even that was an accurate
kind of reporting of what had happened. But to then have this unfold, where, you know,
there had been kind of almost like a telegraphing that this would be the type of narrative that
would come out, that this would be the type of documents that would be released, that this would
be the time frame in which those documents are released, should have been a pretty significant
red flag for anybody looking at that material to think about it in that context and then to go
through all of the relevant proper forensic authentication procedures with relevant outside experts
who, you know, where that's their, that's their business.
What do we do as news consumers to determine for ourselves, you know, if we can't count on the platforms and we can't count on the news organizations necessarily? As consumers, the difference between biased news, fake news, and disinformation seems to be getting hazier and hazier.
And one of the pieces you wrote even said that the disinformation of the future
won't even need human effort.
It can be done by artificial intelligence.
This is not good.
Well, there's a, I mean, there's a lot of real challenges, right?
So I think it's important to, you know, if you take a step back and look in historically,
there's always been disinformation. There's always been misinformation, right, inadvertent mistakes
where wrong information is published. There's always been just human error in how stories
evolve, particularly today where the speed of, you know, the news cycle, I haven't been on
Twitter since this morning, and I feel like I'm unprepared to even comment on this because I'm
sure 60 different things have happened in those two hours. But the real challenge is actually the speed, I think, at which this happens now, and that's one of the key challenges for individuals.
We've developed an expectation that when we open our social media platforms, you know,
Twitter in particular, if there's a breaking new situation, we'll have reputable information.
You know, new information will have been made public and we can get that information on Twitter.
And the challenge there is that as these stories break, any journalist knows there's sort of a fog, particularly if there's a significant story or something that is breaking. And so we're not going to get the most reputable information instantaneously; news journalism, investigations in particular, is a process that takes time. And so we're looking for the kind of high-quality, you know, verified information
of the past, but we're expecting it to come to us in the timeline of today.
And that, I think, is a thing where we do need sort of a social shift in our understanding
that things that are going viral are often the most sensational things.
The velocity at which they go viral means that there potentially has not been the kind
of sophisticated analysis or, you know, fact verification that might have happened in a slower
time frame. So it's that velocity and virality that I think are some of the real challenges here.
And then I would say in combination with the fact that anyone can create content and anyone can
disseminate it. And so that democratization piece in that process of creation and dissemination
also means that anyone with an agenda and anyone with a sufficient number of people who can
support that agenda, has the ability to make a story go viral. And what that means is that it's
really democratized access to running disinformation campaigns. So whereas previously, in historical
periods, there was a, there was almost an infrastructure that was needed. You needed your content
creator. You needed your plants at front media organizations. You needed to own front media
organizations. There was a process by which you had to amass an audience and launder a narrative
through to make it reach the public, whereas now that has really been flattened. Anybody with a
laptop can make themselves look like a media organization, can use a variety of social media
marketing techniques to grow an audience, and then can push out whatever they want to say
to that audience. So we've really kind of rendered anyone able to carry out the kind of
campaigns that used to be a little bit more in the purview of the extraordinarily well-resourced.
So when we're looking at disinformation campaigns, I mean, I think that, you know, most people are
familiar with sort of purely partisan fake news that you will see generated and spread and go viral
on places like, you know, on virtually every social media platform. And we're familiar with the
social media platforms' kind of campaign to get rid of outright misinformation in specific areas. But let's delve a little bit into, for example, Russian foreign interference.
Could you talk a bit about what is the intent behind, say, a Russian interference program? What is the strategic intent? Is it just purely to sow chaos, to sort of turn ourselves against each other? Is it that blunt of an instrument, or is it more of a scalpel, more subtle than we might realize?
It's very interesting.
So to stay focused on Russia in particular, what we see is a remarkable commitment to a long
game, right?
And this is something that very, very few other actors that we've seen do; you know, my team looks at disinformation campaigns by, you know, any nation state you can imagine at this point.
And one thing that Russia does is they lay the groundwork early.
They build relationships with their audiences. So they're not just running social media marketing campaigns. It's not just a bunch of
memes, which is how it's often described. They're using different tactics to reach different
targets. And so there is the narrative laundering, that sort of story planting, you know, running a fake narrative up the chain. Operation Denver, the story that the CIA created AIDS, is kind of the canonical example that everybody's heard. So the GRU, military intelligence, is still
running that strategy updated for the age of the internet. Then there is the memetic propaganda,
you know, the memes, the, you know, everybody's seen Hillary fighting Jesus, you know, that kind of
stuff. And that's, it's funny and it's very, very easy to disparage because it's memes, but what
they're doing is that allows them to go social first, right, to reach ordinary people by pretending
to be ordinary people. So while narrative laundering is targeting the media and trying to work a
longer-term story, which is really aligned to achieving a particular geopolitical goal, right?
They really want to shape the way the world thinks of, you know, for example, U.S. policy
versus Russian policy in Syria.
For the social stuff, that's where you see much more of this, like, exploit fault lines
in American society.
So by creating these fake media properties like Black Matters, for example, in 2016,
they created a community that was really rooted fundamentally in pride in being a Black American
and in frustration at police brutality.
Again, two very, very real sentiments, two very justifiable and understandable, you know: pride in who you are
and frustration in how, you know, how you're treated.
And they took that same model, that sort of pride and frustration, and they made micro-communities for a whole range of segments of American society.
So they did one for supporters of the Confederacy, the Southern Heritage folks, as well.
And what they did was they entrenched people in those identities.
And then very subtly, you see them begin to point out how other groups are different.
So it's never done explicitly, you know, a post just saying we should hate X or we should not like X. It's arguing about who America is for; that's one of the dominant themes: here is why it should be for us, and here's how those other people are taking resources away from us.
And once they have those communities established, then you see them begin to nudge to use that influence, that goodwill that they've built up by establishing this trust with their audience.
That's when you see them begin to nudge into things like voter suppression or vote for Trump, right? So we saw them use the goodwill they'd built up with the
right-leaning communities in the Republican primary to denigrate Ted Cruz. So they had a page that
was for Tea Partiers. And, you know, Ted Cruz was the darling of the Tea Party, but they're still in there. And you see them very subtly saying, hey, you know, I kind of feel like Ted was the old and Trump is the new. And, I mean, you know, I kind of feel like Ted's actually a RINO now. And maybe
we should be going for this new guy. So there is this shift, right, where, you know, once you have
established that trust, once you've presented yourself as a member of that group, that's where
you do see the nudge come into play. So just saying "social division" is an oversimplification; social division is, you know, kind of an overarching theme. It's very effective to have this chaos.
But there is some element of persuasion. And more importantly, there's an element of this
agents of influence model where you also see once they have that trust, once they're in those
communities, they galvanize people to do things in the real world as well, whether that's turning out for a protest or writing for one of their fake publications. So that same model of
infiltration has also been ported to the internet. Can you talk a little bit about how what you
just described and the goals of the Russian intelligence agencies are different than how China
is supposedly supporting Joe Biden, for instance?
We had that report out of the intelligence community
that the Russians would like Donald Trump to win
and that the China state government
would like Joe Biden to win.
But everyone on both sides was quick to point out,
they're using very different tactics.
They are using very different tactics.
So, you know, we haven't seen the same information that the IC has
on that claim about China supporting Biden.
But what we do see with Chinese influence operations is there's a lot of similarity between what China puts out and its extremely overt state media, right? So what you see from CGTN, you know, the narratives about telling China's story, telling China's story well, the ways in which presenting a particular picture of China is so core to the way that they work their influence operations. You really don't see the extremely outward-focused, long-game, manipulation-type tactics like you see from Russia. Instead, what you see
is social media, the influencers, and then the sort of secret bots, which are mostly garbage, to be
perfectly honest. Their accounts are just not very sophisticated. They're oftentimes created,
you know, a couple months before they need to be used or they're purchased or, you know,
taken over from some either black market entity or, you know, compromised accounts. But they,
they weren't even clearing out the history, actually. So you can see that these were accounts
that had been, you know, taken over; a UK pop fan group account gets compromised and begins to talk about
Hong Kong, right? So it's just not as well executed. I mean, so much so that actually, when we saw the first
proper attribution to Beijing back in, I think it was August of 2019, a lot of us who'd been
waiting to see, you know, when this was happening, everybody was expecting it to happen,
really thought that maybe it was almost like a deflection, like here's these crap accounts
while the real ones are off doing something else, you know? Yeah. Because they were so bad.
But I think that what you see from China is much more focused on presenting a positive picture of China.
The conversations that you see the Chinese bots and sock puppets, the troll accounts, having are really related to areas where China feels it needs to create a particular perception of China's behavior.
So the Hong Kong protests with fake Hong Kongers
talking about how great China is.
But ultimately it comes back down to telling China's story.
You know, it's funny you mentioned that Russian pattern, where you start something that seems facially reasonable and then you tweak and tweak and tweak. I actually, I have to confess, I fell for a Russian troll account shortly after 2016, fell for it in the sense that I retweeted it. And it was called Tennessee GOP.
Ah, TEN_GOP.
Yeah.
Oh, so you know it.
Oh, yeah.
And, yeah.
And it said something kind of anodyne and reasonable, and I just thoughtlessly retweeted it.
I mean, it never would have occurred to me that TEN_GOP would have been a Russian account.
And immediately, I got some DMs from folks that follow me who are, you know, highly clued in.
And they said, that's Russian.
What?
And what was interesting to me is it taught me a little bit about their sophistication, because Tennessee is not an internationally famous place. It's not a place that comes to mind if you're thinking, I'm going to penetrate the United States of America, how do I get to those Tennessee Republicans? But it taught me that there is a lot of intention here. There's a lot of thought here. And it was, quite frankly, sobering.
I think you're absolutely right. There is a remarkable understanding of American society,
American culture, very, very specific, you know, subcultures. So I led one of the research teams for the Senate Intelligence Committee investigation into the data that the social platforms turned over. And so I had, you know, sort of 400 gigs worth of this stuff. And I actually went through the memes, you know, one at a time, a couple hundred thousand of
them. But I felt, you know, we can do a lot of technical analysis. We can extract, you know,
what the images are and so on and so forth. But I wanted to look at them. And so I would just sit there
and I just arrow through. And some of them were very funny, which I think, you know, people see
the ones that have been made public. There's also a lot that have
not been made public because they use pictures of real people. But some of them were very funny.
Some of them were very kind of in jokes. They're speaking to the community. They had an
understanding. You know, among the conservative pages, for example, there was one called Being Patriotic. And that had a lot of Ronald Reagan, kind of, you know, lots of flags, lots of waving grain and, you know, the beauty of America and so on and so forth. Very much trying to appeal to kind of an older conservative audience. And then they had an Instagram page called
Angry Eagle. And that account was basically, a lot of times, they were grabbing Turning Point USA memes and slapping their own logo on top of them. And that's where you saw the, like, you know,
the more kind of like edge lord type stuff, right? The more, you know, words I'm probably not allowed
to say. But it just really kind of got down into the, you know, the vernacular. And it was, it was really
something to see, like, their understanding, the sophistication. And, you know, not all conservatives are the same; not all conservatives of a particular age are going to be receptive to, you know, want to see the same content. Within the Black community, they had about 33, I think, different Facebook pages. And they were segmenting according to, you know, here's a page for Black women talking about beauty, and then all of a sudden voter suppression. Here's a page for families with incarcerated family members. Here's a Black liberation theology versus Black Baptist page, right? There were just so many different segments of American society that were reflected in each of these different pages and accounts. I mean, I even felt like, me as an American who was, you know, born here, I am extremely online, you know, I follow tons and tons of different types of people, and there were things where I would have to go Google to try to figure out, you know, okay, here's this claim that's being made. You know, I think "Mozart was Black" was one of the memes that they were sort of pushing out, and I was like, well, that's interesting, now, I've never heard that. Is this true? I'm going to have to go Google for that. This seems to be a thing that speaks to the community that they're targeting it at; there's a whole lot of likes on this thing. Is this a prevailing belief among the particular targeted community? And I just felt like I was seeing all of these facets of American society that normally, I think, unless you're part of that community, you're not seeing. And it was remarkable to me that they had actually zeroed in to that extent, and a little bit disconcerting. Let's take a quick break to hear from our sponsor today,
Gabby Insurance. When you've had the same car insurance or homeowners insurance for years,
you kind of get trapped into paying your premiums and not thinking about it. That can make it
really easy to overpay and not even realize it. So stop overpaying for car and homeowners insurance
and see about getting a lower rate for the exact same coverage you already have thanks to Gabby.
Gabby takes the pain out of shopping for insurance by giving you an apples-to-apples comparison
of your current coverage with 40 of the top insurance providers like Progressive, Nationwide,
and Travelers, just link your current insurance account and in minutes you'll be able to see
quotes of the exact same coverage you currently have. Gabby customers save an average of
$825 per year.
And if they can't find you savings, they'll let you know, so you can relax knowing you have the best rate out there.
And they'll never sell your info.
So no annoying spam or robocalls.
It's totally free to check your rate and there's no obligation.
Take a few minutes, right now, and stop overpaying on your car and homeowners insurance.
Go to gabby.com slash dispatch.
That's gabby.com slash dispatch.
Gabby.com slash dispatch.
You wrote this fantastic piece in March of this year
called The Conspiracies are Coming from Inside the House for the Atlantic.
And, you know, the thesis was that post-2016, we were so concerned about Russian interference.
And fast forward to 2020, and a lot of these conspiracy theories and division
are coming from real people, real influencers, occasionally with blue check marks, right?
Like, this is not a foreign disinformation campaign.
Is there a sense among the Russians that their work here is done,
that we're now just doing it to ourselves?
Or is there a real difference you're seeing
between the homegrown conspiracy theorists
and foreign disinformation campaigns?
Yeah, I think that's a great question.
I don't think that they think their work is done.
No, I mean, we're still seeing accounts. There was this Peace Data operation, and then a conservative counterpart, where they were still trying to make fake content and share it, and hire real journalists to byline their new fake properties. But a lot of that's very small, right? I think they did take a pretty significant hit when a lot of their old infrastructure and accounts were taken down.
But per your point, a lot of what we're seeing is domestic.
And so there are these little Russian operations, but they're not having an impact.
They're just, you know, very, very tiny.
What is having an impact now, though, and what is very, very, I think, extremely frightening, actually,
is that a lot of the narratives that we saw Russian trolls pushing in 2016, the election is compromised,
the voting machines are rigged, the election is going to be stolen from Donald Trump,
the, you know, these kind of wild claims, the Oath Keepers need to show up to the polling places, will you join them and defend the vote, right?
These were all narratives that they were pushing the two weeks before Election Day in 2016.
It was very small then.
That wasn't a thing that was, you know, that many Americans felt, right?
We all still went into the election in 2016, as, you know, vitriolic as the campaign might have been.
Nobody was out there questioning the legitimacy of the process itself. And now, now that is everywhere. The delegitimization is coming from prominent blue check influencers, right?
We have stories of, you know, people going on Tucker Carlson talking about how there's a color
revolution happening here.
We have prominent blue check influencers, you know, tweeting dire warnings about some random
ballot in a garbage somewhere before figuring out what the actual story behind that really
is.
It's not limited to conservatives either.
You know, the article in the Atlantic was me writing about what happened during the night of the Iowa caucus, right, the Twitter conversation immediately devolving
into somebody somewhere, you know, rigged the app to, you know, to steal Bernie's victory
on caucus night, right? You know, through six degrees of separation, Pete Buttigieg's team
was connected to Robbie Mook, who was, you know, theoretically checking the cybersecurity, you know,
kind of like doing a security check on the app,
ergo the Hillary Cabal was stealing it from Bernie, right?
So there are these wild, outrageous accusations
that are framed, you know, by the hyperpartisans; they're just framed as statements of fact.
But then you see people who should know better,
like journalists, who come in and retweet that stuff
with a kind of some people are saying or, you know,
just asking questions, is it possible that, you know?
And so then these folks, who have remarkable followings, trusted followings, and who've been kind of given the seal of approval of, you know, authoritative figures by the blue check, go on and perpetuate it themselves.
So in that particular case, the reach of those claims, even if there are some Russian trolls
in there hitting, you know, hitting a retweet button and, you know, jumping for joy at the chaos,
ultimately, it's our own influencers that have now become an integral part of pushing this
stuff out because we're in an environment of such low trust, and so many people feeling
that they have to weigh in on these breaking, emerging kind of conspiratorial accusations as
they happen, as opposed to stepping back from the keyboard and waiting for the facts to come
out. Now, you know, going back to your earlier answer, it was really fascinating to me. Again,
I just keep being fascinated by the extent to which there seems to be this granular knowledge
on the part of, we're bouncing back and forth between domestic and foreign here on the part of
these Russian influence operations. Now, it strikes me as extremely difficult for an intelligence operation to gain that level of granular knowledge without domestic assistance.
I mean, you were saying you're extremely online and there are things that you don't know.
Do we have evidence that this is really sort of, is this level of granular knowledge just gained
by external study of the United States of America? Is there evidence? Is there any indication
that there is domestic assistance? It's really fascinating to see that level of knowledge being applied to an attempt to influence our own political system.
Yeah, and I think that perception in some ways led to a lot of the research that I was doing
being interpreted by folks, kind of tying it into the collusion narrative, right?
The collusion stories.
And that was frustrating for me, honestly, because I felt like I was telling a story of interference
rooted in, you know, 400 gigs of data.
Right, right. And it was somehow, like, you know, mishmashed into this, like, Russiagate hoax, you know. Like, no, this really happened. Whatever is going on with the Mueller investigation, whatever is going on over there with how much the campaign knew, the undeniable truth is that this did happen, and here's what it looked like, and here's what we can learn from it. And I felt like I was, you know, fighting this, like, geeky, almost, you know, I wasn't an academic at the time, but, like, kind of geeky researcher point
of view, like, you know, these two things can be true. There can be absolutely no collusion
and this still happened. But one of the things that we were trying to understand was we looked
very closely at the ad targeting, right? Because at the time the Cambridge Analytica story was
also unfolding. And we, those of us who, you know, we had visibility into the targeting data
and what they were using. And they actually weren't using very sophisticated targeting. And that
was really remarkable to me. So there was some geographical stuff: you know, if there was an incident
of officer-involved violence and there were protests related to that, you would see them
actually add those cities onto their targeting list.
So it started with, you know, Ferguson and places where there were sort of early atrocities.
And then gradually, as there were new cases that would emerge, they would just add that city
onto the targeting data so that whenever there was a new incident, they would promote the story
to all of the old communities as well. So there was some sophistication in terms of that tactic.
But they weren't using what are called lookalike audiences. They weren't using anything that
indicated that they had access to, you know, past voting records or campaign type targeting
materials. So one thing that we were seeing, though, and that I was kind of curious about is
On Facebook, this was before some of the reforms were made to how Facebook's ad targeting tool worked, you could really get in there and find granular, interest-based categories. And they had, gosh, I'm forgetting the exact words, but it was, you know, I think they called it a behavior, almost. It was something like "interest in African American content." It was a little bit of a weird wording. And so you could actually target people who were
interested in Malcolm X, people who were interested in Martin Luther King, like they had these
historical figures as interests, and so you could target particular types of content to people
who had indicated an interest in certain specific historical figures. That kind of thing I think
you can get through research. The other thing they did is they did send a team here to do a kind
of, you know, tour, a little tour of the U.S. It was a road trip, I think, was how it was reported
out by Russian press, where they actually kind of went around, spent a bunch of time in Texas,
you know, excuse me, traveled the country and tried to, you know, take in America and all of
its, you know, interesting cultural quirks. So they did actually send people here. The other thing
is people put out a lot of information about themselves on the internet, both individually, but also
in terms of communities. And I think if you were to go and join a Facebook group in which people
are already self-declaring who they are, you know, what interests they have, how they
identify. Within a couple weeks of kind of consuming that content, reading it, getting a sense
of what's happening, and getting a sense, also, at that point, of what's getting
a lot of clicks, what's getting a lot of shares, you do see that informing not only Russian
intelligence, but we've seen, you know, spammers in Macedonia and
elsewhere who also managed to pick out what narratives seemed to work based on
participation in groups where they just kind of learn as they go.
Renee, what will we say we missed on November 4th?
What will we look back and say that we didn't concentrate enough on, we didn't know enough
about, we didn't focus on, and we'll regret later?
You want the unknown unknowns?
Yeah, yeah.
Um, I mean, I feel like if I knew the answer, just by virtue of answering that, you know, they're no longer unknown unknowns. But, um, you know, I have absolutely no doubt that there will be some sort of surprise. I am consistently amazed at the ingenuity of people who want to manipulate systems, right? Not so much in the fact that they exist, like, that I'm pretty jaded on.
But in the incredibly unique ways in which new products and features are misused.
We spend a lot of time on what's called adversarial thinking, right?
Exactly this.
You know, there are a lot of different groups doing these kinds of red teaming, scenario planning,
you know, war game type exercises related to the election.
And that really runs the gamut, a lot of just political folks doing that.
We think about it in the context of how the technical front is going to be manipulated or misused
on those nights? And so I see this as this is a communication infrastructure and it operates
as a system. Every single one of these platforms has some sort of different affordance, right?
If you want to post a video, you go to YouTube. If you want to do something live and attract
a ton of attention for demographic X, maybe you go to Facebook. Maybe you go
to TikTok. There's different ways to use the ecosystem as a whole to reach the people you want to
reach. We are constantly seeing new and exciting ways in which things like live video are
misused and manipulated. During the Floyd protests, there was a Pakistani spam ring. A Wall Street
Journal reporter actually sent us a tip and asked us to look into it, and we spent about a day
actually doing the investigation in which this group of enterprising economically motivated folks
overseas decided to use gaming technology, live streaming technology to pretend to be streaming
live from the Floyd protests. And they just ran this sort of like 24-7 stream in which they were
pushing stolen video from activists who had actually been there. But pushing it out as like
things that were happening right now. And they expanded from originally taking actual protest
footage that had at least happened, you know, that day or within the week to going and grabbing
incidents of officers arresting people in, you know, in violent or physically active ways, right? And so
I remember this. Yeah. Yeah. So all of a sudden you had a domestic violence arrest,
you know, in which the guy is on the front lawn with, you know, somebody kind of
kneeling on his back to cuff him, and that arrest all of a sudden becomes recast as this is a thing
that is happening right now to kind of feed into the outrage. But really what they wanted was just
the clicks. They just wanted people to like and follow their page, right? Because if you can amass
that audience, you can theoretically monetize it later, right? Or you can sell your
page to somebody who does want to monetize it later; even though that's a violation of terms of service,
it does happen. So these are the sorts of things where,
you know, using gaming tech to fake out Facebook's live video streaming tool to dominate
the hashtag Justice for Floyd, you know, the sort of thing that wasn't really on the radar.
You know, we didn't realize the extent to which live made that
possible, meaning, you know, we thought at least lives would be live. So this was a
new and exciting way to use technology for evil.
And a quick break to hear from our sponsor, the Bradley Foundation.
Americans are navigating through several unanticipated crises this year.
We the People is a new Bradley speaker series that offers insights and ideas
on the current challenges we face from some of the remarkable organizations
the Bradley Foundation supports.
Visit BradleyFDN.org/Liberty to watch their most recent video episode
on the Electoral College featuring Trent England. England is the founder and executive director
of Save Our States, a group dedicated to educating Americans about the Electoral College and defending
it from the National Popular Vote campaign. In this episode, he explains the history of the
Electoral College, how it works, and what happens if the rules change. The discussion is an insightful
analysis of the many merits of the way the president is elected. That's Bradley, with an
L-E-Y at the end, FDN.org/Liberty to watch the video.
New episodes will debut weekly,
so come back often and subscribe to their YouTube channel
to be notified whenever a new one is posted.
Well, Renee, I just want to thank you so much for joining us for this pod.
This has been really informative and so important, you know,
two and a half weeks out from Election Day,
but I do have a really important final question for you.
Okay.
Earlier this month, you stated that your six-year-old was listening to a kid's book on the
Bermuda Triangle and watching unexplained mystery series for first graders, including one on
Bigfoot. I am curious how the child of a disinformation expert approaches such things as
unexplained mystery shows and which, if any, have been most persuasive? He really is. We go camping
a lot. It's like our thing to do as a family. We just kind of get in the van and go. And he does
occasionally, as we're walking around the woods, ask me, like, are you sure there's no Bigfoot?
That's definitely been the one that, for some reason, he's latched onto and thinks is plausible.
But, you know, I was really proud. We're homeschooling, of course. We've got three
kids, and the oldest is the six-year-old. So with the other two, we're more just trying to keep the chaos
to a manageable level. But he has to actually learn something. And so I was sitting next to him
and he started laughing because, you know, the Bermuda Triangle person was going on in this book,
this audiobook, you know: some people say that spirits are involved, some people say it's a portal
to another universe. And he started laughing and he said, maybe the, maybe the boat captains just can't
sail very well. And I thought, like, yes.
Proud mother.
Very proud mother. So hope for at least a couple generations, you know, hence.
He'll be able to vote in, you know, 12 years.
David, did your kids have any special disinformation stories they enjoyed?
I think my oldest two kids eventually at some point became convinced that some of their Lego
minifigures were alive and had personalities.
But no, no, the thing that I was convinced about as a kid was UFOs.
And you're not now?
Well, I suddenly have done almost a 180, maybe a 179.
But yeah, I would lay awake at night reading these comic books about real-life UFO sightings.
And it would literally keep me up.
I was so terrified of UFOs.
But, yeah, I'm almost back where we started.
I'm really rooting for the Loch Ness monster.
I really want to believe that one dinosaur made it.
That would be so amazing.
That'd be unbelievable.
But hope is fading, Sarah.
Hope is fading.
Renee, what's your pet one?
I was always interested in the past lives stuff.
So not so much, you know, a
physical, external thing, but these stories of, you know, people in India who seem to remember
having children in another village and then they go there and the children really exist and stuff
like that. And so, yeah, so that was sort of my rabbit hole. I like it. Well, thank you listeners
for joining us. We hope that this has been informative on disinformation that can affect our country,
our campaigns, our elections, but that you still have some joy in your
own personal little disinformation loves like mine for Nessie. We will see you again next week.