CyberWire Daily - Election Propaganda: Part 2: Modern propaganda efforts.
Episode Date: October 9, 2024

In preparation for the US 2024 Presidential Election, Rick Howard, N2K CyberWire's Chief Analyst and Senior Fellow, discusses recent international propaganda efforts in the form of nation-state interference and influence operations, as well as domestic campaigns designed to split the target country into opposing camps. Guests include Nina Jankowicz, Co-Founder and CEO of the American Sunlight Project, and Scott Small, Director of Cyber Threat Intelligence at Tidal Cyber.

Make sure to check out Election Propaganda Part 1: How Does Election Propaganda Work? In that episode, Rick Howard discusses personal defensive measures that every citizen can take, regardless of political philosophy, to resist the influence of propaganda. That foundational episode is essential for understanding how to navigate the complex landscape of election messaging.

References:
Scott Small, 2024. Election Cyber Interference Threats & Defenses: A Data-Driven Study [White Paper]. Tidal Cyber.
Renee DiResta, 2024. Invisible Rulers: The People Who Turn Lies into Reality [Book]. Goodreads.
Nina Jankowicz, 2020. How to Lose the Information War: Russia, Fake News and the Future of Conflict [Book]. Goodreads.

Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
You're listening to the Cyber Wire Network, powered by N2K.
Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions.
This coffee is so good. How do they make it so rich and tasty?
Those paintings we saw today weren't prints. They were the actual paintings.
I have never seen tomatoes like this.
How are they so red?
With flight deals starting at just $589,
it's time for you to see what Europe has to offer.
Don't worry.
You can handle it.
Visit airtransat.com for details.
Conditions apply.
AirTransat.
Travel moves us.
Hey, everybody.
Dave here.
Have you ever wondered where your personal information is lurking online?
Like many of you, I was concerned about my data being sold by data brokers.
So I decided to try Delete.me.
I have to say, Delete.me is a game changer.
Within days of signing up, they started removing my personal information from hundreds of data brokers.
I finally have peace of mind knowing my data privacy is protected.
DeleteMe's team does all the work for you with detailed reports so you know exactly what's been done.
Take control of your data and keep your private life private by signing up for DeleteMe.
Now, at a special discount for our listeners, today get 20% off your DeleteMe plan when you go to joindeleteme.com slash N2K and use promo code N2K at checkout. The only way to get 20% off is to go to joindeleteme.com slash N2K and enter code N2K at checkout. That's joindeleteme.com slash N2K, code N2K.

This is part two of our three-part election propaganda series. Last week, we covered how propaganda spreads on social media platforms via a system of systems
that is made up of five distinct propaganda agents that I call the Pentad.
They are the platform, the algorithm, the influencers, the crowd, and the media, where
each element feeds the others in a massive self-sustaining do-loop designed to keep the
machine humming and the crowd engaged so that
the platform owners can bring in massive amounts of revenue. And just so you know the ballpark of
what we're talking about here, in 2024, Facebook generated over $80 billion in revenue, Instagram
almost $50 billion, and YouTube close to $32 billion. We put this series together because we have a
theory that if average American citizens, not culture warriors looking for a fight, but just
normal people trying to stay informed, trying to separate the signal from the noise in order to
make a reasonable voting decision in November, if they understood the forces at play and the
motivations behind each Pentad element,
they could inoculate themselves from the viral machine and resist hitting the rage button and
unwittingly feeding the machine with all their likes and shares when whatever viral things happen
in the course of the day. To accomplish that, we provided you a toolkit, a thought framework,
if you will, that gave you
things to think about regarding each pentad element, designed to keep you from getting
sucked into the rage machine and free you from the hermetically sealed information bubbles,
those bespoke realities that have their own set of facts, regardless of clear evidence that
contradicts the accepted bespoke view. Our goal wasn't to stop you from engaging
with the rage machine, if that's what you want to do. If you're a culture warrior, by all means,
feel free. But if you're just an average voter, our goal is to explain how the system worked
and describe who benefits from the system so that you can make a reasonable decision yourself
about future engagement. So that was part one. And if I do say so myself, I think it's really good.
If you haven't listened to it yet, stop right now and go do that.
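Since the Pentad is, at bottom, a feedback loop, a toy simulation can make the compounding dynamic concrete before we get to the real-world examples. This is a purely illustrative sketch: the five stages correspond to the Pentad elements, but the amplification rates, share fractions, and revenue math are invented numbers, not measurements of any real platform.

```python
# Toy model of the Pentad feedback loop: platform -> algorithm ->
# influencers -> crowd -> media -> back to the platform.
# Every constant below is invented for illustration only.

def run_pentad_loop(initial_reach: float, rage_factor: float, rounds: int) -> None:
    """Simulate how engagement compounds when each agent amplifies the last."""
    reach = initial_reach
    revenue = 0.0
    for day in range(1, rounds + 1):
        amplified = reach * (1.0 + rage_factor)  # algorithm boosts engaging posts
        shares = amplified * 0.10                # crowd likes/shares a fraction
        media_pickup = shares * 0.05             # media reports on what's viral
        reach = amplified + media_pickup         # the loop feeds itself
        revenue += reach * 0.001                 # platform monetizes attention
        print(f"day {day}: reach={reach:,.0f}, cumulative revenue=${revenue:,.2f}")

# An enraging post (high rage_factor) compounds much faster than a calm one.
run_pentad_loop(initial_reach=10_000, rage_factor=0.50, rounds=5)
run_pentad_loop(initial_reach=10_000, rage_factor=0.05, rounds=5)
```

The toy numbers make the series' point: hitting the rage button raises the loop's growth rate, and because the loop compounds, small differences in outrage become large differences in reach, and therefore revenue.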
For part two, though, we want to show some real propaganda examples from the past 10 years or so
to demonstrate the power that propaganda has in our individual lives,
that this isn't just some distant threat.
Election propaganda is here today.
When propaganda experts discuss this topic,
they usually roll out the success that both Hitler and Stalin had with it during World War II.
But those examples are old and stale and have been talked about in depth elsewhere.
We wanted to provide some recent examples,
like how the then Brazilian president Jair Bolsonaro used propaganda to question the legitimacy of the Brazilian electoral system, calling previous elections rigged, before he lost the 2022 election in a close runoff. In the aftermath,
Bolsonaro's supporters stormed multiple government buildings in Brasilia,
resulting in 40 protesters and 44 military police officers injured in the riot,
approximately 1,500 people arrested in connection
with the attack, and approximately $3 million of physical damage. Or how the current Venezuela
president, Nicolas Maduro, used extensive propaganda to claim the highly disputed election
win in July 2024. In part one of this series, we used Renee DiResta's book, Invisible Rulers: The People Who Turn Lies Into Reality, to show how the social media propaganda machine works.
To discuss how it has been used in the past, we turn to another expert, Nina Jankowicz,
author of How to Lose the Information War, Russia, Fake News, and the Future of Conflict.
Nina has quite the resume.
She studied abroad as a Fulbright Fellow in Ukraine,
researching the impact of disinformation and online narratives in Eastern Europe.
She was the George F. Kennan Fellow and Disinformation Fellow at the Woodrow Wilson International Center for Scholars, a prominent think tank here in the U.S.
established by the U.S. Congress in 1968. She holds a master's degree from Georgetown University's School of Foreign Service,
where she focused on Russian and Eastern European studies,
and she speaks Russian, Ukrainian, and Polish.
I started out by asking her why she wanted to write this book.
Yeah, so when I initially thought about the book,
I was living in Ukraine from 2016 to 2017,
working as a Fulbright fellow advising the Ukrainian government on strategic communications.
And obviously, we had just seen this big watershed moment in the election of Trump and the revelations
about Russian election interference and, I would say, information operations more broadly,
not just attempts to access election infrastructure,
which did not succeed, but also these broader info ops. And from my seat in Kyiv, it really
seemed like the U.S. was sleeping beauty waking up from a slumber and having no idea what was
going on where our allies in Central and Eastern Europe had been dealing with this threat, of
course, during the Soviet era, but also in the modern era. And so I just really wanted to make sure that if there was
anybody listening in the U.S. government, that we would be able to address some of these threats and
not repeat some of the same mistakes and also not start from scratch, not reinvent the wheel.
But as I was researching the book, the thing that surprised me that I didn't really expect, it became so clear that the domestic information environment and the vulnerabilities in each country play such a big role in the success of foreign information operations.
And that, I think, is really the key to unlocking, protecting our information environment, which we're still struggling with to this day.
protecting our information environment, which we're still struggling with to this day.
I think when most people think about this kind of stuff, they think about nation-state influence operations, and you cover Russia in the book, okay? But there's also all these
non-state actors, you know, our own people, you know, at each other's throats, that
can do this kind of thing too. But your book specifically covers Russia and their activities
against five Eastern European countries, right? They are Estonia, Georgia, Poland, Ukraine,
and the Czech Republic. Did I get all that right? Yes, yes, you did. Yeah, they're an interesting
group and kind of in some ways a little bit dissimilar. You've got some countries that are
newer to the transatlantic orbit, some like Poland and the Czech Republic who were in the
first accession wave, and then some countries like Georgia and Ukraine, which aren't officially part
of NATO or EU structures yet. And they've all got kind of different approaches to the problem,
but all of them recognized it from an early stage, which I think separates them from countries in Western Europe and then especially from the United States as well.
In the first part of this series, we talked about the generic influencer.
They came in two varieties.
The first kind is relatively benign.
Call them the brand influencers like Taylor Swift that we mentioned in that episode. She's connecting with her fans
to enhance her career. The second kind of influencer, though, is more evil. These people
try to fan the flames of both sides of the political spectrum for personal gain. Let me say
that again. These people are evil. But a subset of these evil influencers
consists of nation-state information operations operators,
influencers for sure, but propagandists with a capital P.
They don't do it for individual personal gain.
These groups do it to pursue national objectives,
in the same vein that Joseph Goebbels did it
for Hitler in World War II. Goebbels was
Reich Minister of Propaganda and Public Enlightenment. He controlled news media, arts,
and information, orchestrated book burnings of un-German literature, and created a cult of
personality around Hitler. But in 2016, Russia ran impactful propaganda campaigns designed to
influence the vote in favor of the then-candidate
Trump over his opponent, Secretary of State Hillary Clinton. Since then, Russia has continued
to run election propaganda campaigns in the 2020, 2022, and 2024 U.S. elections. And lately, China,
North Korea, and Iran have been active. They're not as mature as Russia in tradecraft,
and their purpose does not directly align with Russia's efforts,
but they're in the game.
Nina's book is about how Russia does it.
Here's Nina.
So Russia is not necessarily after some sort of political outcome.
They might prefer one candidate over another in an election,
and sometimes, you know, with issues like aid to Ukraine, for example, they might try to spread disinformation in order to change the
public's viewpoint about that. But in general, polarization and pitting Americans or, you know,
Estonians or Europeans in general against one another really benefits Russia because Putin can
say to Russians who are protesting, you know, you think
democracy is so great? Look at what's happening over there. It also distracts us. So when Russia
is, you know, on its latest adventure in Ukraine or Venezuela or the Central African Republic,
we're going to be subsumed with problems at home. And then the third important thing that these
kind of exploitation of societal fissures
does for Russia is it gets Russia a seat at the global negotiating table. I mean,
U.S.-Russian relations are pretty bad now, and yet President Biden had a summit with Putin.
Trump met with him multiple times. Even before the full-scale invasion of Ukraine in 2022,
Macron was floating, inviting Russia back to the G7.
I mean, it was shocking for me to hear,
given what I know about Russia.
But it was, you know, despite all of this,
it made Russia into that global player that it hadn't been since the fall of the Soviet Union.
So there's a little bit of...
Because we were distracted, worried about domestic things, not really paying attention to what Russia was doing abroad. Well, yeah. And because, you
know, in order to make them stop or encourage them to stop these operations, you had to meet with
them. And when you sit at the same table with Putin, even if that table is very long, as we saw
during COVID, right? Like that's, I mean, that elevates him. That says to Russians,
oh, Putin is a world leader again now.
And, you know, we have this status that is to be coveted
and we're going to continue engaging
in these operations
so that that asymmetrical control
can keep happening.
As we learned from various
U.S. intelligence agencies,
Congress and social media companies
after the 2016 election,
one play out of the Russian
Information Operations Playbook, used by their Internet Research Agency, was to pose as American
activists from across the political spectrum using various social media platforms like Facebook,
Twitter, and Instagram. They organized rallies in support of candidate Trump while simultaneously
organizing counter-protests from the Secretary Clinton side. When journalists asked the activists from either side if it bothered
them that the Russians got them so mad that they came out to protest, some said that the Russian
involvement didn't ultimately matter to them. Their perspective was that regardless of who
organized the events, the causes that they were advocating for were real and important.
In their view, the authenticity of the issues,
whether racial justice, immigration reform,
or other social concerns,
was not diminished by the origin
of the rally's organization.
Talk about a bespoke reality.
The Russians learned that using existing social movements rather than fabricating issues out of thin air was more effective.
A notable instance occurred in Houston, Texas, where Russian-linked accounts promoted both a pro-Muslim rally and an anti-Muslim counter-protest.
By promoting these opposing events, Russian operatives sought to sow chaos, confusion, and conflict within the American population.
And so far in the 2024 U.S. presidential election, the Department of Justice has seized over 30 websites linked to a Russian disinformation campaign known as Doppelganger. The campaign used social media platforms to distribute content, sometimes using fake pages resembling prominent news outlets to add credibility to their messaging.
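One reason Doppelganger worked is that lookalike domains are cheap to register and easy to miss at a glance. As a purely illustrative aside, here is one way a defender or newsroom might flag domains that sit suspiciously close to a legitimate outlet's name. The domain lists are made up for the example, and real detection pipelines lean on far richer signals than string similarity.

```python
# Minimal lookalike-domain check using only the Python standard library.
# The "legit" and "observed" domain lists are invented examples.
from difflib import SequenceMatcher

LEGIT_DOMAINS = ["washingtonpost.com", "foxnews.com", "spiegel.de"]

def looks_like(observed: str, legit: str, threshold: float = 0.85) -> bool:
    """Flag domains that are nearly, but not exactly, a known outlet."""
    ratio = SequenceMatcher(None, observed, legit).ratio()
    return observed != legit and ratio >= threshold

observed_domains = ["washingtonpost.pm", "foxnews.cx", "example.org"]
for domain in observed_domains:
    for legit in LEGIT_DOMAINS:
        if looks_like(domain, legit):
            print(f"possible doppelganger: {domain} imitates {legit}")
```

In practice, string similarity is only a first filter; registrar records, hosting overlap, and page-content fingerprints do the heavier lifting.

Scott Small is the Director of Cyber Threat Intelligence at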
Tidal Cyber, and full disclosure, I'm on Tidal Cyber's advisory board. But when we began
researching this propaganda series here at N2K, we immediately found Scott's 2024 white paper called Election Cyber Interference, Threats and Defenses, a Data-Driven Study.
So we invited him on the show.
I started by asking him about the thesis of the paper.
So what we're trying to accomplish with the white paper is the entire globe is facing
a huge threat around election interference.
What we're seeing is a very, very large portion of the total global population is going into democratic elections this year.
Many of them have occurred now at this point in the conversation.
There's a few left, a few big ones left in the year. But we're trying to take a look at what is this threat of what we call interference, these kind of technical efforts to tamper with and interfere with elections,
as opposed to the very understandable huge focus around disinformation and influence operations,
which many of these operations go kind of hand in hand. So I want to double tap on the difference
between what an influence operation and an interference operation is. That's basically the distinction you make in
the paper. Can you explain more about the distinction between those two things? Technically,
the definition, one is kind of a subset of the other. So you could look at it as influence being
any attempts by, usually we're talking about a foreign power, to kind of shift the results of
an election, convincing a population that's going through an election that some information may be
inaccurate to sway those results. Now, there's this subset of how you can go about performing
this influence, which is to do all sorts of range of technical means to interfere with the election.
So a lot of what we more commonly see is attempts to steal information that might be then used in an influence operation.
So that's technically hacking into an account or a system that has this information, exfiltrating it out, and then using that as part of an influence operation.
So most often, these things are going hand in hand as opposed to one being entirely exclusive
of the other. So in the paper, you list these big national powers, and that's what we're talking
about here from your white paper. This is not, you know, Joe Bob on Facebook trying to influence
people. These are nation states trying to interfere and influence elections across the world, not just
in the United States. And you list Russia, China, Iran, and North Korea as the four big bads that
engage in this activity. But you have a great chart in the white paper that talks in terms of
volume. You show Russia and China executing the bulk of these interference attacks with Iran, a distant third, and North Korea, a very distant fourth.
I wonder if you could walk us through the mechanics of a specific interference operation.
All right.
And what do you think they were trying to accomplish when they did that?
So, on the chart, a good example here is we wanted to just try to lean on as much data as we could with this study because influence, interference, and elections are all kind of hot-button issues.
And so just trying to look at actual examples of this stuff happening to try to kind of prioritize
which of these threats might be most notable. And the way that we kind of generated those rankings,
like that top four list and the ordering of them, was based on especially the
number of groups that have conducted these types of operations in the past, which ones are aligned
with certain of those countries, and then looking also at which ones have been known to target
countries that had elections this year. So just unpacking a little bit of the methodology that
we used. Now with that, I mentioned we wanted to highlight specific adversarial cyber threat actor groups that have carried out these operations.
We further broke that down into a few different types of the operations. So, social engineering
has been and I would say continues to be one of the top threats when it comes to election-related
interference. Very closely related to that are going to be attacks against
kind of what we call like a cyber identity. So your personal accounts, your email accounts that
are associated if you're part of a campaign, for example, attacks on those. And then kind of the
third most common bucket that we've seen in the past is attacks on election infrastructure.
Back in 2020, we saw cyber threat actors associated with
the Iranian government compromising election informational websites hosted by state governments.
So these are the places that are giving voting information, information about where to find,
you know, polls, where to go vote, and sites that maintained kind of voter registry databases. We saw these actors going out and actually using publicly available tools to scan for vulnerabilities in these websites, the web infrastructure, finding a lot of those vulnerabilities that were available to them, and then just going out exploiting those, and being able to actually successfully exfiltrate voter registration information from those sites. This was reported by the U.S. CISA
organization. We have a lot of good details about how this happened and what was done with it.
The government strongly suspects that those emails containing that voter information were
then used to target and send disinformation and
propaganda emails to those individuals in a handful of states. This is a very clear example
where we did have a lot of detail both about how it happened and the intent, but it was being used
to attempt to influence those voters and try to scare them. Actually, in some cases, there was
some very intimidating kind of language: if you don't vote this way, something bad might happen to you. So that's a clear example of how you can use an interference methodology to eventually get to some sort of influence operation.
You have a really nifty chart in the white paper that kind of shows that progression, so hats off to your digital artists. Nicely done, guys. Thanks.
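A side note for defenders: the scanning behavior Scott describes tends to leave a loud trail in ordinary web server logs. Here is a minimal, illustrative sketch of one way to surface it. It assumes the common Apache/Nginx combined log format, and the 404-burst threshold is an invented number, not a vetted detection rule.

```python
# Flag client IPs generating bursts of 404 responses, a crude signature of
# automated vulnerability scanning against a public website. Illustrative only.
from collections import Counter

def flag_probable_scanners(log_lines, min_404s: int = 20):
    """Count 404 responses per client IP in combined-format access logs."""
    misses = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) > 8 and parts[8] == "404":  # status code field
            misses[parts[0]] += 1                 # client IP field
    return {ip: count for ip, count in misses.items() if count >= min_404s}

# Hypothetical usage:
# with open("/var/log/nginx/access.log") as f:
#     print(flag_probable_scanners(f))
```

Real monitoring would weigh far more than 404 bursts, known scanner user agents, request-rate anomalies, and suspicious query strings among them, but the takeaway stands: this activity is detectable with tooling most state and local IT shops already have.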
Scott's research describes attempts by nation states to compromise election infrastructure
in order to steal information that could be used in an influence operation at some point down the road.
Election infrastructure includes things like the people working for a candidate, the candidate's staff in other words, websites that support the candidates, and government-run official
voter registration databases.
But I want to be very clear here.
They uncovered no evidence, zero, that nation-states have compromised the voting apparatus in an
effort to change votes in favor of one candidate or the other. Nor did they discover any activity where nation-states prevented people
from voting or added ineligible people to the voting rolls. They discovered no voter fraud,
zippo, nada. And this is in alignment with the over 60 lawsuits filed by former President Trump
and his allies alleging voter fraud and other irregularities
in various states. All of these cases were dismissed, withdrawn, or resolved against the
plaintiffs due to a lack of evidence or legal standing. Courts at both the state and federal
levels, including the conservative majority Supreme Court, consistently have found no
substantial evidence of widespread voter fraud
that could have affected the election outcome.
We'll be back with more of our election propaganda special after this. With all that said, Scott, the question that comes to mind is how do
network defenders or InfoSec professionals defend against these interference operations?
From your description, it sounds like this is just traditional intrusion kill chain prevention
strategy.
You look at the actors and their typical campaigns and make sure you have prevention and detection controls deployed in your network stack.
That's one thing that didn't really surprise us because this is kind of a lot of the takeaways that we get.
We do a lot of these kinds of broad-based assessments at Tidal Cyber, try to take a step back, look at kind of the threat
landscape as a whole. We're really big into drilling down into the specific techniques and
TTPs used as part of any of these operations. But then the whole takeaway is, what do you do about
that? How does that relate to defenses, both proactive mitigations and things like detections
and response? All of those things can and do and probably increasingly change from
underneath you. Digital transformation, more and more election-related infrastructure going into
digital assets, potentially into cloud-based environments. You need to be continually
reassessing all of these controls and measures as time progresses because things inevitably can and do change.
And you need to make sure things are as properly deployed on day 100 and day 200 and beyond as they were on day one.
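To make that continuous-reassessment idea concrete, here is an illustrative sketch of a TTP-to-control coverage check in the spirit of the mapping work Scott describes. The technique IDs are real MITRE ATT&CK identifiers, but the adversary profile and the control inventory are invented for the example.

```python
# Compare the ATT&CK techniques an adversary is known to use against the
# detections and mitigations currently deployed. Illustrative sketch only.

# Hypothetical profile: techniques attributed to a tracked group.
adversary_ttps = {
    "T1566": "Phishing",
    "T1078": "Valid Accounts",
    "T1190": "Exploit Public-Facing Application",
    "T1595": "Active Scanning",
}

# Hypothetical inventory: techniques the current control stack covers.
deployed_coverage = {"T1566", "T1190"}

gaps = {tid: name for tid, name in adversary_ttps.items()
        if tid not in deployed_coverage}

print(f"coverage: {len(adversary_ttps) - len(gaps)}/{len(adversary_ttps)} techniques")
for tid, name in sorted(gaps.items()):
    print(f"GAP: {tid} ({name}) has no deployed control")
```

Rerunning a check like this on a schedule, as infrastructure and adversary behavior change, is the day-100-and-day-200 discipline Scott is describing.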
So one of the things I like about Tidal Cyber
and one of its strong points is that you're identifying
the tactics, techniques, and procedures
of various adversary campaigns so that we can
devise prevention and detection controls for our own internal environments. And in your white paper,
you were listing, I don't know, it's over a dozen different adversary groups that are participating
or engaging in this activity. Is that about right? Yeah, yeah, that's exactly right. I was actually surprised myself when we sat down and looked at the data, just the number of groups that have been previously identified with this activity, and it actually just continues to grow. One thing that has been
reported just in the past few weeks is another extremely clear-cut example of what appears to
be an interference attack. This has actually been pretty well attributed back to a named adversary group.
So this is one not highlighted in our report at the time we published it.
It is closely related to one of the other Iran-based actors.
So the one I'm talking about and focused on with this election cycle,
it's most commonly going by APT42.
This is associated with the Revolutionary Guard organization within Iran. They have been
tied to other previous cyber interference attacks, but they have actually been attributed by private
sector cybersecurity organizations. And a news report just came out yesterday morning that we're
expecting to see charges brought against most likely elements of this group for actually what appears to be a
successful compromise of the Trump presidential campaign and exfiltration of information there.
I definitely want to highlight that they are believed to have targeted the Harris campaign
as well. They don't appear to have been successful in exfiltrating data in that case. But I guess
just to highlight that even with all this close looking and we know a lot of the countries who are behind these types of attacks, we continue
to see even new groups. APT42 is not a brand new group, but this is a clear indication of some
election targeted activity associated with them. So the good news there is that an adversary group like APT42,
they're using the same tactics and techniques and procedures
that they use to do their other work for these interference operations, right?
So we already know most of what they do.
I know they invent new things as they go, but we know the bulk of them, right?
That's the good news for the network defender.
That's exactly right.
I think what I would highlight
in that case is looking at the types of organizations that would be targeted in an
attack like this. And are they necessarily prepared? Do they have the resources to defend
against some of these TTPs? I think that's one thing that concerns me. So when we're talking
about campaign staff, for example, I know a lot of organizations, a lot of resources have been put into this, a lot more awareness.
But it is worth noting that these are sophisticated adversaries, and they are very intent on carrying out these attacks.
We look at state and local electoral bodies and the resources that they may or may not have available to defend against some
of these. That is kind of an ongoing concern. So to defeat interference operations, we're
mostly talking about InfoSec programs dealing with election infrastructure and campaigns.
And I guess that's about it, right? Are there any other cybersecurity professionals who need to worry specifically about these kinds of interference operations?
For sure.
I would say definitely not to overhype it because fortunately we haven't seen too many examples of this.
But we do need to be aware of the supply chain itself, some of which is going to come down exactly to the staff that you just mentioned. Doing some of the supply chain auditing, seeing both what software and maybe what hardware is coming in. So when we talk about voting machines, that's obviously a very critical piece of the supply chain that needs a lot of due diligence on it. But I think probably the greatest attack surface is kind of the information sources, local and state governments, the national level as well, and then certainly kind of the campaigns too.
Scott's report and Nina's book turn a spotlight on some of the more recent nation-state influence operations.
If you're looking for more detail, I highly recommend them both.
But I want to turn our attention to domestic influence operations now.
The kinds of polarizing information campaigns that use the social media Pentad to great effect, orchestrated by world-class culture warrior influencers. See part one of the series to understand that group. In that episode, we used the book Invisible Rulers: The People Who Turn Lies Into Reality, written by Renee DiResta, to understand how the Pentad works. As I said in the episode, I only covered the basics.
If you want more details, I highly recommend her book. But in a cruel twist of fate, Renee's become
a victim of the viral machine. The crowd turned into a mob. Until recently, she's been working
at Stanford's Internet Observatory, SIO for short, as the technical research manager.
The SIO is a cross-disciplinary program of research, teaching, and policy engagement for the study of adversarial abuse in current information technologies.
Think the many ways that people attempt to manipulate, harass, or target others online.
Using Stanford students as the workforce,
Rene and the SIO team published reports on viral narratives across various social media platforms,
notified platform administrators
when viral stories seemed to violate
the platform's content moderation policy,
and notified government leaders
when viral narratives published conspiracy theories
that negatively impacted the national effort
to combat the
coronavirus. In response, the angry mob accused the SIO of censorship and violating the U.S.
First Amendment, which in hindsight is ridiculous, since the SIO reported on viral events that had
already happened. They didn't censor anything, nor did they have the power to do so. The group
published research on publicly available information
and pointed out when content violated the social media platform's own policies.
Here's Renee being interviewed at the Commonwealth Club World Affairs Conference in June 2024.
We kept track of whether or not they actioned it, and 65% of the time, they did not. About 20% of the time, it got a label. And I think 10%, maybe 13%, of the time, it came down. And what was very interesting to us about that was that even things that seemed to clearly violate their policies actually stayed up.
And recall from part one of the series, platforms deleting content or
labeling content is not a violation of your First Amendment rights.
It's the platform owners acting on their own content moderation policies.
Enter Congressman Jim Jordan, an Ohio Republican and chairman of the House Judiciary Committee.
As part of the select subcommittee on the weaponization of the federal government,
Jordan led efforts to investigate how the executive branch may have
coerced or colluded with companies and other intermediaries to censor lawful speech. Here's
Renee again at the Commonwealth Club World Affairs Conference being interviewed by Quinton Hardy.
Let's talk about Matt Taibbi, Michael Shellenberger, Congressman Jim Jordan, and Congressman Dan Bishop. So in December of 2022,
this project called the Twitter Files begins to happen. These writers are given access to these
internal Twitter emails. They write these stories, and these stories go viral. And this becomes a
launching pad. Taibbi gets, I think, a million new Twitter followers in like two days after he puts out the first Twitter Files. And one of the others, Michael Shellenberger, launches his newsletter. I think Bari Weiss rebrands her newsletter. Then, after the November 2022 midterms, in January of 2023, Jim Jordan gets
gavel power.
He launches this committee on the weaponization of government.
And so what you see him do is he invites in the Twitter Files researchers, writers, whatever,
to come in and to tell his committee about how conservatives have been viciously censored
by Twitter.
They're just pointing to yet another right-wing blog that said it. But it doesn't matter because now it's in the congressional record. And two days later,
we get an email from Jim Jordan, a letter requesting all of our emails communicating
with the executive branch of the United States. So in a novel way, we wound up getting sued by
Stephen Miller, by America First Legal, which sues us on behalf of the Gateway Pundit, the sort of right-wing kind
of blog that actually has now declared bankruptcy.
Recently departed Gateway Pundit.
Declared bankruptcy because it wrote a bunch of election lies.
So it wrote a bunch of lies about two election workers, Ruby Freeman and Shaye Moss, claiming
that they had done, I'm trying to remember the specifics, either brought in ballots or gotten rid of ballots, but they alleged that they somehow interfered with the vote count.
Donald Trump tweeted this, Rudy Giuliani went after them also, and they had to flee their homes because of death threats and things like this.
So when we did our election research, Gateway Pundit was one of the kind of repeat spreaders, you know, who was remarkably effective
at making things that did not appear to be true go viral. And so he had appeared in some of our
writing. There was also this anti-vaccine activist that we just never heard of. So all of a sudden,
we find out we're sued on Breitbart by somebody we've never heard of. And Stephen Miller's firm
is conducting this lawsuit. And I can't talk about pending litigation, but what winds up happening is that Jim Jordan takes the material that we've turned over under subpoena and gives some of it to Stephen Miller.
He just goes and he gives it to him.
And that is a remarkably unprecedented state of affairs.
That is an astonishing breach of norms and
procedure and everything else.
Yes, and it becomes this self-sustaining
loop of outrage.
And as I was saying,
the new system plays
jujitsu with the old system,
which I think that's a
defensible example
of that happening.
And it also takes us to Stanford because Stanford has now not renewed your contract.
Kind of like getting fired.
I don't know.
Quiet fired.
Yeah.
And they're disbanding the Internet Observatory.
It's all gone now.
They don't know if they're disbanding the Internet Observatory.
That's one of the interesting things about this.
First they did and then they didn't.
To put it as bluntly as
I can, has a major
American institution
with an endowment of $36.4
billion been cowed
by this?
In your personal view.
Yes.
Yeah.
And how do you feel about that?
Disappointed.
Fair enough.
And that's actually led me to another question.
How are the conspiracy theorists treating your removal from Stanford?
Jim Jordan's tweet was like,
we exercised robust oversight over Stanford,
either the Internet Observatory or the university.
And I thought like that should be absolutely horrifying to anybody who actually cares
about the First Amendment, ironically.
That is an American congressman,
the House GOP, more than one,
saying we effectively got an academic research center that did First Amendment protected research shut down.
Victory for free speech.
That's Orwellian.
For you youngsters out there who didn't have to slog through George Orwell's book 1984 in high school English class, let me give you the
Reader's Digest version. It is a dystopian novel set in a totalitarian society ruled by the
omnipresent figure of Big Brother and the party. The party maintains power through rewriting of
history, constant surveillance, and manipulation of language called newspeak in the book. It
embraces the concepts of doublethink,
the ability to simultaneously hold two opposing ideas in one's mind
and believe in them both,
and black-white, to know that black is white
and to forget that one has ever believed the contrary,
all orchestrated by the government's ministry of truth.
When Renee says that Congressman Jordan was Orwellian, she means that his actions effectively shut down the free speech of the Stanford Internet Observatory, and then he claimed it was a victory for free speech. That's Orwellian newspeak. That's Orwellian black-white. That's Orwellian doublethink. So Renee DiResta and her team at the SIO, some of the nation's leading
experts on how propaganda spreads on social media platforms and who had no power to restrict free speech whatsoever,
have been shut down by an angry mob of social media influencers. The mob broke out of its bespoke realities to infect at least a couple of congressmen. Because of that, Renee and her colleagues are now dealing with multiple lawsuits, and the business leaders at Stanford University, with an endowment of over $36 billion, shut down the research project because it wasn't worth the hassle.
That is the power that social media propaganda has today.
An angry digital mob has made it impossible
to even study the situation.
And this is not an isolated incident.
Nina Jankowicz experienced the same digital mob hatred when she was hired by the U.S. Department of Homeland Security, DHS, in 2022 to help with the government's internal policies on propaganda.
The short version is DHS offered me a political appointment to go work at an entity that they called the Disinformation Governance Board, which is admittedly not a good name.
One and all.
That is right out of Orwell.
Yeah.
I mean, I don't know why they didn't think that one through. I guess they were thinking this is for an internal audience and, you know, we're governing how we deal with disinformation, which is what we're doing.
We're setting the guardrails for how the agencies within DHS would approach disinformation within
their portfolios, making sure that privacy, civil rights, civil liberties were protected while we
do that. Again, things that conservatives and liberals can agree on, but I digress.
When this effort was announced about eight weeks into my tenure at DHS, it was done in April of 2022, and then they disbanded it a month later, right? So that is just ridiculous to me, but go ahead. Yeah, so the timeline's a little
off. I started in March. They took eight weeks to announce it. They did that at the end of April,
and then within three weeks, I made the decision to resign because what had happened was the way
they described it led to this huge vacuum of information
that they didn't respond to. And people claimed within hours of the announcement going up that
this was going to be a ministry of truth and I was going to be Joe Biden's chief censor,
which is hilarious to me. I've never met Joe Biden, just for the record. I don't think I've
said that out loud. I've never met him. He probably doesn't know who I am. If he has a vague idea, it's probably a negative association now. But I was a GS-15 federal employee, a political appointee, whose job was to herd
cats. And they were saying that I was going to have the power to decide what was true or false
on the internet. Tucker Carlson said that I was going to have the power to send men with guns to the homes of Americans with whom I
disagreed. Oh, my. And yeah, he also called me a highly self-confident young woman, which I think
he meant as an insult. And Biden's new thought cop in chief has been revealed. She's a 33-year-old,
highly self-confident young woman called Nina Jankowicz. Her job is to restrict any speech
that challenges Joe Biden or the Democratic Party.
Now, you'd think that would be illegal in this country as a federal employee because we do have
a First Amendment. But Nina Jankowicz doesn't believe in the First Amendment. If you read
Jankowicz's book, which we did, you will realize very quickly she was hired to police domestic
social media use, period. She's an illiterate fascist. Wow. But as you can imagine,
when you end up on Fox News every hour on the hour, you deal with a lot of nasty stuff. And
particularly the implication that I was committing treason against the United States
led people to threaten me and my family. We were doxxed. And I'm somebody who has pretty great
IT and operational security.
Like I've written a book on how to stay safe online.
So I was pretty well prepared for all of this,
but the administration did very little to support me.
And I was pregnant at the time,
which made everything much scarier
because I was about to bring my first child into the world.
And it was really, really rough.
And it still goes on to this day. I still receive horrible stuff fairly regularly. I had a
cyber stalker that I had to get a protective order against. I had a frivolous civil suit
filed against me that alleged I was committing a RICO violation, an attempt to censor someone,
a man I'd never met or heard of
until he inserted himself into my life. And that cost tens of thousands of dollars to get dismissed.
I was initially named in that Missouri v. Biden lawsuit, but the government got me removed.
And, you know, to this day, there's a large contingent of people in America who believe
that I did commit treason, that I, you know,
was trying to censor them. And I think that the Disinformation Governance Board issue,
which, as you said, they eventually decided to disband the entire thing long after I resigned.
They did a review and claimed they believed there was no need for a Disinformation Governance Board,
which is laughable, in my opinion. What they really said was there's no need to have an organization that's named disinformation governance board.
We need to name it something else.
Yeah, well, I don't know.
The sad thing is DHS really isn't engaging in that work in the same way anymore.
So the Cybersecurity and Infrastructure Security Agency had been doing a lot related to disinformation under Chris Krebs' leadership. Now they're doing
much, much less since the board fiasco. They also have rolled back a lot of the work that they were
doing in the science and technology directorate that funded research in this area. And the whole
field has become embattled. And the board fiasco was the first salvo in a long campaign against disinformation researchers and practitioners, counter practitioners, I should say.
We're not practicing disinformation.
But folks at Stanford, the University of Washington, think tanks around the country who are just attempting to get good information out there in order to help people navigate today's information environment.
Well, I want to be clear about what happened here too, right?
Because it's the same playbook.
They took something that was sort of true,
that you belong to an organization interested in disinformation,
and then broadcast propaganda to both sides, right?
Saying all kinds of horrible, not true things in order to pull
the country apart, to give them something to get mad at, to hit the rage button,
because they can't believe the government's doing all that kind of thing. So,
that organization got destroyed by the same tactics that we were talking about.
A hundred percent. Yeah. Well, and the sad thing is too, that there were a lot of
members of Congress who were deliberately engaging in that sort of thing because they fundraised off of it.
It allowed them to say to their constituents, look, we're doing something.
We got Nina out of government and the disinformation governance board is gone.
Jim Jordan, through his weaponization of the federal government subcommittee at the Judiciary Committee, I was the first person in the disinformation kind of sphere that he subpoenaed. And I got to sit with him for five hours. And the transcript of that
deposition is now public because I fought to get it released. He was going to keep it under lock
and key because he was embarrassed of the contents, which showed that under oath, I talked about how
the intention of the board was never to censor. I would never have censored anyone. I didn't have
the power, capacity, or intention to do that. And I wouldn't have taken
the job if that were the case. I'm the daughter, granddaughter of somebody who was in a gulag in
Russia. I feel pretty strongly about censorship.
Talk about Orwellian doublethink and black-white.
Wow.
In both Nina's case and Renee's case,
culture warrior influencers prevented even the study of propaganda, let alone what we as a nation should actually do about it.
And if you throw in nation-state propaganda activity,
information operations from the likes of Russia, China, Iran, and North Korea,
clearly the citizens of the U.S. face a clear and present danger to their chosen system of government. But like I said at the beginning of the series, for this
presidential election, even if the two sides of the political spectrum, the
Republicans and the Democrats, could put a miracle together and come up with
several courses of action that might mitigate the risk, those actions would never be in place in time for this election.
And citizens will get no help from the social media platforms either.
There is no incentive to do anything that reduces propaganda efforts.
As I demonstrated in the first part of the series, the platform owners bring in more
revenue when the culture warriors from both sides are yelling at each other.
So there will be no help from that quarter. In part three of the series that comes out next week, we'll talk about
what could be done later after the election, assuming that the country survives the aftermath.
But for this election, the only way to reduce the impact of election propaganda is for the users of
social media, the citizens of the U.S., to not contribute to the rage machine,
to not get so mad that they help to transform their crowd discussions into angry mob action.
The first part of this series provides an entire toolkit individual citizens might consider
to help mitigate the risk. Or, as Nina says,
An acronym that I like from some media literacy educators at the University of Washington is called SIFT. So the first letter is stop.
The second thing, if you want to do further investigation, is to investigate the source.
F is to find other coverage.
And then if you really want to go deep, you can trace the claims to see where they came from.
And all of that, I mean, unless it's a really complex story, you don't need open source investigative skills to do this.
You can trace it in five minutes if you have that five minutes and you're not on the metro or scrolling in bed.
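For readers who like their checklists executable, here is the SIFT flow rendered as a toy script. It is purely illustrative; the prompts paraphrase the steps Nina describes and are not an official rendering of the method.

```python
# SIFT as a four-step checklist: Stop, Investigate the source,
# Find other coverage, Trace the claims. A toy sketch for illustration.
SIFT_STEPS = [
    ("Stop", "Pause before sharing. Do you know the source? Are you angry?"),
    ("Investigate the source", "Who is behind this account or outlet?"),
    ("Find other coverage", "Do independent outlets report the same claim?"),
    ("Trace the claims", "Follow quotes, images, and stats to their origin."),
]

def sift(claim: str) -> None:
    """Walk through the SIFT questions for a claim before amplifying it."""
    print(f"Claim under review: {claim}")
    for step, question in SIFT_STEPS:
        answer = input(f"[{step}] {question} (notes): ")
        print(f"  noted: {answer or 'n/a'}")

# sift("Viral post claims ballots were destroyed in County X.")
```

The value is in the pause: most viral falsehoods do not survive even the first two steps.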
And that's a wrap for part two of this election propaganda series. This special bonus CSO Perspectives episode on election propaganda is brought to you by
N2K CyberWire, where you can find us at thecyberwire.com.
In the show notes page, I've added some reference links to help you do more of a deep dive if
that strikes your fancy.
And believe me, the well is deep here.
We've just scratched the surface with this show.
And don't forget to check out our book, Cybersecurity First Principles,
a reboot of Strategy and Tactics that we published in 2023.
By the way, we'd love to know what you think of our show.
Please share a rating and review on your podcast app.
But if that's too hard,
you can fill out the survey in the show notes
or send an email to csop at n2k.com.
We're privileged that N2K CyberWire is part of the daily routine
of the most influential leaders and operators in the public and private sector,
from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies.
N2K makes it easy for companies to optimize your biggest investment, your people.
We make you smarter about your teams while making your team smarter.
Learn how at N2K.com.
One last thing.
Here at N2K, we have a fantastic team of talented people doing insanely great things to make me and the show sound good.
I think it's only appropriate that you know who they are.
I'm Liz Stokes.
I'm N2K CyberWire's Associate Producer.
I'm Trey Hester, Audio Editor and Sound Engineer.
I'm Elliot Peltzman, Executive Director of Sound and Vision.
I'm Jennifer Iben, Executive Producer.
I'm Brandon Karf, Executive Editor.
I'm Simone Petrella, the President of N2K.
I'm Peter Kilpe, the CEO and publisher at N2K.
And I'm Rick Howard.
Thanks for your support, everybody.
And thanks for listening.

Thank you. ...insights, receive alerts, and act with ease through guided apps tailored to your role. Data is hard. Domo is easy. Learn more at ai.domo.com. That's ai.domo.com.