CyberWire Daily - Election Propaganda Part 1: How does election propaganda work?
Episode Date: October 2, 2024

Rick Howard, N2K CyberWire's Chief Analyst and Senior Fellow, discusses personal defensive measures that an average citizen, regardless of political philosophy, can take in order to not succumb to propaganda.

References:
- David Ehl, 2024. Why Meta is now banning Russian propaganda [News]. Deutsche Welle.
- Jeff Berman, Renée DiResta, 2023. Disinformation & How To Combat It [Interview]. YouTube.
- Niha Masih, 2024. Meta bans Russian state media outlet RT for acts of 'foreign interference' [News]. The Washington Post.
- Quentin Hardy, Renée DiResta, 2024. The Invisible Rulers Turning Lies Into Reality [Interview]. YouTube.
- Rob Tracinski, Renée DiResta, 2024. The Internet Rumor Mill [Interview]. YouTube.
- Robin Stern, Marc Brackett, 2024. 5 Ways to Recognize and Avoid Political Gaslighting [Explainer]. The Washington Post.
- Sarah Ellison, Amy Gardner, Clara Ence Morse, 2024. Elon Musk's misleading election claims reach millions and alarm election officials [News]. The Washington Post.
- Scott Small, 2024. Election Cyber Interference Threats & Defenses: A Data-Driven Study [White Paper]. Tidal Cyber.
- Staff, 2021. Foreign Threats to the 2020 US Federal Elections [Intelligence Community Assessment]. DNI.
- Staff, 2024. Election Cyber Interference Threats & Defenses: A Data-Driven Study [White Paper]. Tidal Cyber.
- Stuart A. Thompson, Tiffany Hsu, 2024. Left-Wing Misinformation Is Having a Moment [Analysis]. The New York Times.
- Stuart A. Thompson, 2024. Elon Musk's Week on X: Deepfakes, Falsehoods and Lots of Memes [News]. The New York Times.
- Will Oremus, 2024. Zuckerberg expresses regrets over covid misinformation crackdown [News]. The Washington Post.
- Yascha Mounk, Renée DiResta, 2022. How (Not) to Fix Social Media [Interview]. YouTube.
- Renée DiResta, 2024. Invisible Rulers: The People Who Turn Lies into Reality [Book]. Goodreads.
- Nina Jankowicz, 2020. How to Lose the Information War: Russia, Fake News and the Future of Conflict [Book]. Goodreads.
Transcript
You're listening to the Cyber Wire Network, powered by N2K.
Air Transat presents two friends traveling in Europe for the first time and feeling some pretty big emotions.
This coffee is so good. How do they make it so rich and tasty?
Those paintings we saw today weren't prints. They were the actual paintings.
I have never seen tomatoes like this.
How are they so red?
With flight deals starting at just $589,
it's time for you to see what Europe has to offer.
Don't worry.
You can handle it.
Visit airtransat.com for details.
Conditions apply.
AirTransat.
Travel moves us.
Hey, everybody.
Dave here.
Have you ever wondered where your personal information is lurking online?
Like many of you, I was concerned about my data being sold by data brokers.
So I decided to try Delete.me.
I have to say, Delete.me is a game changer.
Within days of signing up, they started removing my personal information from hundreds of data brokers.
I finally have peace of mind knowing my data privacy is protected.
DeleteMe's team does all the work for you with detailed reports so you know exactly what's been done.
Take control of your data and keep your private life private by signing up for DeleteMe.
Now at a special discount for our listeners,
today get 20% off your Delete Me plan when you go to joindeleteme.com slash N2K and use promo code N2K at checkout. The only way to get 20% off is to go to joindeleteme.com slash N2K and enter code
N2K at checkout. That's joindeleteme.com slash N2K, code N2K.
It's 2024, and in the United States,
we're rolling up on the next presidential election, November 5th,
between Vice President Kamala Harris and former President Donald Trump.
And although social media isn't new for this election, social media platforms have matured enough
to be the presumptive source of news and debate around the world. But they're also ripe with
misinformation, disinformation, rumors, opinions, fake news, conspiracy theories,
and advocation of absolutely abhorrent ideas and evil behavior.
Just ask your own side of the political spectrum what it thinks the other side's evil ideas are.
Now, none of this is new.
The Greeks and the Romans almost made a science out of it.
They didn't call it propaganda, but that's what it was.
And people from all sides of the political spectrum engage in it. But unless you are a self-proclaimed culture warrior looking for a fight, which
the average American citizen certainly isn't, I know I'm not, most of us are left bewildered
and uncertain as to what to believe and what to think whenever the current viral event of the day emerges. Since the 2016 election between the then-candidate
Trump and Secretary of State Hillary Clinton, pundits and scholars have suggested sweeping
fixes to combat propaganda efforts on social media platforms from both sides. They usually
come in the form of self-imposed technology improvements on the various platforms and or government regulations
to force change. But none are likely to be in place in time for this election. How then does
the average American citizen pick between the signal and the noise when both sides are launching
propaganda bombs into the ether? That's the question we're going to try to answer in this three-part miniseries on election propaganda:
how platform architecture and a system of paid influencers, unpaid wannabe influencers, culture war policy propagandists, and nation-state chaos instigators hype up the rage machine
on both sides of the political spectrum.
The goal of the series is to help the average citizen who isn't any of those things navigate
the 2024 presidential election information storm by providing a toolkit that helps distinguish between deceptive narratives
and legitimate content in the ever-evolving world of election security. According to Rene DiResta, author of Invisible Rulers,
The People Who Turn Lies Into Reality,
propaganda is a deliberate presentation of inflected information
that attempts to further the agenda of those who create it.
Assuming she's right, and I think she is, everything is propaganda. Wherever you get
your information, whether you fall to the left or the right on the political spectrum,
no matter how legitimate you think the information you consume is, rest assured that it's passing
through some kind of inflected prism that bends toward the purpose of the people
pushing the information. I read the New York Times and the Washington Post. Those papers bend to the
left. I also read the Wall Street Journal. That paper bends to the right. This isn't good or bad.
It just is. The trick is to know that going in and to evaluate the value of the information
with that in mind. This isn't a new
thing. According to Susan Wise Bauer, author of The History of the Ancient World: From the Earliest
Accounts to the Fall of Rome, the art of propaganda can be traced as far back as 515 BCE with Darius
the Great, one of the most important rulers of the ancient Persian Achaemenid Empire. He used it to
legitimize his rule. In modern times, though, before the
internet, but after Gutenberg invented the printing press in 1440, after the first modern newspaper in
1605, after Marconi invented the radio in the late 1890s, and after Philo Farnsworth invented the TV
in the late 1920s, successful propaganda efforts were in the hands of the few. Government leaders who could
step behind the bully pulpit, like U.S. President Theodore Roosevelt, government influence operators
like Hitler's chief propagandist Joseph Goebbels, print media organizations like
the Times of London, radio shows like CBS's See It Now with Edward R. Murrow, TV shows like the
CBS Evening News with Walter Cronkite,
and advertising organizations that produced ad content like the Marlboro Man selling cigarettes
in the 1950s. Before the internet, all of these propagandists broadcast their message using a
one-way, one-to-many model. People got their information about the world from a handful of
sources they trusted. And the sources they didn't trust, like that crazy dude wearing the chicken suit
spouting his manifesto in the public square, could easily be ignored.
His message didn't spread.
It lived and died there.
His reach was small.
And yes, chicken suit propagandists were mostly dudes.
We should look into that.
After the internet came online in 1969
and social media platforms started to emerge in the late 1990s, purveyors of propaganda got an
exponential lift in broadcast capacity. Propaganda efforts were no longer restricted to the one-way,
one-to-many model. Now, your crazy Uncle Joe, who always caused family drama at the annual Thanksgiving dinner
spouting conspiracy theories about UFOs, could easily find his people online.
They could exchange information with like-minded people,
and they could now broadcast their collective propaganda message with a new model,
two-way, many-to-many.
If they learned how to use it correctly,
they now had a megaphone broadcasting mechanism
at their fingertips that was free of charge. After the internet, the public square in the
form of social media platforms evolved as the source of news and debate in the United States
and around the world. And it has a massive reach. According to Simon Kemp's Digital 2024 report, there are approximately
331 million internet users in the United States, about 97% of the population. 72%
are active on social media. 72% use YouTube, 57% use Facebook, 51% use Instagram, 45% use TikTok, and about 31% use X.
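For a rough sense of scale, those percentages translate into large absolute audiences. A minimal back-of-the-envelope sketch, assuming each percentage applies to the roughly 331 million U.S. internet users cited above (the dictionary name and the assumption that every share is measured against the same base are mine, not Kemp's):

```python
# Back-of-the-envelope audience sizes, assuming each percentage
# applies to the ~331 million U.S. internet users cited above.
US_INTERNET_USERS = 331_000_000

platform_share = {
    "social media (any)": 0.72,
    "YouTube": 0.72,
    "Facebook": 0.57,
    "Instagram": 0.51,
    "TikTok": 0.45,
    "X": 0.31,
}

for platform, share in platform_share.items():
    # Convert a percentage share into an approximate headcount in millions.
    users_millions = share * US_INTERNET_USERS / 1_000_000
    print(f"{platform}: ~{users_millions:.0f} million")
```

Even the smallest platform on the list, X at 31%, works out to over 100 million people, which is why a "viral" event there still matters less than it sounds, as discussed next.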
But if you agree with me that a lot of that content spread on those platforms is mostly propaganda,
and you don't consider yourself a culture warrior looking for a fight,
how then do you, the average American citizen, pick between the signal and the noise when it seems there's no escape from it?
I'm glad that you asked.
We're going to show you how platform designers created these diabolical systems designed to
algorithmically amplify messages through a system of systems containing algorithms,
influencers, crowds, and the media, in order to hype up the
rage machine on both sides of the political spectrum. They are designed to convince you,
the average American voter, to hit the like button and broadcast your rage to your friends,
family, and colleagues, all in the service of making more money for the platform owners who
are already bringing in billions of dollars of revenue. Now, you may be asking yourself,
why is N2K doing a series on election propaganda?
This is a topic that is, strictly speaking,
not about cybersecurity.
That's true, but let's call it security adjacent.
It's an anti-propaganda toolkit for everybody,
not one side of the political spectrum or the other,
and it's an effort on our part
to separate the signal from the noise, which is an N2K motto. With that explanation out of the way, let me start with
examining how social media works. It's way more complicated than you think.
And we'll get to that after a quick break.
There are two recently published books that address this subject that I found.
One is from author Nina Jankowicz called How to Lose the Information War: Russia, Fake News and the Future of Conflict.
The other is from the aforementioned author Renée DiResta.
DiResta identifies five distinct propaganda agents that interact with each other on all social media
platforms that enable viral propaganda spread. I call them the pentad. The agents are the platform,
the algorithm, the influencers, the crowd, and the media. These pentad elements are distinct,
but their incentives are inseparable. Each agent thrives on the proliferation of attention-capturing
pseudo-events in an effort to go viral. All reward and reinforce sensationalism,
spectacle, and tribalism. DiResta credits Daniel Boorstin, the late, great American historian and,
by the way, the 12th Librarian of Congress, how great is that, with a definition of pseudo-events: synthetic media moments,
events that exist solely to be reported on. And let me set the boundary for what constitutes virality. Common indicators are the number of likes the original message received, this should
be over tens of thousands, the number of times members reshared the message, this should be over
a thousand, and the number of comments the message receives, this should be in the hundreds of thousands.
But to be truly viral, engagement metrics should significantly outperform the influencer's
follower count.
For instance, if an account with 100 followers receives thousands of likes and shares, that
message is likely viral.
But even when a message receives hundreds of thousands of comments, that represents a very small number
of actual American citizens engaging in the content.
Just do the math.
100,000 comments divided by 331 million internet users
is a really small number.
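Those rules of thumb are easy to sanity-check with a few lines of code. A minimal sketch, assuming Python: the like and reshare thresholds and the 331 million figure are the ones cited in this episode, while the function names and the 10x follower multiplier are my own hypothetical stand-ins for "significantly outperform the follower count."

```python
# A sketch of the virality rules of thumb described above.
# Thresholds come from the episode; the 10x multiplier is an
# illustrative guess at "significantly outperform".

US_INTERNET_USERS = 331_000_000  # cited from Simon Kemp's Digital 2024 report

def meets_engagement_thresholds(likes: int, reshares: int) -> bool:
    """Common indicators: tens of thousands of likes, over a thousand reshares."""
    return likes >= 10_000 and reshares >= 1_000

def outperforms_followers(likes: int, reshares: int, comments: int,
                          followers: int, multiplier: int = 10) -> bool:
    """Truly viral: total engagement far exceeds the account's follower count."""
    return likes + reshares + comments > multiplier * followers

def share_of_internet_users(comments: int) -> float:
    """Even a huge comment count is a tiny slice of the U.S. online population."""
    return comments / US_INTERNET_USERS

# The episode's example: 100,000 comments against 331 million
# U.S. internet users is about three hundredths of one percent.
print(f"{share_of_internet_users(100_000):.4%}")
```

The last line is the "just do the math" point: a viral event that dominates a news cycle can involve roughly 0.03% of the country's internet users.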
When the New York Times reports
that a social media viral event is important,
it's equivalent to them giving significance
to the
crowd cheering a touchdown at last night's Giants-Cowboys football game. Not to put too
fine a point on it, but it's also likely that those engaging with the content are culture
warriors from one side or the other of some issue anyway. It's not the average American.
So when the Washington Post publishes a story of how viral a TikTok story went,
that's a pseudo-event. A small number of people talking about some culture warrior issue is not an event. It's not valuable information. The viral event creates way more noise than it
does signal. It's meaningless unless you happen to be part of the tribe that cares about the issue.
The point is that each element in the pentad contributes to
this notion of virality. Let's examine each to see how they contribute individually.
First up, the platform. According to DiResta, social media platforms provide crowds, and
other pentad elements that we'll discuss in a bit, with tools that encourage
the formation of groups and influence how those communities behave. They provide opportunities for
spontaneous crowd formation, which carries with it the potential to suddenly devolve into a mob,
an unruly collection of angry people. Still, all platforms have some content moderation policy,
from X's bare minimum policies against violent content,
hateful conduct, abuse, harassment, and child exploitation,
to YouTube's more nuanced set of clearly defined policies
on hate speech, harassment, violent extremism,
and misinformation.
Now, I'm not making an argument here
that YouTube is better than X
in terms of content moderation.
My argument is that all platforms have a content moderation policy. You should pick the platform that
more closely matches your taste. If you like X's bare minimum approach, go with that platform.
If you like YouTube's more nuanced approach, go with that one. It makes sense, though,
that you at least know what the policies are so you know what you're getting yourself into.
But when you post something that flies in the face of the platform's content moderation policy
and the company deletes the message, let's say, that's not a violation of the U.S. First Amendment.
The platform companies aren't the U.S. government. They're not empowered to protect your civil
rights, nor are they obligated to. Unless there is some obvious law being broken by the company
or by the user of the platform, platform owners can do whatever they want with their content policy.
Users can like it or not, or they can change platforms if they wish. But American citizens
have no inherent right to publish whatever nonsense they want on a public company's platform,
and the platform leadership has no obligation whatsoever to amplify that message. If that were true,
Uncle Joe could just walk into the lobby of any corporation and start yelling about his UFO
conspiracy theories. When security comes to escort him out of the building, he could just point out
that he was executing his First Amendment rights. I don't see that happening anytime soon. And keep
in mind, the platform leadership's sole purpose is to bring in revenue. All platforms
craft their own content moderation policies designed to appease and cultivate ad buyers,
to cater to certain audience demographics, and to comply with international laws. All of that
with the capitalist idea of generating revenue in mind. Again, this isn't bad or good. It just is. And I
love capitalism. It drives the country.
But it's good to keep that in mind when we're yelling at the other side while defending our culture warrior issue.
Let me say that again.
The platform owners don't care about your issue.
What they care about is making money.
So under the covers, what makes all of that work is the algorithm.
Again, according to DiResta, algorithms exploit the human brain's attraction to divisiveness.
Platform owners believe that encouraging the rage machine is much more profitable than
encouraging holding hands and singing Kumbaya.
Think of algorithms as the wedge splitter, the tool that partitions and pushes American
citizens into opposing sides
and compels them to respond. Algorithms determine what content to show to the crowd,
and they do it by shaping information flow and by influencing the narratives that crowd encounters
by their individual preferences and engagement patterns. Algorithms reinforce existing beliefs
and sometimes nudge the crowd down some horrifying rabbit holes by catering to their biases and boosting relevant influencers.
Another pentad element that I will get to in a bit.
In order to keep the crowd on the site, they steer the influencers' attention and encourage the crowd's output.
If the algorithm creates a viral event that the traditional media, cable TV, and newspapers
report on, that's an indicator of success.
If the viral event spreads to other social media platforms, that's also an indicator.
It means that the algorithm has elevated the viral event to such an extent that it has
broken out of the platform like a contagion to spread to other mediums, like the bird
flu infecting humans.
The algorithm targets several pentad members to achieve a viral event, like influencers and the media, but liftoff is attained by targeting the crowd.
DiResta says that virality is a collective crowd behavior.
Each user makes a deliberate choice to post or retweet content because they find the post appealing,
they believe in it, or they're outraged by it. In a kind of enclosed system, algorithms feed the
crowd with content generated by inside-the-bubble trusted spokespeople, influencers, and then the
crowd amplifies the rage machine with their response, which in turn gives the algorithms
and influencers indicators about which ideas
are working and which ones aren't to help them generate and highlight more
content in the same vein, which causes the crowd to engage with the rage
machine again. This is a massive self-sustaining do loop designed to keep
the machine humming. The most diabolical feature of the system is that new ideas
that exist outside the bubble,
even if they somehow miraculously make it inside for a bit, do not live long within the bubble.
DiResta says that consensus reality, our broad, shared understanding of what is real and true,
has shattered, and we're experiencing a Cambrian explosion of subjective, bespoke realities.
A deluge of content sorted by incentivized algorithms and shared instantaneously between aligned believers has enabled us to immerse ourselves in environments tailored to our own beliefs and populated with our own preferred facts.
Once you're inside the bespoke reality, it's hard to disengage, too.
We've all talked to friends, relatives, and colleagues who are deep within their own information bubble
that no matter what counterfactual you present, they see it as fake news or some organized conspiracy to stifle the truth.
That is the very definition of bespoke reality.
Before the Internet, crowds in the real world were local and fleeting. They'd show up,
protest, maybe engage in various degrees of violence, and then disappear. The same members
of that crowd would likely never see each other again. After the internet, crowds became persistent
and global. DiResta says they engage symbiotically with influencers, but don't require a leader or physical space to assemble.
Crowds decide independently what to amplify.
When they do this, they help influencers rapidly identify the memes and messages
that truly resonate and capture public attention.
So yes, algorithms manipulate members of the crowd,
and influencers monetize their rage.
But in exchange, the crowd finds community, entertainment, and camaraderie.
A sense of shared identity, which leads to collective behavior and the formation of online factions in which they actively participate.
And sometimes, crowds turn ugly.
They transform into mobs.
Think angry people wielding pitchforks and torches and angry hunting dogs like in the old Frankenstein movies, but digital.
The 2014 Gamergate controversy is a good example.
A group of mostly Twitter and Reddit members organized to harass female game developers and critics.
The mob blended coordinated attacks, harassment, and the spread of false narratives.
The digital equivalent of dogs, torches, and pitchforks. DiResta says that this mob behavior
requires fuel to sustain its outrage, to sustain its bespoke reality. And each pentad element
provides the fuel in their own way. But the element that leads the charge is the influencer.
A social media influencer is an individual who has built a significant following on one or more social media platforms
and leverages this audience to influence their opinions, behaviors, or purchasing decisions.
By a significant following, I mean they have at least 10,000 followers,
but the most successful have millions. For example, Charlie D'Amelio has approximately 156 million followers on TikTok. Influencers often have a niche or area of expertise too,
such as fashion, fitness, travel, technology, or beauty, and they create content that resonates
with their followers.
DiResta says they seek virality, a word lifted from epidemiology, the study of how diseases spread.
They use targeted ads and paid boosts that appeal to either the algorithm, the crowd,
or both.
They dynamically test messaging strategies designed to grow followers or uplift engagement,
and they're always aware of how the algorithm and the crowd respond. World-class influencers innately understand how to connect with their base.
At the same time, they seem to develop intimate, trusted relationships with
their followers while achieving a massive reach that rivals what TV networks used to achieve before
the internet. And let's be clear, they cater to the algorithm and the crowd because the success of that effort determines how much revenue they bring in. This is not bad or good
either. It just is. Some influencers are just in it to make a buck by selling the brand and their
products. Taylor Swift is a good example of this and good on her. She has found a way to connect
to her audience to sustain her career. If only all of us could be that successful.
Other influencers, though, the culture warrior influencers, bring in revenue by fanning the flames in their bespoke realities, pointing the finger at those people over there as the cause
of all the problems for the true believers, the members of their bespoke community. Their
methodology is to look for the weak spot in the national culture and drive a
wedge through it. They broadcast rumors as if they are true and never retract them if they turn out not
to be true later. They just move on to the next rumor in order to keep fanning the flame, to keep
driving the wedge. And with all of that effort, sometimes the message breaks out of the bespoke
reality and into the mainstream media. What I mean by the media is
the collection of old guard traditional TV and newspaper companies like Fox, CNN, The New York
Times, The Washington Post, and The Wall Street Journal. If Daniel Boorstin were alive today, he
might say that the media sometimes reports on a viral event as if it were important, but in truth,
it is a mirage that looks real but isn't.
To be fair, though, mainstream media companies are incentivized to cover sensational content
in the same way that social media influencers are incentivized on the platforms to generate
revenue. DiResta says that the media practice of 24x7 news causes it to create content for
content's sake and the elevation of people who are famous for being famous.
Media generates manufactured important moments that capture public attention but are in fact meaningless.
Now that we understand how propaganda works in the modern digital age,
thanks to the books from Renée DiResta and Nina Jankowicz, and we understand how each element of the propaganda pentad functions, the platform, the algorithm, the influencers, the crowd, and the media, each working in a mostly closed system of systems designed to efficiently broadcast propaganda messages,
and assuming that you're just an average citizen who has no desire to practice
culture warrior stuff yourself,
what then can you do to inoculate yourself
from the spread and influence of propaganda
and to not get sucked into the rage machine?
It turns out that there are several things
you can do as an average American
for each element of the pentad.
For the platform, know the content moderation policies
of your social media platforms of choice.
Make a conscious decision as to whether you agree
or don't agree with what they're trying to do.
For bonus points, when you notice behavior
that violates the policy,
report it to the platform administrators.
Understand that the execution
of content moderation policies is not a violation
of the U.S. First Amendment. Don't believe it when culture warrior influencers try to tell you that
it is. It's just another way to hype the rage machine. Remember that platform owners don't care
about your culture warrior issue, except that they make money by hyping your rage about it.
For the algorithm, know that algorithm designers leverage
the brain's attraction to divisiveness. If you reach for the rage button to like, comment, and
rebroadcast a message that makes you steaming mad, that's the algorithm pulling your strings.
The algorithm wants to manipulate you like that. When that happens, the algorithm wins. If you're
okay with that, fine, hit the rage button. But maybe a better solution
is to sit back, cool off for a day, and re-evaluate. If 24 hours go by and you're still stomping mad,
you might have to come to terms with the idea that you have slipped into a culture warrior role,
living in your own bespoke reality. This toolkit isn't for you. Unless you're a culture warrior seeking to gain followers,
your engagement with the rage machine isn't essential.
Nobody cares what you have to say about [pick your favorite culture warrior issue].
Your like, reshare, or comment is not fundamental to the national debate.
Consider stepping back from the rage machine.
The algorithm can't create a viral event if the crowd doesn't respond.
For the crowd, ask yourself if you have slipped
into a crowd's hermetically sealed information bubble,
your own bespoke reality.
Do you reject authority figures in science,
government, and academia out of hand
in favor of what Kevin from the Bronx
has to say on the subject?
I'm not saying that authority figures
are always better informed and never make mistakes.
Sometimes they aren't, and sometimes they do.
But if your go-to move is to reject them out of hand
without any thought,
you might be in your own bespoke reality.
Do you favor scoring points against the other side
instead of actually considering the issues?
Have your friends, colleagues, and family members
stop talking to you because of
your ideas? Do you only consume content from your side of the culture war? Do you immediately assume
that Kevin's rumors about [pick your favorite culture warrior issue] are true without any
source material? Do you immediately assume that Kevin's rumors about [pick your favorite culture
warrior issue] are true because you want them to be true?
Do you often think that you and your side are the only ones who know the truth?
And do you engage in digital mob behavior?
One or more of these traits might indicate that you're in your own bespoke reality.
And finally, for the influencer, remember that some influencers' main motive is to drive a wedge through the culture, to create sides.
Think about the source of the information and why you trust what Kevin from the Bronx has to say about [pick your favorite culture warrior issue].
You might say that Kevin makes good points. Fine.
But maybe also consider the motive behind his message.
Consider how Kevin makes money and other ways he might benefit in his effort to fan the flames of the culture wars.
Does Kevin regularly broadcast rumors saying things like,
"if this turns out to be true, it could be bad," or "I'm just asking questions"?
If you hear those kinds of caveats from Kevin, consider not mashing the rage button until you know for sure.
If that's Kevin's go-to get-out-of-jail-free card when he is accused of
broadcasting misinformation or disinformation, consider that Kevin's motives might not be pure.
Does Kevin routinely point to some other individual or group as the source of all the problems in his
bespoke reality? Is the other the focus of his rage? Does it make you feel good to focus your
rage on this other? If that's true,
it might be time to step back, take a breath, and consider why you feel this way.
For the mainstream media, remember, just like social media influencers, they need to generate
revenue too. They are looking for stories to get your attention. Remember that a viral event,
although noisy, on a social media platform is just a relatively small crowd compared to the U.S. population.
This is a group of like-minded people yelling into the ether.
Think New York Giants fans cheering a touchdown.
You probably don't have to pay attention to it.
When you hear the mainstream media reporting on viral events, be suspicious.
If they do it regularly, consider changing your source of mainstream news.
At the beginning of this episode, I promised that I would provide you with at least the beginnings
of a toolkit that will enable you to identify the propagandist tropes that have been around
for centuries, but have been algorithmically amplified in the modern age on social media
platforms. My purpose
was to help the average citizen, not
the culture warrior thriving in their own
bespoke realities, navigate
the 2024 presidential election
information storm by enabling
them to distinguish between deceptive
narratives and legitimate content.
My theory is that if we all understood
how the system works, we could become
self-aware and at least recognize the mechanism.
We still might decide to engage,
but at least we are aware of who is profiting from the endeavor
and how we are being used by the pentad.
I think I've done that.
Next week in part two of the series,
we will examine how propaganda efforts
have influenced countries in the past.
So stay tuned.
This special CSO Perspectives episode
on election propaganda
is brought to you by N2K CyberWire,
where you can find us at thecyberwire.com.
On the show notes pages,
I've added some reference links
to help you do more
of a deep dive if that strikes your fancy. And believe me, the well is deep here. We've only
just scratched the surface in this episode. And don't forget to check out our book,
Cybersecurity First Principles, a reboot of Strategy and Tactics that we published in 2023.
And by the way, we'd love to know what you think of our show. Please share a rating and review in your podcast app.
But if that's too hard, you can fill out the survey in the show notes or send an email to CSOP at N2K.com.
We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector,
from the Fortune 500 to many of the
world's preeminent intelligence and law enforcement agencies.
N2K makes it easy for companies to optimize your biggest investment, your people.
We make you smarter about your teams while making your team smarter.
Learn how at N2K.com.
One last thing.
Here at N2K, we have a wonderful team of talented people doing insanely great things to make me sound good.
I think it's only appropriate that you know who they are.
I'm Liz Stokes. I'm N2K CyberWire's Associate Producer.
I'm Trey Hester, Audio Editor and Sound Engineer.
I'm Elliot Peltzman, Executive Director of Sound and Vision.
I'm Jennifer Eiben, Executive Producer.
I'm Brandon Karf, Executive Editor.
I'm Simone Petrella, the President of N2K.
I'm Peter Kilpie, the CEO and Publisher at N2K.
And I'm Rick Howard. Thanks for your support, everybody.
And thanks for listening.
Your business needs AI solutions that are not only ambitious, but also practical and adaptable.
That's where Domo's AI and data products platform comes in.
With Domo, you can channel AI and data into innovative uses that deliver measurable impact.
Secure AI agents connect, prepare, and automate your data workflows,
helping you gain insights, receive alerts, and act with ease through guided apps tailored to your role.
Data is hard. Domo is easy.
Learn more at ai.domo.com.
That's ai.domo.com.