The Decibel - The changing face of election interference
Episode Date: April 20, 2025. Presenting Machines Like Us, a Globe and Mail podcast on technology and people. We're a few weeks into a federal election that is currently too close to call. And while most Canadians are wondering who our next Prime Minister will be, host Taylor Owen and his guests today are preoccupied with a different question: will this election be free and fair? In her recent report on foreign interference, Justice Marie-Josée Hogue wrote that "information manipulation poses the single biggest risk to our democracy". Meanwhile, senior Canadian intelligence officials are predicting that India, China, Pakistan and Russia will all attempt to influence the outcome of this election. To try and get a sense of what we're up against, Owen wanted to get two different perspectives on this. Aengus Bridgman is the Director of the Media Ecosystem Observatory, a project the two run together at McGill University, and Nina Jankowicz is the co-founder and CEO of the American Sunlight Project. Bridgman and Jankowicz are two of the leading authorities on the problem of information manipulation. This episode was originally published on April 8, 2025.
Transcript
Hi, it's Menaka.
This weekend, we're bringing you an episode from another Globe and Mail podcast, Machines
Like Us.
It's a show about technology and artificial intelligence, and this episode is about election
interference.
In the middle of our federal election campaign, there are a lot of questions about how to
keep our democratic
process free and fair.
This episode explores how information can be manipulated.
You can subscribe to Machines Like Us wherever you listen to podcasts.
Hi, I'm Taylor Owen.
From the Globe and Mail, this is Machines Like Us.
I've just requested that the Governor General dissolve parliament and call an election for April 28th.
She has agreed.
So we're a few weeks into a federal election that is currently too close to call.
And while the question on the minds of most Canadians is who will be our next prime minister,
some of us are preoccupied with a different question.
Will this election be free and fair? In a recent report on foreign interference, Justice Marie-Josée Hogue wrote that information
manipulation poses the single biggest risk to our democracy.
And senior Canadian intelligence officials have predicted that India, China, Pakistan,
and Russia will all attempt to influence the outcome of this election.
To try and get a better sense of what we're up against, I wanted to get two different
perspectives on this.
My colleague Aengus Bridgman is the Director of the Media Ecosystem Observatory at McGill
University, a project that we actually run together, which gives us this really incredible
window into the way information travels across Canada's digital
ecosystem.
And Nina Jankowicz is the co-founder and CEO of the American Sunlight Project.
She's a leading international expert on disinformation and has spent time in the Biden administration
working on the problem in the US.
I sat down with both of them in the first week of April. And Aengus, I want to start with you.
At the time we're recording this, we're a little over a week into this federal election.
Is the observatory seeing any signs of interference?
We're seeing some forms of interference and it's kind of a really interesting question
what exactly interference means.
So I think it's worth just sort of like taking a minute on that because interference typically
like if you ask security folks, they say interference is covert.
Interference is when it's hidden.
And that's kind of a useful heuristic, but it's not quite so clean cut all the time.
So when you have somebody like Trump going in and endorsing one candidate, or when you have Musk, a few months before the election, going in and endorsing one of the candidates, you know, that's influence. Politics is a rough and tumble game. It's part of the way it is. But it can get pretty uncomfortably close to interference.
And Canadians with kind of a longer memory might remember that when Obama endorsed Trudeau, that was seen as a major case of interference and a violation of norms, and was really kind of a big issue.
So sort of getting back to what we've seen so far, there's been a lot of the stuff that
you see in online spaces, a lot of the sort of forms of manipulation that we're used to
seeing.
So there's been a lot of bot activity.
There's been some interesting things about Facebook groups that have been purchased and repurposed.
Can you say a bit more about that? I mean, I think that's an interesting case.
Yeah, yeah, for sure. So we're actually running a
tip line during the election and we got a tip about this and we looked into it. And what we saw was
there was a Facebook group with about 18,000 members that used to be a buy and sell group
based in Hamilton, Ontario. And if you scroll back to kind of like 2017, 2018, there's like pictures of kids' jackets and apartment buildings.
And then January 11th, 2024, there's like this switch that was flipped and the
name of the group changes and it becomes this horrible kind of racist place.
And then it gets transformed again, just before the election: Canadians for Trump and Elon Musk for Governor of Canada. And so it's still got 18,000 members, and you scroll through the posts and there's a bunch of Canadians going, wait, what? Why am I in this group? I didn't agree to this.
I want to come back to that case because I want to get your thoughts on it, you know,
because it parallels some things that happened in the US in previous elections frankly. But
just quickly, how are you looking at the ecosystem?
How do you understand this in Canada?
When you say you're looking at the ecosystem, what do you mean by that?
So the way we do that is that we have this idea that something doesn't really matter
in online spaces until somebody with enough clout and visibility and followers says it.
So what we've done is we've kind of identified about 5,000 of the most influential Canadian voices.
That's all Canadian politicians, influencers, journalists,
media outlets, anyone with a decent following.
And we follow them across their whole social media footprint.
And then we collect everything that they say,
and we analyze kind of how information moves
throughout there.
Now, often these 5,000 aren't the ones coming up with an idea. Sometimes they are.
But once it hits them, once it hits their radar,
once they start to share it, then it takes life.
Then it has power and then it sort of has meaning
in the information ecosystem.
So we do that and then we've got just sort of a great team
of qualitative researchers scouring the internet
for manipulation and then we do some survey work.
So we kind of get this really rich, full picture
of the ecosystem during an election.
And we're using that to really try to understand
when something is a little bit suspicious or manipulated.
And what we're trying to do during the election
is let Canadians know when we see that
and to help them sort of navigate it a little bit better.
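To make that concrete, here is a minimal sketch of the kind of pipeline described here: a watchlist of influential accounts, collection of their posts, and a flag when enough of them amplify the same item. Everything in it is hypothetical; fetch_recent_posts stands in for whatever platform data source a researcher actually has, and the handles and threshold are invented.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str      # handle of one of the ~5,000 tracked accounts
    url_shared: str  # the link or claim being amplified

def fetch_recent_posts(watchlist: set[str]) -> list[Post]:
    """Placeholder for per-platform data collection across the watchlist."""
    return []  # in practice: platform APIs or archived data, per platform

def flag_amplified_items(posts: list[Post], min_amplifiers: int = 25) -> dict[str, int]:
    """Count distinct tracked accounts sharing each item.

    The idea from the conversation: a story only "takes life" once enough
    influential accounts pick it up, so flag items crossing a threshold.
    """
    amplifiers: dict[str, set[str]] = {}
    for post in posts:
        amplifiers.setdefault(post.url_shared, set()).add(post.author)
    return {url: len(accounts) for url, accounts in amplifiers.items()
            if len(accounts) >= min_amplifiers}

watchlist = {"@example_politician", "@example_journalist"}  # invented handles
print(flag_amplified_items(fetch_recent_posts(watchlist)))  # {} without real data
```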
And it's early so far, obviously, we're a week in
and you say there's some altered fake Facebook groups,
there's some bot activity.
Are you seeing anything that you think really raises
concerns around an explicit foreign interference
by a foreign actor in the Canadian election?
So we haven't seen anything yet that we can kind of definitively tie to a foreign actor.
There's some stuff around AI-generated content that has been amplified very heavily, particularly on X, by a lot of out-of-country folks.
And so this is sort of an interesting case where there's some amplification from outside
the country.
It's a little bit concerning.
There's some sort of out-of-country ties to where this information came from.
But we haven't seen anything.
We have this whole sort of incident response protocol.
And something has to be fairly major for us to flag
and say, hey, look, this is a pretty serious incident
that rises to the level of potentially changing
enough Canadians' minds that it matters for the election.
And you haven't seen that yet.
And we haven't seen that yet.
So Nina, you just went through an election in the US.
What happened in the last American election?
Are there things we should be paying attention to
in Canada based on what happened or didn't happen
in the last US presidential election?
Yeah, I mean, it's a huge question, right?
And I think one of the main trends
that we saw in the United States
that I don't think you will see in Canada
is kind of the like explicit foreign interference
because of the pullback on the US government's
kind of oversight of and communication
about disinformation in our country.
Obviously it was always a polarizing topic, but it's become kind of one of the key things that Republicans harp about all the time. And so I think our adversaries saw it as open season on the US, and we saw a number
of different attempts at both kind of information operations, but cyber exploits as well. You
know, famously, the Iranians hacked President Trump's campaign.
We also saw the expansion of the Spamouflage campaign, which of course is the Chinese campaign where fake profiles are used to inflame tensions on either side of the political spectrum.
And then the Russians made, interestingly,
fairly kind of, you know,
rote videos.
Like they didn't use AI.
Everyone was talking about
this is going to be the deep fake election, blah, blah, blah. They just hired actors to pretend that
they were committing voter fraud and some people bought it, right? So that's the sort of thing that
we saw. And right afterward, you know, we put out a statement and said, you know, disinformation
didn't win the election, but the normalization of lies did. Like, I think people just don't care if their politicians are telling the truth or not,
or if they're reading truthful information. They want information that makes them feel good.
And that's really what we've seen in the first couple of months of the Trump administration as
well.
I mean, it's such a gray area to pin down: when does the spread of false information or the amplification of divisive content lead a society to not trust anything it hears, or to not care anymore?
And I mean, I think that's a very worrying tipping point,
but it's really hard to know
when that's happened, isn't it?
Yeah, and I wouldn't say that people don't trust anything.
They just don't trust the other,
which actually is potentially worse.
It might be worse, yeah.
Because of the inflaming of the tensions, right?
You mentioned Russian activity in the last US election, and obviously it's been pretty present in the American media ecosystem for a long time now. How do you know, and how do you decide, whether what they're doing works or not?
Yeah, this is the key question of all of this research. And I guess to me, it is not important
whether Russia successfully influences somebody or not. I would prefer them to stop masquerading
as Americans and attempting to influence our elections.
Principally, that is really important to me.
You can make an argument that for every Russian bot network, we don't need to do a crazy press
release or have a feature news story, and I would agree with that.
That's giving them more credit than they're due.
But I still do think we need a baseline level of understanding among the population that
there are foreign sources, adversarial sources, who are attempting to influence our domestic and foreign policy,
and we need to be wise to that. Back in 2018, as we were heading toward that midterm election, I identified a group that had been targeted by a Russian Internet Research Agency troll
who posed as an American and gave them a bunch of money
to advertise a protest that they were having.
So in this case, and this is not the only one,
the Russian interference worked, right?
It got people out on the street.
Interestingly, this was a pro-democratic, anti-Trump protest, so it just goes to show it's on both sides.
But it actually escalated into action, right?
It wasn't just about sharing or liking a post.
People say, oh, they might engage
with Russian propaganda online,
but is that gonna change how they vote?
Well, it got them to go out to a protest, right?
Can I just say here, this is a continuous issue.
It's like, how do we know that this is mattering?
How do we know that this is consequential?
And there's like at least two ways that we know.
One is that there are thousands and thousands
of testimonials.
You can go on the internet,
you can go to certain online communities
where people share stories of themselves or their families going down rabbit holes in online spaces.
And there are just so many, there's a torrent of, yes, testimonial evidence.
Like this, this is real evidence.
And I'm a political behaviorist, and political behaviorists love to say, yes, but what is the statistical significance of the mean effect on this population of interest? Like, okay, buddy. But at the end of the day, if you go to a Reddit community and you see
thousands of testimonials, at what point does that collective weight start to inform the
way, you know, we start to think about the world?
So that's kind of like one major piece.
And then the second is, look, not to be all political here, but look at what's gone down in the States.
Like, look at the continuous rejection of interest in scientific evidence or the truth.
There's been a systematic undermining of that in that country, and everyone else in the world can see it. As a political community, they have just completely rejected truth. There's a systematic rejection of truth.
And, you know, was Russian disinformation a drop or a torrent in that?
I'm not sure.
Yes, I'd like to know if it was a drop or a torrent.
Okay, but the giant lake is there, right?
Like it's happened.
And so yes, it'd be great to pin down a precise effect size for this type of thing, but we can't.
And there's enough evidence out there now
that we know that online communities shape
the way we see the world.
And one of the things that can happen
is the steady erosion of trust.
And we're seeing it in Canada to a lesser extent as well.
And like, we can read the tea leaves, like trust in old media, traditional media is down significantly, about 10 points in five years.
That is a massive shift.
Social media use is enormously up, particularly amongst youth in terms of their primary source of news, their primary source of political information.
And you can't look at these and go,
yeah, but is it really this or that?
No, there's this pattern happening.
And so that demands a response.
That demands a soul searching.
Nina, did we build an ecosystem that's vulnerable to this kind of manipulation, or is the ecosystem itself the problem?
I think actors are taking advantage of the vulnerability.
So we can look at historical disinformation campaigns.
A lot of the tools, tactics, and procedures are the same.
The thing that has changed is the internet
and the ability to target your messages
to exactly the people who are gonna be most vulnerable
to them, number one, but also to test those messages over time. One of the really amazing resources that we have, which I actually teach to all my graduate students: back in, I think, 2018 or 2019, the House Intel Committee in the US, I believe it was the House Intel Dems, released all of the Russian ads, because Facebook wasn't going to do it.
And a lot of people looked at these ads and they were like, oh, these are terrible.
They're dumb.
Look, this performed so poorly.
The point wasn't that they were going to perform well, necessarily.
It was testing.
It was A/B testing.
Anybody who runs a newsletter knows that you test different subject lines to see which
one gets opened more, right?
And the Russians were doing that at such a mass scale, right? They were doing that with Facebook ads,
and when they found a message that resonated,
it's kind of like throwing spaghetti at the wall
and seeing what sticks.
They found what stuck and they rammed that message
over and over and over to that audience.
And the technology that we have enables anybody to do that.
They don't have to be sitting
in St. Petersburg or Beijing, right?
They can be sitting anywhere in either of our countries or around the world
and target those messages the same way.
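The testing logic Jankowicz describes can be illustrated with a small sketch: compare the engagement rates of two message variants and keep the one that performs better. This is a generic two-proportion test, not anything from the interview itself, and the click counts are invented.

```python
from math import erf, sqrt

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Return a two-sided p-value for the difference in click rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # convert the z statistic to a p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical variants: B resonates more, so B is the message you "ram home".
p = two_proportion_z(clicks_a=120, n_a=10_000, clicks_b=180, n_b=10_000)
print(f"p-value: {p:.4f}")  # a small p-value means the gap is unlikely to be chance
```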
The predictability of some of this is frustrating sometimes.
And I mean, that release of those ads in the US led Canada to mandate ad archives during elections.
Right. And so, like, sometimes we can learn some lessons
from this and governments can patch these holes
that we're seeing.
But now it feels like you mentioned we're right
back where platforms are taking down safeguards
again, and governments are throwing up their hands
that they can't possibly do anything in this space
because it would be an infringement on speech or
whatever it might be.
And so can you paint a bit of a picture of how you see the state of the safeguards on the platforms? You mentioned at the beginning that they have a new posture, right?
You know, it's really interesting, because in 2020 we were in the midst of a
pandemic. It was the first presidential election since 2016, when we had this massive foreign interference story that kind of persisted from 2016 all the way to 2020. And the platforms, in particular Facebook, were trying to make amends for that, trying to atone for their sins.
This is an extraordinary election, and we've updated our policies to reflect that.
We're showing people reliable information about voting and results, and we've strengthened our ads and misinformation policies.
Four years later, with the rise of Trump 2.0 and the bullying of the social media platforms, these multi-billionaires have chosen to acquiesce and they have rolled back their content moderation.
After Trump first got elected in 2016, the legacy media wrote non-stop about how misinformation was a threat to democracy.
We tried in good faith to address those concerns without becoming the arbiters of truth.
But the fact-checkers have just been too politically biased and have destroyed more trust than they've created, especially in the US.
All the while, when they go to other democracies like, for instance, Australia, which has quite
robust internet regulations, they're saying, oh, actually, we're still doing fact checking,
we're still doing content moderation, and all those things are good, right? So it's just kind
of typical lobbying. I think there's another play in all of this as well, though, which is that the
social media companies for years have been fighting against regulation, not only here in the United
States, but everywhere else: Canada, and the EU in particular, where the Digital Services Act is in force now.
And so what I think Zuckerberg and Musk and others see in the Trump administration is
not just something to fear, but they see an asset.
Now they have potentially the most powerful lobbyist they could possibly hope for in the US government, because they have acquiesced to the Trump administration's demands, and Trump and Vance are now bullying Europe and other regulators to not regulate our American companies.
So that's the landscape.
It's really scary.
It's interesting because, you know, in this election right now, we're living through this, where the decisions of these platforms are enormously influential on our politics and the way we communicate in an election. Think back to 2021 and how worried we were about Facebook in that election. And we look back, and none of those safeguards exist today. There is no election integrity initiative whatsoever from Facebook. There is no CrowdTangle. There's no data access for researchers
during the election.
So good luck. Good luck getting insight into what's going on. And there's no news on the
platform either. So Canadian journalism and news content is blocked on the platform. You cannot
post it. We can talk about the why of all of these things. But the reality is, you now have a Facebook that is still the most used platform for Canadian politics and news and information among social media platforms. And it doesn't have the news. It doesn't have election integrity or adequate
moderation or fact checking. And it doesn't have any data access. So we're going to an
election in that state.
Aengus, is there any risk in relying on private actors to fulfill core functions of our democratic
society, particularly in moments like the election we're in now?
Yes, absolutely. There's a few different layers to this answer, though. So I think X is the best example of it.
And it's the most like flagrant one right now,
given kind of the dynamics of that platform
and the dramatic shift to the right in the United States
and by consequence in Canada as well.
It is still the platform most used by Canadian politicians to communicate out.
For example, when Chrystia Freeland resigned, which led to the fall of this government, which led to our current election, she posted that on X.
So that's maybe the system working as intended, right?
You know, there's a public square, you can say something there.
But what happens when Elon Musk, on that platform, has such a large voice that when he chooses to amplify a particular message, it distorts the entire platform and everything else that gets amplified out over it?
Now, is that interference?
Is that manipulation?
I don't know. But if Musk is able to unilaterally ensure that millions of Canadians are going to see one piece of content over another, and he's able to do that not by tweaking the algorithm, but just by the way he acts on that platform, by his choices of who to platform and amplify?
Of course this is an enormous risk.
He's part of a US administration joking, not joking about annexing our country.
I mean, it's just like, of course, what is this question?
Of course this is a huge, a huge security risk.
And yet we find ourselves in an election with absolutely no choice.
Politicians are all still on X.
That is their primary way that they communicate.
All the political parties are still spending a huge amount of money on Facebook.
Like Bluesky, Mastodon, and friends.
There are just not enough Canadians using those platforms.
And so what you have is, people don't trust Elon Musk.
We've done some surveying about this.
Like 90% of Canadians don't trust Elon Musk.
Fewer Canadians are using X. The number has been steadily going down, and we're down about four or five points from a year ago.
And yet that's still where our politicians are spending their time.
That's still where they're doing their primary announcements.
And so it's an enormous challenge in the context of an election, and the lack of transparency in particular. You know, the information space is a tricky one. The best way to fight misinformation, the best way to fight information manipulation, is transparency, is sunlight, is people understanding what's going on. And once people understand what's going on, influencers, content creators, and the news can comment on it, politicians can have a conversation about it, and we can kind of move forward all in the know. In a world where there is so little transparency, we cannot know.
And so, look, we're gonna go through this election
and we're gonna do our best.
Like the observatory is gonna do its best
to communicate and to understand.
But our visibility is very imperfect.
The only people who know how many bots are active on X
are internal to X and they are not forthcoming
with that information at all.
Wouldn't you like to know how many accounts that you follow are bots?
Wouldn't you like to know that those posts that you shared
were actually from an out-of-country account
that was trying to amplify and push a political agenda?
I want to know that.
I'm sure Canadians want to know that.
Let's stay with this Musk issue, because this is a really tricky one in this election.
Some of the survey findings from the observatory found that Canadians are more
concerned about American foreign interference than
any other country, including China and Russia
during this election, which is striking for a
whole host of reasons.
It's clearly the case that the US is, in some respects, a hostile actor in the Canadian discourse at the moment. And as Aengus said, the co-president, or whatever his role is,
owns one of the most influential communications platforms in our country. And yet, he's a public
figure who's allowed to speak and he can weigh in on the election in Canada if he wants to. I mean,
there's nothing illegal about that. So in this context, how should we determine
whether America is interfering in our election?
You don't ask easy questions. I mean, listen, Elon Musk, I think the most important thing to know and remember is that he's a government operative right now. I do worry when Elon Musk says something
that it's gonna become policy very soon.
Is that interference?
I mean, I think there have been instances
where American politicians in the past
have commented on Canadian politics.
You mentioned Obama.
I think Clinton also had done something like that
early on. I don't think that's interference of the same sort that we see coming from adversaries.
Now, I would not put it past Elon Musk to have a bot army that is giving some amplification to certain candidates on a certain side of the spectrum in Canadian politics. But until we see evidence of that, I wouldn't call it quite that. It's definitely nefarious, but it's not as covert as the sort of thing we're used to seeing. But you know what? It
gives me hope that Canadians see this for what it is and that they're more worried
about American interference than Russian or Chinese or Indian, because frankly, I think we are a bigger
threat right now, and the posturing that's happening should be really concerning. And it gives me hope that Canadians, frankly, see right through that and see it for what it is.
I guess one of the challenges here, right,
is that we're deeply interconnected
with the American ecosystem.
And like we are immersed in American voices all the time.
So how do we distinguish Musk torquing an algorithm or artificially inflating a discourse from Joe Rogan speaking to his millions of Canadian listeners, right?
How do we separate those two from each other?
Yeah, the thing for me is the risk here.
So like, there's what's happening right now.
Like Maxime Bernier, who is the People's Party leader here,
went on Alex Jones.
Actually, you know, the last three years,
80% of the terrorist suspects came from Canada.
80% of them, because of our mass immigration,
they're coming to Canada.
This is kind of above board in that, like, Canadians can go, okay, do I think our leaders
should be going and talking to Alex Jones? They can make kind of a decision about that. They can
be informed. We might judge their decision. Yeah, but that's great. We're in a democracy, and people are able to do that and can judge it. And that's great. That's exactly how it should be.
The fundamental security issue is that we would have no idea if Musk did that.
It would be so difficult for us to know.
Like at the observatory, we might detect, hey, there's a 15 to 30% spike over the last week in things like this. It's hard to know if that's organic.
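As a rough illustration of the kind of spike detection alluded to here, the sketch below compares the latest day's volume on a topic against a trailing average and flags jumps in the 15 to 30% range and above. The daily counts are invented, and real monitoring would be far more involved.

```python
from statistics import mean

def spike_ratio(daily_counts: list[int], window: int = 7) -> float:
    """Ratio of the latest day's volume to the trailing-window average."""
    baseline = mean(daily_counts[-window - 1:-1])  # the window days before the latest
    return daily_counts[-1] / baseline

counts = [410, 395, 430, 405, 420, 415, 400, 540]  # invented daily mentions of a topic
ratio = spike_ratio(counts)
if ratio >= 1.15:  # the 15 to 30% range mentioned in the conversation
    print(f"volume up {ratio - 1:.0%} vs. baseline: organic, or coordinated?")
```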
We saw this in the Romanian election, right? There was a huge question about TikTok and the explosion of Georgescu's campaign in the last two weeks of the election.
And can you just pause there for a second and explain what happened there?
Because it's interesting.
Yeah. So in the most recent Romanian election, it's a two-stage election, like in France. You have an initial slew of candidates, and the first- and second-place candidates go head to head in the second round, and that's who gets to be president of the country. And in the first round, in the last two weeks of the election, one of the candidates kind of came out of nowhere, polling at three, four percentage points, and then finished, I think, first or second, and that was Georgescu. And it turned
up that there was this sort of massive TikTok mobilization.
Half of Romania is on TikTok.
It has one of the highest saturations of TikTok users in the world.
You know, anyways, he was disqualified from running, and the court annulled the results of that election, and it was just this huge sort of thing. And some of the intelligence was around Russian interference in that election. Romania is obviously very attuned
to Russian interference and paying close attention to what's going on. It's hard to tell if that was
sort of a groundswell or how much nudging or pushing or shoving was from Russia there.
But the point is we can't know because we don't have visibility into the design of the system.
We have very little visibility.
That was the last two weeks of the election.
So this is the thing about democracies is there's this heated conversation and by the
time the dust settles and you're able to really assess what happened, it's too late.
The election is over.
The decision has been made.
And if it comes out later that there was a major, let's say that Musk
tweaked the algorithm in the last two weeks of the Canadian election,
and it doesn't get picked up.
And it affects the votes or the opinions of 200,000 Canadians.
And that changes a few seats and leads to a different government.
And that comes out two weeks later or three weeks later.
What do we do?
Like this is, this is kind of existential for a democracy.
Do you annul the election?
I mean, that's a pretty dramatic thing.
You need to have rock solid evidence.
Is X gonna give that to you?
Of course not.
And so, look, some might think, oh, this is being paranoid, this is unlikely to occur. But this is our democracy. This is our system of government.
If there is a 5% chance of something like this happening, we should be fundamentally concerned about it
because it's existential for our future. So I hope it is just a 5% risk. That's my hope,
is that this is a very small risk. But there is this real possibility that this could occur
and we wouldn't detect it.
There was a lot of concern this last year was going to be a year of deepfakes and AI in elections. Did you see that? How should we be thinking about the role AI might play in this
election? So I don't think AI was totally a nothing burger, but it wasn't nearly as bad as
everybody thought it was going to be. Early on in 2024, we had an audio deep fake of President Biden robocalling New Hampshire
Democrats and telling them not to turn out for the primaries there.
It's important that you save your vote for the November election.
Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.
Luckily, New Hampshirites are very smart and they were like, hang on a second,
this is weird. Why would Joe Biden who loves democracy tell anyone not to vote, especially
members of his own party? So that went up, I think, to the FCC, and they quickly passed a rule, actually targeting the telecoms, saying that they're not allowed to put forward robocalls that are AI-generated.
And audio deepfakes are some of the most difficult ones to detect because you don't have the
same kind of hallmarks that are visible to the naked eye when you're looking at a photo
or video as you do to the naked ear, so to speak.
And there were a variety of photo deepfakes, though I don't think anything that really tipped the scales: you know, President Trump posing with Black voters, all sorts of things like that, a lot of sexualized deepfakes of Vice President Harris.
And actually when it came to the foreign interference, as I said before, we saw the Russians actually
hiring actors and making videos rather than relying on deep fakes that could be quickly debunked.
The other thing, though, that we've started to get worried about at the American Sunlight Project is the poisoning of LLMs.
We have been looking at this network, the Pravda Network,
which is no relation to the news outlet Pravda.
They are a content aggregation network
that is run by some guy in Crimea.
He's pro-Russia, but not necessarily paid by the Kremlin.
No one has evidence of that yet.
And what this does is just regurgitate
a bunch of Russian propaganda translated
into a million different languages
on different web domains.
And it's pumping out at least 3.6 million pieces
of content per year.
Now, when you go on these sites, they're not very usable. Humans wouldn't find them pleasant to use; they'd find them confusing; the UX is just bad.
And so our hypothesis, since confirmed by two other research organizations, is that these sites
aren't made for human users. They're made to be scraped by the models that are training AI. And
if you ask the 13 major chatbots questions about things this content aggregation network has reported on, like who's responsible for the massacre in Bucha, for instance, they will then spit out Russian propaganda and source it back to this Pravda network.
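One hedged sketch of a defence against this kind of corpus poisoning: screen scraped URLs against a blocklist of known aggregation domains before they enter a training set. The domains below are placeholders, not the network's real hostnames, and a blocklist is only a partial mitigation.

```python
from urllib.parse import urlparse

# Hypothetical blocklist; these are placeholders, not the network's real domains.
BLOCKLIST = {"example-pravda-mirror.com", "example-aggregator.net"}

def is_blocked(doc_url: str) -> bool:
    """True if the URL's host is a blocklisted domain or one of its subdomains."""
    host = urlparse(doc_url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

scraped = [
    "https://example-pravda-mirror.com/fr/article-123",
    "https://news.example.org/report",
]
training_docs = [u for u in scraped if not is_blocked(u)]
print(training_docs)  # only the non-blocklisted document survives
```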
That is so interesting. We're creating a loop of disinformation inside the AI training.
Exactly. So I'm more worried about stuff like that for people who don't necessarily know, oh,
what is this website?
It seems legit.
I'm sure it's cited somewhere.
So like it's legit and it's in this AI model, so it must be okay, right?
But don't know that there is this network that somebody who is pro-Russia is building
for that very purpose.
Aengus, is that the vulnerability you see in AI too, that it's not just the one deepfake
video that can be easily debunked, but it's the flow of content that can be generated
by AI that could be the bigger concern?
Yeah, there's a few concerns with it.
The one that I'm most concerned about right now is just that it is an enabler.
It's so easy now to produce content at scale: images, video, audio, you know, stuff with meme potential.
Think about how AI is useful in making yourself productive in your day-to-day life, like with emails or helping with document editing or whatever.
Like speak for yourself, Aengus. We use our minds.
But it helps, right? Like it enables us. Well, it's like that, but ten times over, for somebody trying to manipulate the information environment. I mentioned that bot incident last year. That was pennies to produce that amount of text. Whereas before, to get that amount of distinct text, you would have had to have somebody write it. That would have been pretty costly. You don't need to do that anymore. And we're just beginning to see the use of it.
And you know, the use that we're seeing the most in Canada is for crypto scams.
This is the one we're seeing the most use.
We were talking about A/B testing.
There has been some crazy A/B testing on X in Canada over the last year and a half for crypto scams, and the A/B testing is quite sophisticated.
So it's different news sites, different AI generated images of famous Canadians or Canadian politicians,
different types of headlines, different languages.
And we've seen these ads at scale on X.
We see them now on Instagram and Facebook as well.
They're primarily driving people to crypto scams.
We saw one just last week about Carney being detained for sharing a secret in an interview. Guess what the secret was? A great way to make money quickly. So these things are happening, and it hasn't happened yet, but this could be very easily repurposed for political objectives. We haven't seen that yet. It's crypto rent-seeking, but hey, grift and mis- and disinformation are kind of two sides of the same coin.
We've covered a lot of ground here on the vulnerabilities and threats and risks
in democratic societies. What advice would you give to Canadians heading into an election
in this vulnerable, high-risk information ecosystem?
So, I mean, far be it from me to give advice to a country that I think is doing a much better job than my own.
But the advice that I give to every audience is just to remember that engagement equals
enragement.
And if you're feeling yourself getting emotional when you're consuming content online, think
for a second about who is giving you that content, what their purpose in spreading it
is. And then if you're still thinking about it a couple of minutes later, after you go take a walk and, as the kids say, touch grass, right?
Then you might wanna do a little bit of research
about where it's coming from
and only then should you share and amplify
and engage on that.
But just be aware of that, consume content deliberately
and know that you might be under kind of the spell of somebody for whom the truth is not at the heart of the matter.
Aengus, what do you want Canadians to know? And, early in the cycle, do you think this election is going to be pretty unfair?
The thing I've been sort of saying, and I think it's real, is: look, when you're consuming content, especially during an election, and you care about your community, you care about
your country, just be super intentional about it. And what that means for me is ask yourself,
is the person sharing this content Canadian? And do they care about me and my community?
That's it. It's actually really simple. And the answer to both of those has to be a vigorous yes.
It's not like a maybe.
Maybe they care about me.
Like, they might.
No, no.
If it's not a vigorous yes, then maybe during the election,
that isn't somebody you should be listening to.
For Canadians who spend a lot of time listening
to American content, just for the election,
just maybe they don't know what they're talking about.
Turn it down a bit.
Yeah.
Turn it down a bit. They really don't know what they're talking about.
They don't live here.
They don't know the country.
Everything that they say is about their political context and their life.
It's not about you or your community.
So that's sort of my advice for Canadians.
I think Canadians are pretty tuned into this.
And so far in the election, you know, there's been a lot of online manipulation stuff,
but it's not been too bad yet.
It's out there.
People who are spending time online are going to see it.
What I am kind of concerned about, and have already seen a lot of, is claims of foreign interference from different political parties, directed at others.
And we've seen this in the States too.
It's the weaponization of foreign interference that is a little bit boy-who-cried-wolf. It's a little bit like all foreign interference is just a political gimmick, and anyone can accuse anyone of it.
Well, and if you go too far down that road, you end up where we are now,
where I just came from a hearing where they were talking about how they had defunded the only entity in
the State Department that was working on foreign interference because of the politicization
of this issue.
So that's something we've got to be really careful about.
And I actually expect better from our political class: to be able to say, this information manipulation space is probably one where we want to be really careful with false accusations, where we want to be very cautious. But so far in the campaign, I'm not hopeful that that's going to be the case. But yeah, just be intentional.
Well, look, understanding the nature of this threat is both the hardest and the most important part of protecting ourselves from it, and I thank you both for the work you're doing on this.
Two of the best.
So thanks so much for joining me.
Thanks for having us.
Thanks, Taylor.
Machines Like Us is produced by Paradigms
in collaboration with The Globe and Mail.
Our senior producer is Mitchell Stewart.
Our associate producer is Sequoia Kim.
Our theme song is by Chris Kelly.
The executive producer of Paradigms is James Milward.
A special thanks to Matt Frainer and the team at The Globe and Mail.
If you liked the interview you just heard, please subscribe and leave a rating or a comment.
It really does help us get the show to as many people as possible.