Front Burner - Is Canada ready to combat election meddling online?
Episode Date: June 4, 2019
Democratic Institutions Minister Karina Gould on Canada's plan to deal with interference and disinformation ahead of the fall election...
Transcript
In the Dragon's Den, a simple pitch can lead to a life-changing connection.
Watch new episodes of Dragon's Den free on CBC Gem.
Brought to you in part by National Angel Capital Organization,
empowering Canada's entrepreneurs through angel investment and industry connections.
This is a CBC Podcast.
CPA 21, how do you read?
I really want to know what happened.
And it makes me extraordinarily angry that it's always been a big secret.
Uncover bomb on board.
Investigating the biggest unsolved mass murder in Canada.
CP Flight 21. Get the Uncover podcast for free on Apple Podcasts and Google Podcasts.
Available now.
Hello, I'm Jamie Poisson.
At this point, anytime there's a vote in a democratic country,
we hear a story about fake social media accounts and trolls mucking around in elections or campaigns with disinformation.
This happened in the US, the UK, France, and even last week in the European Union.
There were multiple, systematic efforts to interfere in our election.
Deploying its state-run media organizations to plant fake stories and photoshopped images.
If we want to protect our liberal democracies,
we must have strong legislation.
And so now we're worried it's going to happen here,
in Canada, during our election.
We've talked a lot about this on the podcast.
And today, we're talking to one of the people in charge
of actually doing something about it.
Karina Gould is Canada's Minister of Democratic Institutions.
We're going to talk to her about what the federal government has committed to doing
before the federal election in October,
and whether their proposals will actually address the problems.
This is Frontburner.
Hi, Minister Gould. Thanks so much for taking the time to talk to me today.
Oh, I'm glad to be here.
Good. So I want to talk to you about election interference today.
And look, we've seen it in every major election in the world recently.
A big part of your job is dealing with this in Canada.
And so let's go through some of the things that this government has done to address the issue of election interference.
Coles Notes version, and then we can unpack it more as this conversation rolls along.
One, you put in Bill C-76, which deals in part with election interference. And
correct me if I'm wrong, but my interpretation of it is that it says companies like Google and
Facebook have to set up a registry before and during election season of who is buying ads
on their platform. And advocacy groups, they're not allowed to use foreign money to fund their
campaigns. Am I right? That's correct. Okay. And another thing you guys have done, you've created
this five person committee made of deputy ministers to alert the public if there's election
interference during the election. And my understanding is that their focus is really on foreign interference.
Yep, that is correct as well. Because we already have Elections Canada to ensure the integrity of the election domestically. Okay. And tell me what else was part of that announcement back in
January. Yeah. So in January, we announced our four-point plan. So one was with regards to
citizen resilience, because we understand that an informed and engaged citizenry is really the best antidote to attempts to interfere in our electoral system.
And so part of that was the announcement of the protocol, which you alluded to, which is this five-member panel that will inform Canadians if there is foreign interference in our elections.
The second part of that was with regards to institutional resilience. And so ensuring that, you know, our Communications Security Establishment, which is our signals intelligence agency, is providing support to Elections Canada, but also to political parties.
The third pillar was with regards to actually combating foreign interference. And so we established what's called the SITE Task Force, which brings together CSIS, CSE, the RCMP, and Global Affairs,
all into one unit. And then the fourth one was with regards to social media and really outlining
the expectations that we have for how social media companies will act in the lead up to and
during our elections here in Canada. And last week, I announced the Canadian Declaration for
Electoral Integrity Online, which Google, Facebook and Microsoft have all signed on to.
This is essentially a non-binding agreement and it calls on the platforms to sort of step up their game when it comes to cracking down on disinformation and bots.
And they've committed to it, right?
You know, the three companies that I mentioned have all said that this is, you know, the standards that they will be relying on in the lead up to and during the election.
And so it gives Canadians kind of a temperature check, something to measure against for how they think those activities are carried out over the coming months.
And so I want to talk to you in a minute about whether or not you think that these companies are up to the task. But first, all of this stuff that we've just gone through, I have to be honest with you, it feels quite high level to me.
And so what I want to do with you today is get a sense of how this would work
practically. So if we could start with foreign interference. So let's say the
writ drops, it's election season, and there are a bunch of sketchy Twitter accounts, and they're
posting about something divisive, like pipelines or immigration. First, are there people in the
government who are watching for this kind of suspicious activity and flagging it?
So the first thing that I would say is that our signals intelligence agency doesn't monitor Canadian accounts. That's really important to note. So part of this is, it's kind of tricky, because, you know, what we saw with the Russian activities in the US presidential election, for example, is that they were posing as Americans and as American actors. And so what we do know, though, is that the social media
platforms did have insight into the fact that these were not U.S.-based accounts. We have
relationships with all of the platforms. And part of the declaration was to share information within,
you know, the appropriate legal framework that can help both the government of Canada, but also the platforms to identify malicious, inauthentic or foreign activity on the platforms.
So is someone watching, you know, these Twitter accounts that could be sowing divisive messages that just look off?
And then do you go to the social media platforms and ask them where they're originating from?
So we would have to have someone who's a bit more of an expert in the exact
details and mechanisms of this. I'm not an intelligence officer. I'm the minister.
Right.
But one of the tools that we have set up within Global Affairs Canada
is what we're calling the rapid response mechanism, which Canada is leading in partnership with our G7 allies to monitor foreign interference in elections around the world. It looks at kind of aggregate social media data and they can kind of see what's authentic and organic
content versus content that is very clearly coming out of, let's say, a troll farm in St. Petersburg.
So there's a number of different tools that are available to analyze what's going on. Even if that
was the case, right, it would be, you know, is this something that is garnering national attention?
Or is this something that is, you know, really just speaking to a few people here and there?
Everything will be really context dependent.
Okay, so let's say that there are a bunch of Twitter accounts swarming around these divisive
issues, and they look like using these tools, they're coming from a troll farm. What do you do?
What happens?
Let's say that it's not a couple of people. There's more going on here. Let's say that it's something that is making headlines across the country that is not based
on factual information, right? Or it's being manipulated in a way that's selling division.
Then that's something that the members of the protocol would make a
determination as to whether they inform the Canadian public, assuming that they have, you know,
credible sources saying that this is foreign interference. They would contact the prime
minister and the leaders of the political parties, they would contact the head of elections Canada,
and then they would issue a statement to the media to inform them of the information that they have. And so would something like that be enough for this committee
to tell the public that this is election meddling if you were able to trace those back to, let's say,
Iran or Venezuela or Russia? It's a little bit confusing to me what the threshold is here for
this committee to tell people that there has been foreign influence in
our elections? So again, it just depends on, you know, is this something that's capturing national
attention? Is this something that, you know, what is the impact of this interference having on public
discourse? Those are all things that they will have to determine. And that's why I'm a little
hesitant to say it would or wouldn't, because it's going to be context dependent.
And the other thing is, you know, there might be some interference campaigns that
are completely ineffective and have no bearing in terms of what people are talking about.
And then there might be others that are much more effective. Oftentimes, you know, you learn about these things after the
election. But part of what we tried to do with the protocol is to provide a mechanism to inform
Canadians when we do know something is happening. So we've tried to build a mechanism that has all
party support, that has all parties involved with it to try and build that trust amongst the players to
say this is not a question about partisanship. This is not a question of supporting one party
or the other. This is about our national security and the integrity of our elections.
Right. So I mean, look, talking to people who are pretty steeped in this issue, I keep hearing
these concerns, though, that it's not clear ahead of time what the threshold is for this committee.
So, for example, if we find out that the Chinese government is spreading rumors related to Meng Wanzhou and trade talks on WeChat,
and those messages are reaching thousands and thousands of Canadians,
is that enough for the committee to come forward and say
publicly that there are foreign entities meddling in our election?
So I think we can look to past examples, right? So what happened in the US presidential election,
I think is very clear that, you know, if there's a hack and a leak of political emails,
there's a good example. Same thing during the French presidential election.
We know as well from both the US presidential election, but also the UK Brexit campaign,
if there's a coordinated attempt to create false personas and pretend to be Canadians. You know, these are all things that I think Canadians can look to in terms of past examples, as well as coordinated campaigns to spread misinformation
by a foreign government. I don't want to say it will absolutely happen because, you know, it will be up to the panel to make that decision. The idea of creating this was specifically to address the kinds of issues that we've seen in previous elections that were maybe not addressed in a timely way, and that I think would have been important for those populations to know about.
So in your opinion, and I realize that this isn't your call at the end of the day, it's the committee's call: if there were rumors being spread on an app like WeChat about Meng Wanzhou and trade relations with China,
and it was traced back to China and it reached thousands of people,
would that meet the threshold for you to tell the Canadian public that there's foreign meddling in their elections?
Well, so I think it's really important that, you know, we be transparent with Canadians about the information that they're receiving.
So it would?
Yes.
Okay.
We'll be back in a second.
Discover what millions around the world already have.
Audible has Canada's largest library of audiobooks,
including exclusive content curated by and for Canadians.
Experience books in a whole new way, where stories are brought to life by powerful performances from renowned actors and narrators.
With the free Audible app, you can listen anytime, anywhere,
whether you're at home, in the car, or out on a
jog. The first 30 days of the Audible membership are free, including a free book. Go to www.audible.ca
slash cbc to learn more.
We've been talking about, you know, activity from Russia or Iran or China,
you know, these sort of coordinated Twitter campaigns.
And of course, you mentioned also these leaks and hacks that we saw in other elections.
But I also want to talk to you about what happens when there's domestic interference,
because we know that there are domestic actors that are using the exact same tactics and strategies as these foreign entities. And so what would happen if a video of conservative
leader Andrew Scheer starts circulating on Facebook and that video has been doctored in a
way that it makes it seem like Andrew Scheer is planning to ban abortion if elected. And of course,
that is not true. Andrew Scheer has not made any remarks
saying that he wants to legislate abortion at this time.
And this doctored video is made by a Canadian person.
And it starts circulating on Facebook
and it gets a million views.
What do you do?
Well, so I don't do anything
as the Minister of Democratic Institutions.
The government.
Yeah.
Well, nor does the government.
What happens is it would go through Elections Canada.
Okay.
And if there was a complaint, it would go through the commissioner of Canada elections who has been kind of staffing up and resourcing up to kind of deal with these things.
And does it only go there if there's a complaint or is someone from Elections Canada also looking for this?
No, it needs to be complaint driven.
Okay.
And what we did in C-76 was to clarify the language around false and misleading statements with regards to political actors, but also election officials.
Essentially, you can't impersonate an election official.
Right.
Is that fair to say?
Yeah, or like a candidate or a political party leader.
And we did make a caveat for satire.
However, that caveat would be that, you know,
you would have to make it clear that this is satirical, right?
So, and then the other thing that I would say is that
in the declaration that we released last week, the companies that signed on, Facebook including, made a commitment to remove inauthentic content on their platforms.
And the idea would be that Elections Canada would then inform Facebook or Twitter or whatever platform was hosting this video and ask them to remove it.
Yeah.
I want to talk to you about this because last week at the grand committee,
Canada hosted a committee of politicians from nearly a dozen countries,
which got the chance to grill representatives from Facebook,
Google,
and Twitter about how they manage data,
how they handle the spread of misinformation and whether they're a potential
threat to democracy.
There was this exchange between an MP from the UK,
I believe,
and one of the representatives of Facebook.
We are aggressively downranking that.
No, okay, so I know you're downranking it.
Why aren't you taking the film down?
And there was this video that's been going around Facebook of Nancy Pelosi,
and it was doctored to make it seem like she was drunk.
They just sort of slowed down the playback.
Do you not see that what Facebook is doing
is giving a green light to anyone in the world that wants to make a distorted or fake film about a senior politician?
I think you're right that the philosophical question is, should we remove it or should we inform people that it is fake?
We have taken the approach to inform people that it is fake.
Facebook did not take it down, although they did change their algorithm to make it appear less often. And millions of people have seen this.
And Facebook is saying that they're not taking it down because this is a freedom of speech issue
and also that the account belongs to a real person.
So it's not like if it was like a Russian bot or something
and they would take that down.
So why do we think that they would take down this video of Andrew Scheer
if we asked them to or if Elections Canada asked them to?
So if it was found to be illegal in Canada, we would expect that they'd take it down.
And one of the things that we have seen from the platforms is that
when something is clearly in violation of the law, they do remove it.
I think Facebook's response to the Grand Committee was confusing at best.
Very.
And I think part of what Facebook and the other platforms need to do is to be more clear and transparent with their users.
They all have terms of service.
They all have community standards.
I think their users, Canadian citizens and citizens around the world, should have the benefit of understanding how they're going to interpret those.
If it's on the line. Right. Like, you know, in Canada, we have fairly clear standards when it comes to what we accept during the election, which honestly is quite broad.
And we have to be thoughtful about ensuring that we are allowing people to express themselves freely, which is why, you know, we made that clear caveat around satire. But at the same time, you know, are you trying to change the dialogue
by, you know, sharing something that's clearly false about someone else? So these are, I think,
really important conversations that we're having, because you, on the radio or on TV or in print, probably wouldn't share that video of Ms. Pelosi on your platforms.
No, because we are also subject to, you know, libel laws and defamation laws.
Yeah.
Facebook has to kind of figure out where they fall in that.
So it's essentially, sorry, I don't want to interrupt you, but essentially what you're saying here is that this law on the books in Bill C-76, where you can't impersonate a politician, means Elections Canada could use that law to, if needed, force Facebook to take down a video like that.
Yeah, I think it would be an interesting test during the election to see how that goes.
Is your sense that these companies can do that?
Oh, absolutely. Yes.
I mean, I don't mean logistically.
I mean that they will do that.
Well, so we have seen all of the companies,
so Facebook, Twitter, Google, including YouTube,
over the past year really remove a lot of inauthentic content.
I mean, I think the pages and accounts that have been removed are in the billions, because they're either bots or they're specifically foreign.
So absolutely they can.
And I think they are doing it.
It's a little bit of whack-a-mole, though, too, right?
In the sense that it's pretty easy to open up a fake account, and it does require a lot of resources to deal with it.
But they are working that way.
To be fair, they do have a lot of money.
So they could certainly throw a lot of resources at this.
I just want to push back on this a little bit because, you know, I do understand that these platforms have made moves
to remove a lot of accounts and some information that's posted in the last year in particular.
But, you know, again, they're refusing to take down this video of Nancy Pelosi.
Facebook is, although YouTube has taken it down. Last week at the committee, an MP from Singapore
told a Facebook representative that the platform was told multiple times to take down threats of violence that seemed to predict the Easter bombing,
and they didn't take it down. The leader of the Easter terrorist bombings in Sri Lanka had posted
videos. Local Muslim leaders had flagged it to Facebook, and Facebook had not taken it down.
Let me suggest to you that you don't remove it because such content being sensational
is what drives eyeballs
to your platforms.
Mr. Tong, I reject the premise. I reject that full-heartedly. I mean, these
platforms also, as we've seen, have a terrible record, particularly when it comes to protecting
privacy. And Mark Zuckerberg has apologized more times for the same thing than I can count.
This was a major breach of trust.
And I'm really sorry that this happened.
That was a mistake.
And I'm sorry for it.
And it was my mistake.
And I'm sorry.
So why do you think that they're just going to pull this together?
They've now overseen the UK election,
like multiple elections, Brazil, the EU,
where disinformation has been rife.
So I would just push back in that I wouldn't say that I think that they will do all of these things.
Okay.
Your question was, do I think they can?
Yeah. Yeah. Yeah, they certainly can.
And they absolutely can, right? I mean, when you're talking about, you know,
companies that are worth multiple billions of dollars and, you know, they have quite small workforces,
actually, they absolutely can
put the resources behind this and make changes and be good actors, right? I mean, you know,
there's some pretty abhorrent stuff that has transpired because of the business models and
because of the decisions that they have made, right? I mean, one of the things that the platforms
push back against is that they don't have any responsibility.
They don't have any control of what's happening on their platforms.
And I think that that notion has been blown apart in that there's very clearly editorial decisions that are made as they develop their algorithms.
I don't think the answer to solving these problems is more AI, right? I think the answer is likely a regulatory framework to protect Canadians and to protect citizens. We are establishing the baseline
with regards to what we expect for them for how they should behave. And, you know, we're going
to see over the next couple of months whether they fulfill that or not. I want to be optimistic.
Do I have total confidence? No.
But I think we have to demonstrate here in Canada what our expectations are.
And if they're not meeting them, then, you know, whoever comes, I've said this numerous times, into government after October 21st, that regulation is likely the response.
You've been talking about regulating these companies for months.
And during this time, you know, France has passed a law
which aims to empower judges to order the removal of fake news.
In Germany, tech giants can be fined up to 50 million euros
if they don't remove fake news.
So why are we waiting until after our election to make a move here? So I mean,
I already have regulated social media with regards to ads, right? And part of the challenge is,
you know, since I've been in this portfolio, which is about two and a half years now,
every day we learn something new in terms of how they're operating and what they're doing.
And I think we're actually now at a point globally where we have
like-minded democracies coming together to say we need global norms, and then we need domestic
approaches. I think the German example is interesting, but that's been really criticized
as well. I mean, in terms of the rollout and the implications for freedom of expression.
The France example, I think the white paper that they just released about embedding regulators within the tech companies
is an interesting one. And this is a big undertaking. So while it's easy to say,
like, why haven't you just regulated? Well, we also want to make sure that we're not doing
something that's going to create a barrier to entry so high that it would make it difficult for new players to get into the market.
Right. The idea being that no company wants to enter into a market when they have 7 million rules that they have to follow in order to get there.
It just makes it much more complicated for them.
And we can do an entire other podcast conversation around the role of antitrust and whether or not these companies
should be broken up, but we don't have time for that today. But look, look, I just want to do this
one more time with you. You know, I take your point that this is complicated and that there
are a lot of issues around free speech in particular, but what about the argument that
when it comes to the content, the speech, like, we actually don't live in a country where we have
unfettered free speech rights, you're not allowed to say whatever you want. And these platforms
have algorithms that amplify hate speech and misinformation and disinformation. And at what
point is there going to be some regulations around them to say, look, like, you don't have the right to amplify this stuff.
This is actually undermining our democracy.
So, Jamie, what I would say is that I don't disagree with you on that and that actually the government is working on this right now.
We want to get it right.
And we want to make sure that we're protecting people's freedom of expression,
but also protecting people's safety. And this is something that, you know, I'm so glad and grateful
for the work of the Ethics Committee, and for the Grand Committee. And, you know, now the government
is taking that work, alongside the work that we've already been doing to develop a plan. I mean, this is normal policy process,
right? You know, committees of parliamentarians do their work, they provide a report to the
government, and the government provides a response. And this is why I say all the time when asked
about it that, you know, whoever forms government in October will have to deal with this issue. And I think what's been very clear is that
we've set out expectations of how these platforms should be acting. I hope they live up to these
expectations. But if and when they do not, we will have a plan to ensure that the rights of Canadians
are being protected. I mean, one of the challenges, I think, is that we live in a world
where everything is very on demand
and things happen very quickly.
But when it comes to regulation,
when it comes to policy development,
we need to be thoughtful in how we do this
and we need to make sure
that we're getting at the fundamental issues.
I mean, really getting at the business model as well
to make sure that we don't inadvertently
make things worse, but we actually make things better. Minister Gould, thank you so much for this
conversation. I really appreciate your candor and I really enjoyed it. I hope that you'll come back.
Yeah, my pleasure. Anytime.
Before we go today, and while we're on the topic of big tech platforms and the ways in which their algorithms work,
I'd like to flag a New York Times investigation published on Monday.
It looks at how YouTube's algorithm has been pushing home videos of partially clothed young children to users who are already watching sexually themed content.
For example, a person could start watching an erotic video on YouTube,
then be recommended videos of women who are younger,
then women who pose provocatively in children's clothes.
And some people might be delivered an otherwise innocuous home video
of a five or six-year-old child playing in a pool.
This is all according to the New York Times.
I found this piece fascinating, and I think you might feel the same way.
That's all for today. I'm Jamie Poisson. Thanks for listening to Frontburner.
For more CBC Podcasts, go to cbc.ca slash podcasts.
It's 2011 and the Arab Spring is raging.
A lesbian activist in Syria starts a blog. She names it Gay Girl in Damascus.
Am I crazy?
Maybe.
As her profile grows,
so does the danger.
The subject of the email was
please read this while sitting down.
It's like a genie came out of the bottle
and you can't put it back.
Gay Girl Gone.
Available now.