Ideas - Massey at 60: Ron Deibert on how spyware is changing the nature of authority today
Episode Date: April 25, 2024
Citizen Lab founder and director Ron Deibert reflects on what's changed in the world of spyware, surveillance, and social media since he delivered his 2020 CBC Massey Lectures, Reset: Reclaiming the Internet for Civil Society. *This episode is part of an ongoing series of episodes marking the 60th anniversary of Massey College, a partner in the Massey Lectures.
Transcript
Hey there, I'm Kathleen Goldhar and I have a confession to make. I am a true crime fanatic.
I devour books and films and most of all true crime podcasts. But sometimes I just want to
know more. I want to go deeper. And that's where my podcast Crime Story comes in. Every week I go
behind the scenes with the creators of the best in true crime. I chat with the hosts of Scamanda, Teacher's Pet, Bone Valley,
the list goes on. For the insider scoop, find Crime Story in your podcast app.
This is a CBC Podcast.
Welcome to Ideas. I'm Nahlah Ayed.
Look at that device in your hand.
No, really, take a good long look at it.
You carry it around with you wherever you go.
You sleep with it, work with it, run with it, you play games on it.
You depend on it and panic when you can't find it.
It links you to your relatives and kids.
You take photos and videos with it and share
them with friends and family. It alerts you to public emergencies and reminds you of hair
appointments. Traffic is light. If you leave now, you will be on time. In 2020, CBC Massey lecturer
Ron Deibert asked us all to examine the role of the internet in our lives and in our politics.
You check your social media account and it feels like a toxic mess, but you can't help but swipe for more.
Right-wing neo-fascist populism flourishes online and off, igniting hatred, murder, and even genocide.
Ron Deibert wrote his lectures in the early months of the COVID-19 pandemic, when dangerous trends he'd been studying for years began to accelerate.
Trends that made the utopian vision of the internet as a force for good seem like a distant dream.
For much of the 2000s, technology enthusiasts applauded each new
innovation as a way to bring people closer together and revitalize democracy. Now, social
media are increasingly perceived as contributing to a kind of social sickness. And the cure for
that sickness, Deibert argued, isn't to throw our devices away.
There is another way forward.
And in that spirit, he called his Massey lectures Reset: Reclaiming the Internet for Civil Society.
Resets allow time to regroup, clean house, take stock and look at the big picture, and launch a new plan.
Ron Deibert joined Ideas producer Pauline Holdsworth in conversation about Reset as part of a series marking the 60th anniversary of Massey College,
one of our partners in the Massey Lectures.
So it's very nice to be here in person with all of you.
Their conversation was recorded in front of a live
audience at Massey College. Thank you for having me. Thanks everybody for coming.
Ron Deibert is Professor of Political Science and Director of the Citizen Lab at the Munk School of Global Affairs and Public Policy at the University of Toronto. The Citizen Lab undertakes interdisciplinary research at the intersection of global security, information and communication technologies,
and human rights. As Edward Snowden puts it, no one has done more than Ron Deibert and his lab
to expose the enemies of the internet. So as I was rereading these lectures, something that
really struck me is the fact that you're writing about something that is constantly in flux and constantly evolving.
So I wanted to start by asking you, in just the three years since you wrote Reset, what do you see as some of the most significant changes?
You know, it was interesting to be writing this book during the pandemic.
I was working on it and then we had the lockdown, and I was writing it in one of my sons' bedrooms, because he was away at university, and so I was stuck there. I literally wrote most of it while lying in bed. I hope that's okay to say.
A great tradition of writers writing in bed. Yeah, exactly. And I didn't get to do the cross-country tour. I
recorded in the empty, almost haunted, post-apocalyptic CBC studio. I was saying earlier
that when we walked in, there were literally coffee cups that still had coffee in them and
sandwiches and so on. It was really quite spooky. But clearly, as I was writing, I was thinking,
okay, this is a pretty momentous event.
And it's obvious that a lot of the themes that I'm describing here are playing themselves out
with the pandemic. And, you know, when I think about what has changed since then, part of my
answer will probably sound a bit pessimistic, because I think my brain is wired that way,
as principally an investigator. And looking at the pandemic, I could see at the
time that, okay, we're clearly now accelerating our reliance on a technological ecosystem that
has all of these dysfunctionalities. And then, of course, there was all the disinformation,
which I think we'll probably talk about later. Social media are designed, and this is one of the major themes of the book, to capture and retain users. After all, it is us and all of our data that drives the personal data surveillance
economy. In order for them to extract that data from us, they have to keep us engaged. It's just
a very simple mechanistic explanation of the underlying business model.
And unfortunately, the human condition being what it is, people are attracted to sensational extreme content.
That has unfortunately gotten a lot worse since the pandemic.
The major platforms put a lot of effort, in the early days, into correcting false information and trying to combat disinformation, because they were under a spotlight. Now that the pandemic has subsided a bit and that spotlight has become less intense, they have very quickly laid off most of those trust and safety teams.
And so we're seeing an ever-widening, deepening crisis when it comes to the kind of dysfunctionality around a lot of social media.
There is an inertia, an inexorable logic to surveillance capitalism.
This logic, manifest in each and every new social media innovation, compels platforms to acquire data about consumers from ever more fine-grained, distributed, and overlapping sources of information.
These sources dig deeper into our habits, our social relationships, our tastes, our thoughts, our heartbeats, our energy consumption, our sleep
patterns, and so on. Each new source of data is like a door that opens, only to present yet another
door to be opened. Amazon is developing drones to deliver their packages. Why not equip them with
cameras to map the houses to see if their customers might want
to purchase some new gutters? And now that we have that imagery, why don't we market it to the local
police too? Google's vehicles roam the streets collecting imagery for their street view feature.
Why not vacuum up internet data from open Wi-Fi hotspots while we're at it? Put simply, there is never too much data.
Sensors are built upon sensors in an endless quest for yet more data about our lives.
You write in the lectures about a kind of inertia with surveillance capitalism, that it's...
Nothing to do with me, I swear.
Emergency alert. Alert. This is a test of the Ontario Alert Ready system. There is no danger to your health or safety. Had this been an actual alert, you would now see instructions for protecting yourself.
For more information about emergency alerts, please visit www.alertready.ca.
So perhaps an example of a positive aspect of the technology.
But then again, you see that there's an algorithmic quality to it, a depersonalization. Of course, the telecommunications companies, at the behest of the government, are able to reach inside every one of our pockets; in this case, it might be for good reason. But of course, there are other ways in which that pipeline right into the most intimate aspects of our lives can lead to all sorts of abuses.
So I hope that segue was good.
That was an excellent segue.
So picking up on that, I mean, there was a lot happening in the early days of COVID of exactly this nature.
You know, sort of tracking systems,
if you had come within a certain distance
of somebody who had tested positive, emergency alerts,
and you write at that moment about the danger of some of this being
normalized, and that when something is normalized, that we have a kind of inertia, that it's very
hard for us to unnormalize this. Are there particular things or forms of incursion into
our privacy that we have gotten used to in the COVID era that we should not have gotten used to?
Most definitely. I mean, there are many, many examples. But since we're here at Massey College at the university, I'll say that one of
my biggest concerns early on in the pandemic, both as a parent and an instructor, was to learn about
the ways in which universities were pushing out systems to monitor students during exams, to
essentially set up remote proctoring systems to prevent people
from cheating. These systems, you can understand the logic behind them, and how some well-intentioned administrator and perhaps a business person, you know, converged around this idea.
Obviously, what are we going to do? Students are taking exams from their own homes.
How will we know if they are going to open up a book or cheat or whatever?
I've got a brilliant idea.
Let's plant some kind of technology, make it mandatory, on their desktop computers, on their mobile phones,
which will watch everything they're doing, record all their keystrokes, place cameras that focus in on their eyeballs to see if they're moving and maybe looking at something that they shouldn't be. What could possibly go wrong with
that? Very quickly, it became apparent that this was not very effective. So the studies that were
done afterwards show that this doesn't really prevent cheating. But it embodied all sorts of discriminatory practices.
People of color, for example, reported that they had much more difficulty setting up the systems to recognize their faces, which sounds kind of trivial.
Oh, that poor person has to take an extra 15 minutes to set up. But if you imagine the mindset of a student, I can still recall the anxiety and the fear that I'd have going into a gymnasium to write an exam. It's
really unfair to put that additional pressure on. And now, although I think there's been some
resistance to that type of remote proctoring, I can see that systems like it have now become deeply embedded at the University of Toronto and other universities as well. You know, I think society,
when we go through crises like this, or even just a major event: a sporting event is another good example, like the World Cup or the Olympics. If you look at any host city where the Olympics or the World Cup have been held,
prior to the event, the government invests in a lot of surveillance technology, CCTV cameras,
AI-empowered systems, and so forth, for good reason, to keep people safe. But then after the
event has finished, it's very customary for those systems to remain in
place and then get used for other purposes. That may not be a bad thing were it not for the fact
that usually there are not accompanying restraints. And by restraints, I mean proper oversight
mechanisms to prevent the data that is being collected from being used for purposes that can harm people
or undermine civil liberties or human rights.
It's late 2013.
Blockbuster reports of state surveillance are dominating the news cycle,
courtesy of former National Security Agency contractor and whistleblower
Edward Snowden. I settle into a lounge in Toronto's Pearson International Airport and boot up my
laptop. I click the I accept button for the Wi-Fi agreement and connect with a virtual private
network to Citizen Lab's servers, effectively wrapping my internet connection in an encrypted tunnel.
Reflexively, I pause to consider whether I've made any minor errors in my digital security routine
that would inadvertently expose my sensitive communications. Does this really protect me
from a sophisticated threat actor? There is slight relief as the VPN flashes connect
and the encryption is complete, but the anxiety never fully disappears.
Among the emails in my inbox is one from an investigative journalist from the Canadian Broadcasting Corporation who wants to speak to me.
I plug my headphones into my iPhone and call. He has a lead on the Snowden disclosures.
Greenwald and company have shared with us some of Snowden's cache that relates to Canada's spy agency, he explains,
referring to journalist Glenn Greenwald, one of a very small handful to whom Snowden entrusted
his materials, and we want to get your confidential input. Oh really? Sounds interesting, I reply. Tell me more.
It seems to be some kind of top-secret program to spy on Canadians here in Canada, he says in
hushed tones, as if whispering over the phone would shield us from unwanted surveillance.
It's difficult to interpret the slides, he continues, but it looks to be some kind of real-world, proof-of-concept experiment
in which CSEC, the Communications Security Establishment, Canada's signals intelligence agency,
is tracking travellers by hacking into Wi-Fi hotspots in domestic airport terminals and lounges, including Toronto's Pearson Airport.
Toronto's Pearson Airport? Hacking Wi-Fi hotspots in lounges?
I look around me with new apprehension, over my shoulder, down at my laptop, at the mobile phone in my hand.
I focus on the ceiling above and survey the scattered Wi-Fi routers with their green lights and antennae.
What once seemed innocuous suddenly feels ominous.
Each of the lectures in Reset, for those of you who haven't heard or read them yet,
many of them open with a really vivid moment in time and place.
And so I just wonder, of the last three years,
if you were writing a coda now,
is there a particular moment that you would pick?
Yeah, that was a fun part of the book.
And by the way, I should mention the editor, Janie Yoon,
who I worked with for the manuscript.
She was working for House of Anansi Press then, and I really benefited from her wisdom. When I sent her my early drafts, I had these opening sections of each chapter that were essentially a narrative, a story. And she really liked those, to the point where she then
commissioned me afterwards to write a book, which I'm finishing up now. She said, I want you to write
a book that's nothing but that. So I've had fun completing that manuscript. Most of them are kind of horror stories for the most part.
Given the nature of the Citizen Lab, doing the type of work that we do, there are a lot of experiences with unpleasant bad people, usually some autocrat somewhere around the world,
or some private intelligence agency that's tracking us down,
which is one of the stories that I open a chapter with.
I would say two stick out. One is, I didn't get to write this in Reset, but after Reset came out,
the Citizen Lab did a major investigation exposing a massive domestic espionage campaign in Spain targeting Catalan civil society.
I decided after that report came out, I wanted to travel to Spain, to Barcelona, in order to meet with, I think there were 65 separate victims that we had worked with over two years. And I
thought I want to do a follow-up very quickly to talk to them about their experiences and just
gather more data because we weren't able to collect as much forensic data as I thought
was available. I actually flew to Frankfurt and then on to Spain, in order to avoid flying directly into Spain. I was very concerned about
the fact that our report had generated this intense controversy. And I thought it'd be just
safer to travel to Europe through Germany, avoid Spanish customs directly. You can't lie to a
customs officer, but maybe you can evade it in certain ways.
Anyway, I arrived in Barcelona from Frankfurt, stepped off the airplane,
and looked down at a newspaper to learn that the head of the Spanish intelligence agency was fired because of our report,
which was a pretty big impact.
In terms of impacts of the Citizen Lab's work,
that's one of the largest.
And it was pretty unsettling to be in that position,
walking off the airplane.
Okay, there are a lot of people that are going to want to know what the director of the Citizen Lab has on his laptop.
And now there are a lot of very powerful people
that are probably pretty upset at the fact that we published this report,
and here I am just stepping into it.
So that would have been a good story to open up with.
And that does, you know, it does give us a sense
of what the real-time, real-world impact of this is,
you know, alongside some of these sort of more horror stories
of encountering the Mossad agent
or whoever it is who is trying to find out what you're doing.
That wasn't fun.
That didn't sound fun.
Since you wrote Reset, there have been some significant changes
to X, the platform formerly known as Twitter,
and Meta platforms like Facebook and Instagram.
This is something that we've been thinking about a lot in our world of journalism. I want to ask
you what those changes mean, in particular for our understanding of war and crises that are
unfolding around us in real time. Yeah, it's a disaster, actually, I would say. If you were to write a dystopian science fiction novel
and try to conjure up a scenario that would be very bleak,
you couldn't do better than just looking at what's happened
with Twitter, X, and also Facebook.
You have the world's wealthiest person
who purchased Twitter seemingly out of vindictive motivations.
I'm not sure what the game plan was there. It doesn't really matter.
The point of it being that you had a single person, an oligarch really, who took over what has unfortunately become one of our principal public spheres and immediately set into motion all sorts
of ad hoc, arbitrary, personalized decisions around how to do something so important to that public sphere, in terms of things like content moderation, reducing harm, and what type of conversations can happen and how. They were all seemingly decisions undertaken on a whim.
You know, for those of you who are into political philosophy,
and maybe you've read Jürgen Habermas,
he has this book about the ideal conditions for the public sphere.
Well, in every aspect of what went on with Elon Musk taking over Twitter,
you have the antithesis of it, right?
So Habermas
would be freaking out if you were to present this scenario to him. So you're asking about
armed conflict. I mean, just from a basic level, you know, in something so serious as a situation
where you have violence occurring, you want as much as possible to have flows of information
happening. So, you know, on one level, just people's safety.
On another level, doing something like documenting war crimes,
which can end up being very important later on for reasons of justice and holding people accountable.
None of that is really possible or at least easy today
because the entire information ecosystem is
working in a dysfunctional way, entirely contrary to those aims. So everybody who works in my field
and some of the people who work on documenting war crimes that I speak to, they're almost
resigned to the idea that it's impossible to do this work in the face of it.
The flood of disinformation is just extraordinary right now.
What kind of conversations are you having at Citizen Lab and with your other collaborators about the impact of all of this on elections?
There's a lot of attention to elections, and elections are an important part of the democratic process.
The same syndromes are playing themselves out there.
In Canada, we hear a lot about so-called foreign interference in our election processes.
And usually people speak of that with respect to two protagonists, China and Russia,
and we're concerned about those governments' attempts to influence our elections,
which no doubt is going on. But to me, that's kind of a symptom of a much deeper, wider problem
that we have as a society that precedes something like an election process. We can't even have
normal conversations like this. This is a point that Astra Taylor, the 2023 Massey Lecturer, makes a lot in her work on democracy and her film What Is Democracy?, that democracy is not a one-time action that happens at elections, but something that is, or should be, a very profound part of the fabric of our daily lives, our physical spaces.
In what ways are some of the forces that you study kind of undermining that kind of larger democratic fabric that might get less attention?
The biggest concern, or one of the biggest concerns I have right now,
is the spread of impunity along with authoritarian practices.
And again, here, I think there's a tendency,
maybe because of the way that the international system
has been structured for centuries, to think about these things in terms of liberal democratic states over here
and bad authoritarian governments over there.
I think that category doesn't make sense any longer because the real concerns I have are around authoritarian structures, which cut across those boundaries.
You can see elements of authoritarianism in the United States. We don't need to look very far to
find evidence of that right now. And then layer on top of it the fact that we face these very
serious existential crises. And if we are going to deal with them, with those structures in place,
it'll be extremely difficult. So we need to somehow solve all of this in order to address
what are the most pressing risks to the human species and to the planet.
So that's how I see it.
It's less to do with technology.
It's more about the phase of history that we're in right now.
Very, very dangerous time. On Ideas, you're listening to Ron Deibert reflecting on his 2020 CBC Massey Lectures, entitled Reset: Reclaiming the Internet for Civil Society.
You can find Ideas wherever you get your podcasts.
And on CBC Radio 1 in Canada, across North America, on US Public Radio, and on Sirius XM,
in Australia, on ABC Radio National, and around the world at cbc.ca slash ideas.
I'm Nahlah Ayed.
My name is Graham Isador.
I have a progressive eye disease called keratoconus.
Knowing I'm losing my vision has been hard,
but explaining it to other people has been harder.
Lately, I've been trying to talk about it.
Short Sighted is an attempt to explain what vision loss feels like
by exploring how it sounds.
By sharing my story,
we get into all the things you don't see
about hidden disabilities.
Short Sighted, from CBC's Personally,
available now.
Tuesday, October 2nd, 2018.
10:09 p.m., Den Haag, Netherlands. 12:09 a.m., Istanbul, Turkey. A WhatsApp message
appears. I'm not
freaking out, but Mr. Jamal Khashoggi
was kidnapped this morning, so I'm not
pretty sure about what is going on.
Jamal Khashoggi was a
Saudi journalist and dissident
who was assassinated at the
Saudi consulate in Istanbul
in 2018 by agents of the Saudi government.
And that message Ron Deibert received was from Omar Abdulaziz, a Saudi activist now living in Canada.
I looked down at Omar's text as I checked into my hotel room for a cybersecurity conference in Den Haag, Netherlands. At that precise moment,
nearly 3,000 kilometers away in Istanbul, Turkey, a macabre, premeditated act of murder was being
covered up by the butchers responsible. At that time, like just about everyone else in the world,
I had no clue about the gruesome murder.
My pressing concern was with the urgent text message glowing up at me from my phone as I looked out my hotel room window into the night.
Jamal Khashoggi is missing? I asked myself.
What does this have to do with Omar? Why is he so frightened?
The link connecting Ron Deibert, Jamal Khashoggi, and Omar Abdulaziz
was a piece of spyware called Pegasus.
Pegasus allowed operatives of Saudi Prince Mohammed bin Salman
to get inside Omar's phone while he corresponded with Jamal Khashoggi.
Thanks to Pegasus, they could observe his every movement.
They could see his Twitter posts,
emails, and SMS messages,
even while they were being composed.
They could turn on the camera
and microphone to record his meetings,
or gather incriminating pictures
of Omar's private affairs
that in turn could be used for blackmail.
But most insidiously, they could silently observe,
as if looking over his shoulder,
while Omar and his friend Khashoggi hatched plans to mobilize opposition to MBS and the Saudi regime.
Pegasus is the product of a company that Ron Deibert and his colleagues at Citizen Lab have been tracking
for years. Perhaps the most notorious of the spyware companies we at Citizen Lab have been tracking, widely considered among the most sophisticated, is Israel-based NSO Group, also known as Q Cyber Technologies, a company closely aligned with the Israeli Ministry of Defense.
Far from taming abuses connected to the spyware market, Israel's Ministry of Defense routinely
grants export licenses for NSO's sales, as well as those of other Israel-based surveillance companies.
NSO Group first came onto our radar in August 2016 when award-winning United Arab
Emirates-based human rights activist Ahmed Mansoor received two text messages on his iPhone
purporting to show evidence of torture in UAE prisons. As a human rights defender,
Ahmed might have been tempted to click on those links.
Instead, he forwarded them to Citizen Lab for analysis.
Clicking on those links in a laboratory setting allowed us to infect an iPhone we controlled
and inspect a copy of NSO Group's custom Pegasus spyware.
The spyware was extraordinarily sophisticated and included exploits that took
advantage of three separate flaws in Apple's operating system that even Apple itself was
unaware of at the time. Throughout 2017 and 2018, we partnered with Mexican human rights investigators at organizations like SocialTIC and R3D to follow up on technical leads from our network scanning to identify abusive targeting in Mexico.
We identified journalists, lawyers, and even international investigators into mass disappearances whose phones had received text messages embedded with NSO-laden links. We found that the Mexican government operators of the spyware would even attempt to infect the devices of targets' friends and family members. Two days after Mexican investigative journalist Javier Valdez Cardenas was gunned
down in the streets of Mexico in a cartel-linked murder, we confirmed that his wife and colleagues
received SMS messages tainted with NSO spyware links purporting to show evidence of who was
responsible for the killings. Similarly, we discovered that the devices
of a group of international investigators
looking into a grotesque 2014 mass disappearance
of 43 Mexican students
were all targeted with NSO's spyware.
We've documented thousands of cases like this.
I think the most serious single-issue crisis when it comes to civil society right now is the unregulated proliferation of this type of highly invasive technology.
A government autocrat can get inside the phone of anyone anywhere in the world, and I'm describing the latest version of this technology, with no need to interact with that target.
The target has no idea.
One minute their phone is entirely, perfectly fine,
sitting on their bedside table.
With the click of a button, the next moment it's funneling data to a bunker in Riyadh, Saudi Arabia.
This is a recipe for disaster.
As part of a series marking the 60th anniversary of Massey College,
Ron Deibert spoke with Ideas producer Pauline Holdsworth about what's changed since 2020
when Reset: Reclaiming the Internet for Civil Society came out.
You write that this kind of technology
gives governments an, quote, almost godlike power to get inside people's lives. What does that mean
for the nature of authority in the 21st century? Well, I think we are seeing the consequences of this in what I described earlier: the spread of authoritarian practices. There's now a concept that we have helped try to articulate and raise awareness about called transnational repression.
So when you think about authoritarian governments, you think about what they are doing domestically.
And certainly that goes on.
They are doing stuff within their borders to prevent dissent and control populations. But now,
even one of the poorest countries in the world: we discovered that Ethiopia, thanks to one of these
mercenary spyware firms, was undertaking cyber espionage against targets in more than 25
countries, including Ethiopians here in Toronto. So you're giving capabilities to some of the world's worst autocrats to track
people and neutralize them, regardless of where they are anywhere in the world. A lot of people
come to a country like Canada for their safety, thinking, okay, I can come to this country,
as long as I'm out of the reach, the physical reach, of my adversary, I should be fine. But what they're finding is that that's not the case at all. Mohammed bin Salman can be not only inside your bedroom, but inside your head.
That is terrifically demobilizing for people. Even if they're not hacked and they learn about this,
the fear leads to a kind of paralysis and trauma.
The psychological trauma that we've documented of victims worldwide is astounding because of this type of surveillance.
Are there any particular developments in the last two to three years that, to you, are most notable for how this technology is playing out?
Thankfully, we've had some progress in that area with the Biden administration.
Remarkably, President Biden signed an executive order recently trying to at least stem some of the harms around this industry that we've identified. At the lab, we've gotten very good
at doing this type of work. Several times we've captured these very sophisticated exploits that
are being sold to governments
and have done responsible disclosures to the tech platforms, Apple and others,
who have tried to patch the problems that are being exploited.
And we're really trying hard to alert governments to the issue and hope that some wake up to the problem.
With our investigations, we do forensic work with victims. So what we do is we actually are able to, with their consent,
under research ethics protocols, examine data from their devices. And in several notable instances,
we've actually been able to acquire copies of the exploits that are used by the spyware firms to hack a device.
That is something that's very difficult to do because you have highly trained engineers
whose job it is to construct an exploit and hack into a target's device in a way that evades
forensic analysis. In fact, some of these firms actually advertise to hire people
specifically to evade the Citizen Lab. But we've been able to have some successes where we realize,
oh my God, this victim has been hacked. And not only that, but the engineers made a mistake. We
were able to capture a copy of this exploit. An exploit, just to give you some sense: if you were to go out and try to purchase one, as some of these firms do, there's a market for these exploits, and they typically run about a million dollars each for an Apple exploit. So this is very
expensive technology that has to be used. Each time that we capture one and do a responsible
disclosure to one of the vendors, we're effectively disarming momentarily one of these mercenary spyware
firms, but we're also raising the cost for them because they have to go back to the drawing board.
It must be very frustrating for them. We did the responsible disclosures to Apple several times. Apple, actually, is a company that's very difficult to work with, frankly. They're very closed; they're not transparent about what they do. I have many criticisms of Apple, so I'm not here to promote them. But they did do something quite positive. First of all, they decided they're going to sue one of these manufacturers of mercenary spyware in U.S. court, NSO Group, and that litigation is proceeding.
Then they decided to do something remarkable, which is notify every victim that they could discover of spyware and send them an alert.
And as these alerts go out, the way I think about it is they're shaking a tree, and all of these victims fall to the ground because Apple's notifying them.
And eventually they make their way to either us or one of our partners. So it's like a triage
that Apple is doing for us. And we've had a number of cases now where we've identified dozens of victims in specific countries: El Salvador, Spain, which I mentioned earlier, and Thailand, where there were 35 victims in the middle of protests. So
Apple's doing this great service by notifying people. I think that's something very positive.
So if you just want to put your hand up, Emily Mockler will come around with the microphone.
Thank you so much for that.
My name is Sabrina Dellen, and I'm at the Samara Centre for Democracy, where we study technology's influence on our democracy, and we also care about youth civic engagement.
So my question is, you opened by saying you started writing your book in your
son's room, you're a parent, and I want to know about how it works in your world with your kids
and technology and how you keep them safe and what your tips are for us.
Yeah, you know, honestly, I don't pretend to have any special answer on that.
I think that I've tried as much as possible just to do the right thing.
And in fact, I've learned a lot from my children, because, as I think is apparent to a lot of people, children encounter technologies that a lot of other people learn about only after the fact.
And so you learn by speaking to them about how they've used it.
I also think my kids, one of whom is here, by the way, Rosalind,
they were very close to it and they've observed what I've been doing professionally, even perhaps when they were too young to really appreciate the full scope of what was going on.
I think it's fair to say, maybe Rosalind will back me up here or not,
that they learn by witnessing the work that we do.
And I think I know in talking to them, they all have their radar up about these things.
But putting aside my own personal experiences, I think it's really tough.
I see now young kids walking to school with cell phones.
I just think, oh my God, it just must be such a challenge to just manage that
because the pressure is so great.
By the way, all sorts of evidence is now emerging from a new whistleblower at Meta
specifically about Instagram's harms to children
and how the company knew about those harms but didn't take the appropriate steps to design their product
in a way that can enable children who experience things
like unwanted sexual advances to be able to do something about them.
Of course they didn't.
Why? Because anything like that is introducing friction in the business model,
and it costs them money.
So they're looking for ways to evade that.
Thanks, Ron, for the great speech and nice reflections.
So when you wrote the book or the speech Reset, we knew about Pegasus.
And since then, we have more chaos in the world and in cyberspace, right?
War in Ukraine, whatever.
So at the Citizen Lab, do you guys see the next Pegasus somewhere in the shadows, rising up?
And do you have any indications of, like,
is it again from Middle East or is it somewhere in Europe?
If you can comment on that.
About the mercenary spyware industry,
you have companies coming from different parts of the world
for various historical reasons, whatever.
There's a lot of concentration of companies
that for one reason or another have come from Israel.
And there's a story behind that
that's easily told and understood.
However, the industry is definitely diversifying,
and we see companies now from other parts of the world. It won't be long before we see,
and we already do see, companies from India, companies from China. So it's not a geographical
or country-level problem. It's a global problem. And that's why we need regulations at that level
to fix that particular problem.
In terms of the lab and what we're most worried about,
I will say personally, I know my team shares this,
I'm very, very concerned about disinformation for hire,
or you might call it the dark PR industry.
So mostly this revolves around litigation, but it also involves government attempts to neutralize opposition and dissent.
They can now go out and hire from a firm a very comprehensive set of services to essentially discredit somebody.
And as a victim of this,
it's almost impossible to defend against.
And it can be done in very ineffective ways,
as it is now.
If you look at the examples of what I'm describing,
and we've been hit with this a lot,
they're almost laughable on the face of it, like, oh, this is ridiculous. No one's going to believe this. But with enough persistent
information that's fed out, disinformation that's pushed out about a person, people start
questioning, like, oh, didn't I hear something about Citizen Lab? Wasn't there some controversy? Wasn't there something inappropriate they were doing? I thought I read that. That alone achieves something. The next level of that will
be something that is far more sophisticated, using deep fakes, artificial intelligence,
to essentially undermine people who are trying to do good or expose wrongdoing. And it's trivial
to go out and buy these services now. I really
worry about what it'll look like five years from now or 10 years from now. And it'll be very
difficult to regulate. Sorry to add more worries for all of you. It's a lot of work for us, though.
It's good. Bad for the world, good for us. James Orbinski, director of the Dahdaleh Institute for Global Health Research at York, and also professor here at the University of Toronto, at the Dalla Lana School of Public Health.
Ron, you paint a picture of a digital revolution.
It really started in the 90s,
and we're still in yet another phase of that revolution.
And it covers everything from the introduction of computers
all the way through to artificial intelligence.
And you also paint a picture, frankly, that is quite dystopic and bleak.
And now if you could just kind of step into that space between the revolution and the
bleak dystopia, the little black box.
One possibility of that black box is, in fact, the realization of this dystopic future that you're alluding to.
Another is a redesign of a potential future.
You talk about the importance of oversight.
You talk about the importance of regulation and so on.
But frankly, it seems egregiously ineffective, late, always post facto, never from a precautionary perspective, and so on.
So here's the question. What do you see inside that black box that can lead us to a redesigned
future? What kind of things do institutions, governments, and individuals need to do now in order to move that black box
toward the right algorithmic output, which is a redesigned, hopeful future?
Yeah, that's a great question. I spent a bit of time at the end of Reset in the final chapter,
at least trying my best to describe what I think are some of the elements of what would be that
pathway. And by the way, the title of the book is meant to evoke that, right? Reset. It's time to
stop, take a pause. Where are we going? And I should say, by the way, Philip Coulter, the producer
here, was the person who suggested the title to me because I was describing the topic. He said, hey, what about Reset? That's great. That captures it. I don't think it's like rocket science.
To me, maybe I'm wrong here, but it comes back to some basic things that most of us recognize
and understand. Just let me give you one example: the importance of civic culture, what it means to be part of civil society, how one conducts oneself in a setting like this. That maybe sounds a bit old-fashioned, maybe even, to some people, part of a system that needs to be done away with. But I think there are elements of that, which thinkers have identified going back to ancient times, that are important ingredients to how we live together.
Just treating each other with respect and dignity and how you conduct a conversation.
Where does that come from? It's not something that comes
innately, because we're very complicated beings, and we have lots of competing instincts, not all
of them good. So it has to come from education. The problem is, if you look at systems of education,
I'm going to simplify a bit, but there's been so much emphasis on training people to feed the machine. It drives me crazy, but you often hear the university being described as a job creation
enterprise. And I get why, you know, young people want to have jobs. I get that. A lot of people
want to justify giving money to the university in order to create jobs. That's fine. But let's not
forget what the university is ultimately supposed to be about. Part of it is what I'm just describing
here. It's indoctrinating people in how to conduct themselves as part of civil society.
In order to do that, you need to be learned in the arts and in literature, in areas that are being decimated as we speak.
This is a very rare thing we're doing here.
Day to day in the university,
you don't get a lot of opportunities to do this sort of thing.
This is a lost art, what we're doing right now.
So I won't go through all the other elements,
but that's just one.
I think you could do something similar
when it comes to this idea of getting back to nature, so to speak, right? Getting away from our devices. There are a lot of people who think, oh, maybe we should address the problems that Ron and others are identifying
by throwing our phones in the ocean and never using technology again. I don't think that's
possible, and I don't even think it's ideal. However, the idea of
spending a bit more time in touch with the natural world on a multitude of levels, I think, is obvious
to me. So how do we do that? How do we encourage people to do that more in their day-to-day life?
You know, those are a couple of pathways that I would suggest.
To get us started, I propose a single, simple, but hopefully potent principle: restraint.
Restraint is primarily defined as, quote, a measure or condition that keeps someone or something under control or within limits.
Secondarily, it also means, quote, self-control, as in unemotional, dispassionate, or moderate behavior.
Both senses of the term point to general qualities that will be essential to preserving
rights and freedoms in our supercharged, hyper-networked world of data.
The last lecture is about restraint and restraints. And you're calling for sort of a
small-r republican concept of restraints on some of these technologies and forms of impunity, uses of authority.
Three years later, what kinds of restraints, I guess, feel most successful or most urgent to you?
Yeah, that's a tough question.
So restraints, and what Pauline was saying there about small-r republicanism, important not to confuse with the Republican Party, definitely not that. There's a tradition within liberal theory, and I would argue at the heart of liberalism really, this idea of checks and balances. We all know of the architecture of the United States, for example, and its founding; this idea of the separation of powers is an example of one very important type of political restraint. At the heart of the liberal philosophy is the idea that you replace a system of rule that's based on either wealth or some kind of religious devotion with a rule-based system, and you specifically design that system with oversight and various checks and balances to prevent the abuse of power and to protect civil liberties and human rights. So at the end of Reset
I was simply advocating for us to remember this is a very important part of the liberal story. We
need to have restraints, and at a time when technology is accelerating and invading our lives all around us, we need to think about how to build an architecture of restraint around this. Since I wrote the book, it's been a bit of a mixed picture around social media.
So you've seen some efforts to restrain the big tech companies. In the European Union, for example, we have the
Digital Services Act. In the United States, there are investigations into monopoly practices
with Google and other tech platforms. Unfortunately, a lot of security agencies still
operate in the shadows, largely without restraint.
And that's a big problem because today you have super empowered intelligence agencies that can do the sorts of things I've described without accompanying checks and balances, or at least not robust enough, in my opinion.
I don't think it's gotten any worse since I wrote Reset. I wouldn't
say it's gotten better either. But that is a good reminder that when we speak about things like
architectures of restraint, we have to remind ourselves that this is an ongoing practice.
It's never going to be settled. We can't take them for granted. We have to constantly remind
ourselves that this is important.
We need to have proper oversight over what we do politically in order to ensure that our way of life survives
in the midst of all of these threats.
And in architecture, a building takes a long time to build.
That's right, and you have to maintain the building.
Thank you so much, Ron.
Appreciate it, Pauline. Thank you.
Thanks very much.
You've been listening to the 2020 Massey Lecturer, Ron Deibert,
speaking with Ideas producer Pauline Holdsworth.
If you'd like to hear more from Ron, you're in luck.
Our next episode revisits his sixth and final lecture, where he explores the kinds of restraints we need to place on government and corporations, and on our own endless appetite for data.
Digital technologies have so deeply embedded themselves into everything that we do,
it is unrealistic to expect we can turn the clock back entirely.
Nor should we.
We need an open and secure means of communicating globally in order to manage our planet and our collective affairs.
It is just that the current design for it,
based around personal data surveillance,
is counterproductive to those aims.
As Buckminster Fuller once said,
we do indeed live on Spaceship Earth,
but we are stuck with a poorly designed operating manual.
This episode is part of a series of conversations
with and about former
Massey lecturers to mark the 60th anniversary of Massey College, our partner in the Masseys.
This episode was produced by Pauline Holdsworth. Thanks to Massey College and former principal
Nathalie Des Rosiers. Technical production, Joe Costa, Philip Coulter, and Danielle Duval.
Our web producer is Lisa Ayuso. The acting senior producer is Lisa Godfrey.
The executive producer of Ideas is Greg Kelly, and I'm Nahlah Ayed. For more CBC Podcasts, go to cbc.ca slash podcasts.