Ideas - Reset: Reclaiming the Internet for Civil Society | Tech Expert Ron Deibert
Episode Date: April 26, 2024
In 2020, CBC Massey lecturer and tech expert Ron Deibert asked us to consider how to mitigate the harms of social media and construct a viable communications ecosystem that supports civil society. We revisit his final Massey lecture, which explores the kinds of restraints we need to place on government and corporations, and on our own endless appetite for data.
Transcript
Hey there, I'm Kathleen Goltar and I have a confession to make. I am a true crime fanatic.
I devour books and films and most of all true crime podcasts. But sometimes I just want to
know more. I want to go deeper. And that's where my podcast Crime Story comes in. Every week I go
behind the scenes with the creators of the best in true crime. I chat with the host of Scamanda, Teacher's Pet, Bone Valley,
the list goes on. For the insider scoop, find Crime Story in your podcast app.
This is a CBC Podcast.
Welcome to Ideas. I'm Nahlah Ayed.
In 2020, Ron Deibert came to the nearly empty CBC building in downtown Toronto to record the Massey Lectures in studio.
The COVID-19 pandemic made our usual multi-city tour impossible.
Ron is the founder and director of Citizen Lab.
It explores global security, information and communications technologies,
and human rights. So he's had a front row seat to online realities that were spiraling out of
control. You've read about high-tech mercenary companies selling powerful cyber warfare services
to dictators who use them to hack into their adversaries' devices and social networks, often with lethal consequences.
First, it was Jamal Khashoggi's inner circle.
Then, allegedly, Jeff Bezos' device.
Maybe I've been hacked too, you wonder to yourself,
suddenly suspicious of that unsolicited text or email with an attachment.
We're all living in a digital ecosystem that often feels impossible to escape.
The world you're connecting to with that device increasingly feels like a major source of personal risk.
But it's also become your lifeline now more than ever.
Ron Deibert's Massey lectures were called Reset, Reclaiming the Internet for Civil Society.
In the language of computers and networking,
the term reset is used widely to refer to a measure
that halts a system and returns it to an initial state.
In his final lecture,
Ron suggests what that reset could look like.
To get us started, I propose a single, simple, but hopefully potent principle.
Restraint.
This episode is part of a special series we've put together to mark the 60th anniversary of Massey College,
one of our partners in the Massey Lectures.
It features the sixth and final installment from Ron Deibert's lectures
called Retreat, Reform, Restraint. Along the way, you'll also hear from other thinkers commenting
on his lecture. This is Lecture 6, Retreat, Reform, Restraint. Part 1. A Fragile Ecosystem.
There is an undeniable gestalt in the air, a dawning recognition that something of our own
making is contributing to a serious kind of social and political sickness. Our very tools
and techniques threaten to wipe us out, says Siva Vaidhyanathan in Antisocial Media. We display
unlimited talents but no mastery. We process infinite data but display no wisdom. It's
remarkable to think it was only a few short decades ago that the internet was heralded as
a wonderful new tool that would enlighten and liberate us, dissolve the boundaries of
time and space, and tie us together more closely into a single global village.
But in the span of a single generation, it has transmogrified into something far more
complex and disturbing, its effects suddenly very ominous. What started out as something so simple as desktop PCs networked together via a common protocol has morphed into an always-on, omnipresent data vacuum cleaning operation undertaken by gigantic corporate platforms with unparalleled abilities to peer inside our minds and habits and subtly shape our choices.
Their operations implicate a planet-wide network of gigantic energy-sucking data farms.
The ecosystem has spawned a bewildering variety of invasive species
that thrive by feeding on the continuously expanding pools of data
that spew forth each millisecond of every day.
App developers, data brokers, location trackers, data fusion companies, artificial intelligence
startups, and private intelligence firms. Accountability is weak and insecurity is
endemic throughout the entire system, creating seemingly endless opportunities
for malevolent exploitation by spies, kleptocrats, dark PR firms, and other bad actors.
It's as if we have sleepwalked into a new machine-based civilization of our own making,
and we are just now waking up to its unforeseen consequences and existential risks.
It's clear that there is a growing consensus that many things are wrong with our social media habits.
Symptoms of this malaise are seemingly everywhere, fairly easy to identify, and increasingly enumerated by scientific studies.
But what to do about them is less obvious. There is nowhere near a consensus
when it comes to a cure. This lack of clarity around solutions is certainly understandable.
The challenges thrown up by social media, surveillance capitalism, and near total state
surveillance have arisen so swiftly that we have barely had time to understand how they work,
let alone fix them. But of course, it's not all just ennui and fatalistic acquiescence in the
face of social media's harms. Proposals for mitigation or alternatives to social media,
as currently constituted, are frequently raised and plentiful. Some are quite
interesting and worth considering. Others may be flawed in various ways or self-serving.
Many that are worthwhile are incomplete. Many of them feel like fragments of a missing whole
of which they might be a part. Take the family of recommendations to mitigate social media harm that I call retreat.
These are the solutions that advocate for some variation of going off the grid,
either by throwing out our devices and applications completely and going back to a time before social media,
or, in slightly more reasonable form, simply taking a break from them once in a while.
Proposals such as these can be found in pleas to unplug, disconnect, or perform periodic cleanses,
digital detoxification as it's widely described.
The concept of a digital retreat is appealing on many levels.
There is a simplicity to it that makes it alluring.
It is true that one thing we need to recover is our connection to the natural world.
Slowing down is a good idea too.
But there's a major problem so obvious that it may be easy to overlook.
Sure, it's fine if a few isolated communities completely detach, unplug, and retreat, and
there's no doubt we could all use a little digital detox once in a while.
Meditation is great too, and best done without an app.
But can it scale?
What if everyone quite literally unplugged?
How would we then manage ourselves, our social relationships,
our problems, and our politics? Detachment and retreat also ignore, or at least slight,
the many positive uses of digital technologies, social media included. In spite of disinformation
and overreaching surveillance, social media have proven highly useful for addressing many
problems. Even prior to the COVID-19 pandemic, digital technologies were used extensively
to monitor the environment, to share information in ways that the original designers intended
– think of Wikipedia – to mobilize social movements and hold bad actors to account.
Furthermore, it is frankly impossible to live in complete detachment from social media today.
Even the self-isolation and social distancing in response to the COVID emergency did not dissolve
digital connections among people. In fact, it deepened our reliance on them. Even if you were
to try and completely escape from social media, unplug your router, throw away all of your devices,
and never look at social media again, you'd still be subject to surveillance.
Facebook and other social media have shadow profiles of people who do not even use their services.
CCTV cameras are everywhere. Go ahead, live the dream, and move to rural Montana. You'd still be
watched by drones, planes, and satellites. No matter where you go, you will be counted.
Digital technologies have so deeply embedded themselves into everything that we do,
it is unrealistic to expect we can turn the clock back entirely. Nor should we. We need an open and
secure means of communicating globally in order to manage our planet and our collective affairs.
It is just that the current design for it, based around personal data surveillance, is counterproductive to those aims.
Outright rejection of social media is thus both undesirable and futile.
As Buckminster Fuller once said,
We do indeed live in spaceship Earth, but we are stuck with a poorly designed operating manual.
Hi, Tamsin Shaw here.
I like Ron's metaphor of a poorly designed operating manual, because if you think about the hybrid economy of Silicon Valley
and what it set out to accomplish,
a huge amount of the
funding came from national security agencies, from their venture capital wings, in order to accomplish
national security goals. But of course, the companies wanted to commercialize those products
in order to make a profit. So those are the two basic functions of that machine,
capitalism and national security. And I think now there's no technological reason that we can't
create social media platforms that don't just serve those two ends. But the problem is,
where does the incentive come from? It's not going to be from the people who own those tech platforms
and make billions of dollars from them. It's not going to be from politicians who now rely on them
to get elected. And it's not going to come from people whose only real form of social and community
organization is actually on those platforms. And I think this is symptomatic of something that's true of humans in general,
which is that technological problems are the easy part.
It's the very big problems like climate change or global inequality
that are transnational and that require political solutions
that we have problems with.
Marxists used to imagine that there were these contradictions of capitalism and that eventually they would be overdetermined and that
would lead to us creating something better. I think the worry now is that maybe they don't
lead to something better overthrowing capitalism. Maybe they just lead to
our self-destruction. And I think that's the problem that we have to confront now. That's what
we'll need to address in the future. It's Misha Glenny here. I've felt for a long time that what
we face now are what I call the four horsemen of the modern apocalypse. One of them we're going through at the
moment, pandemic, and the other three are climate change, which is the ultimate threat, weapons of
mass destruction, which over the past 20 years have been proliferating quite significantly,
and the regulatory systems for that are deteriorating. But the fourth one is our over-dependency on networked computer technologies.
And I think this is a really, really serious danger that we face.
And so for me, if we do not address these issues
and come up with a new system of managing network technologies,
then there is little doubt in my mind that they will destroy us.
Part 2. Reform, Reset
Then there are the proposals that advocate for some variation of reform, that is, adjustments to one or another element of social media's business practices.
Reform proposals range along a spectrum from minor to major adjustments and lesser to greater degrees of formal government intervention.
The intent is not to dismantle social media, but to fine-tune them
instead. For example, corporate social responsibility, ethical design, and other such
initiatives typically involve the least intrusive measures and entail only minor fixes to superficial
elements of social media. These initiatives typically advocate for norms rather
than laws, persuasion rather than coercion. The goal is to have business executives acknowledge
certain principles and govern their business practices accordingly, but without any kind of
specific enforcement mechanism to hold them to their promises. Regardless of any particular
CEO's sincerity, however, there is a hard limit to reforms associated with this type of
corporate self-governance. However much the promises made by the companies to better
protect privacy or police their networks may be genuine, the effects will be questionable as long as the core business
imperative to collect it all, all the time, remains unchanged. As long as social media are propelled
forward under the regime of surveillance capitalism, pledges to do better, to protect privacy, will
remain little more than window dressing, a coat of paint to make their platforms more appealing
and ultimately draw more consumers into their fold.
Among the partial or fragmented solutions are the technological fixes.
We just need a new app to help us correct the errors
and false information circulating on all the other apps.
Many believe that governments need to enact strong data protection regimes with independent
regulators who have the power and authority to punish social media platforms that violate the
rules of those regimes. It is common to hear calls for more scrutiny of the machine-based
algorithms companies use to sort their users, what's known
as algorithmic accountability. Proposals have been made to legislate greater transparency in
the social media advertising space, particularly around political advertising. Some believe we
should treat social media as publishers or regulate them in the same way we regulate large utilities
like electricity and water. Others believe they should be broken up using antitrust tools instead.
One potentially helpful way to think of these various proposals is as if they are ingredients
of a long-lost recipe. We know the basics from memory passed down through the generations,
but without the original formula, we hesitate and second-guess ourselves. Do I finish the pasta in
the sauce or pour the sauce over the cooked pasta? Do I roast the garlic or mince it?
The term reset is most often associated with computers and refers to the process of shutting down processes or
systems that are hanging. It can also refer to starting over completely fresh with factory
settings. A reset provides an opportunity to take stock of the big picture. It gives us breathing
room to evaluate what's working and what isn't and make adjustments moving forward
accordingly. Most importantly, it provides us with space and time to start over from first principles
and a solid foundation and dispense with those practices that have become a hindrance to larger
aims. There are several compelling reasons to have a solid framework to guide us after a reset.
First, having an underlying bedrock of principles to which we can continuously refer helps steer our strategies and inform our decisions, especially as novel problems arise.
Technological innovation has been undertaken mostly in the absence of any such foundation, other than a simple imperative to collect more data. A well-articulated set of principles,
particularly one that links technologies to wider political ideals, can help remind us that
political principles should take priority over technology, and technology should
be designed and developed to further our political aims, rather than work against or be insulated
from them. It can also help us understand the relationship between reform proposals
that otherwise may seem disparate or unrelated. It can help us evaluate and prioritize, see the larger whole
of which the various fragments are a part. Second, having a principled foundation can anchor our
approach in a rich historical tradition and help us feel connected to well-tested and long-established
wisdom and practical experiments on analogous challenges that
societies have faced in the past. Our social media universe feels uniquely novel in so many ways
that it may blind us to the fact that societies have experienced technological upheavals and
large-scale societal challenges before. Human societies have had to adjust and adapt throughout their history
in the face of new material circumstances, much like we are experiencing today. We can learn from
what's come before, and from the collected wisdom of those who have experienced and reflected on it.
Third, such a foundation helps combat fatigue, pessimism, and defeatism among critics of social
media and surveillance capitalism by showing there are viable and robust alternatives.
If we demonstrate the common roots of numerous disparate efforts to detach,
reform, and regulate social media, we show that everyone's efforts are weaving something larger than their
own separate struggles. This suggests that the whole is larger than the sum of its parts,
and that there are alliances to be forged among like-minded advocates, policymakers,
and researchers, particularly among civil society in different jurisdictions worldwide.
It can help chart a path towards an alternative agenda on which groups can work together more confidently.
This is Astra, and there's a lot about this idea that I agree with.
There are a lot of powerful forces I would like to be restrained.
But I suppose there's something about it that frames our power as inherently negative,
that all we can do, the best we can hope for, is to have these powerful actors, be they corporations or governments, do less harm. So it's a kind of negative framework, right? Be less invasive, be less destructive, be less all-powerful. And I suppose I also would want to couple that with a framework that thinks about what power we have and what other things we could do, including with technology.
So I guess in my affirmative vision, I want to think about how to redesign our technology and maybe into three buckets. So one is what technology do we want to socialize or make public? So maybe Facebook should be a public utility. This is something
that Ron Deibert mentions in passing in the book. Yeah. And then what things should be abolished? I
mean, for me, there are certain types of invasive data collection that just shouldn't exist.
They're organizing campaigns around the United States and in some cities they've been successful
to just ban facial recognition as a tool that the police can use. I think that's the right thing. Instead of trying to regulate, there are some things that just need to be left alone, some data that does not need to be collected. So there's an analogy there to the environmental justice movement. Some fossil fuels need to stay in the ground. Some private data just needs to stay off the cloud.
We should go beyond restraint to the framework of abolition, of saying,
well, actually, just because it's technically possible doesn't mean it's morally desirable.
On Ideas, you're listening to an encore presentation of Retreat, Reform, Restraint, the final installment of Ron Deibert's 2020
Massey Lectures. You can find Ideas wherever you get your podcasts, and on CBC Radio 1 in Canada,
across North America, on US Public Radio, and on Sirius XM, in Australia, on ABC Radio National,
and around the world at cbc.ca slash ideas. I'm Nahlah Ayed.
My name is Graham Isidore. I have a progressive eye disease called keratoconus,
and losing my vision has been hard, but explaining it to other people has been harder.
Lately, I've been trying to talk about it. Short Sighted is an attempt to explain what vision loss feels like by exploring how it sounds.
By sharing my story, we get into all the things you don't see about hidden disabilities.
Short Sighted, from CBC's Personally, available now.
This episode is part of a series marking the 60th anniversary of Massey College, one of our partners in the Massey Lectures.
It features the sixth and final lecture of Ron Deibert's series, Reset, Reclaiming the Internet for Civil Society.
In this lecture, he explores the kinds of restraints we need to place on government and corporations, and on our own endless appetite for data.
Part 3. Restraint. To get us started, I propose a single, simple, but hopefully potent principle. Restraint.
Restraint is primarily defined as, quote,
a measure or condition that keeps someone or something under control or within limits.
Secondarily, it also means, quote, self-control, as in unemotional, dispassionate, or moderate behavior. Both senses of the term point to general qualities that will be essential
to preserving rights and freedoms in our supercharged, hyper-networked world of data.
We need to restrain what governments and corporations do with all of the extraordinarily
powerful tools of surveillance that are now in their hands. We need to restrain what they do with all of the
data about us and our behaviors. Restraints will be essential to ensure the security of the broader
information and communication space in which we live, particularly restraints on bad actors
exploiting us for despotic, corrupt, or criminal ends, or governments exploiting it for their narrow national security
aims. We'll need personal restraints too, restraints on our endless appetite for data,
restraints on our emotions and anger as we engage online in the absence of the physical cues
that normally help contain them. We will need restraints on each other, mutual restraints
that apply to individuals, organizations, and even sovereign states. If there is one single mantra,
one simple concept that should become our slogan and guide us as we chart our path forward,
I believe it should be restraint. While most everyone is familiar with
the concept of restraint, what may be less familiar to many is that this seemingly simple
term is derived from and is essential to a long tradition of theorizing about political liberty
and security going back centuries. It is most intimately connected to that broad family of political
thought that for most of us is so entrenched in our habits and dispositions, it is more like an
instinct than a self-conscious philosophy. I'm talking, of course, about liberalism.
Broadly defined, liberalism is a tradition that supports individual rights, civil liberties,
and political reform that pushes societies in the direction of individual freedom,
democracy, and social equality. Political theorists will be quick to point out that
liberalism is not a single theory, but a large family of ideas and prescriptions for how to manage societies that goes back
hundreds of years. Most of us can rhyme off the key features of liberalism, so ingrained are they
into our collective approach to politics. Free and fair elections, constitutions, limited government,
the rule of law, separation of powers, pluralism, social justice, and protection
for human rights. There are many tangled threads that weave their way through liberalism.
There are also competing schools of thought within its large and diverse tent. Those who
consider themselves adherents range from libertarians and free market fundamentalists on one end of the spectrum to democratic socialists on the other.
In spite of competing schools, however, liberalism of all stripes shares a fundamental principle:
the belief that in order to preserve and maximize freedom while countering insecurity and fear, we must build legally binding restraints on those we entrust to discharge authority.
Restraints on the exercise of political power to prevent abuse. Within the liberal tent,
the specific school of thought that is most associated with this principle of restraint is known as republicanism.
In fact, the idea of applying restraint as a design principle for political systems is
one of the most venerable in political theorizing, with roots reaching all the way back to ancient
Greece.
Although republicanism has something to say about many principles,
at its heart it is about preventing the centralization and thus abuse of power.
For republicans, unchecked concentrations of power threaten liberty and security because they
are apt to be abused, and so checks and balances, a widely known phrase that comes from republican thought, must be institutionalized to distribute power pluralistically and keep it that way.
The republican opposition to concentration of power rests on assumptions of human frailty.
Humans tend to be both self-interested and prone to lapses in judgment. When opportunities
present themselves, they are tempted by power, which can in turn bring about corruption and
other abuses. As Montesquieu famously observed, every man invested with power is apt to abuse it.
Republican theorists were acutely aware of the accumulation of unchecked and oppressive
power in the hands of government as a threat to individual liberty and security, and so they
devised an elaborate system of power restraint devices to thwart it. These devices are a form
of friction introduced into political processes to make the exercise of authority less efficient.
Strange as it may sound, at a time of near-total surveillance potential in the hands of government
agencies, we need to challenge ourselves to think of ways to artificially reduce the efficiency of
our government's security agencies. The republican attention to material factors tells us why.
Oceans, mountains, and other rugged terrain, as well as inhospitable climates and even endemic
diseases, created obstacles for invading armies, slowing them down, impeding conquest and protecting long-term control of populations, a kind of accidental
friction by circumstance. All things being equal, the further away people are from the center of
control, or the more natural barriers shelter them, the less efficient the exercise of that
control tends to be. Hence, in times prior to social media and the internet, activists
could flee their home countries out of fear of persecution and feel safe thousands of miles away
from the reach of the power elites they left behind. However, with social media and other
digital technologies, these natural barriers have been almost entirely dissolved.
We now live in something approximating a friction-free environment
in which outside forces, be they companies or governments,
can pry into our most intimate details, even when we're behind closed doors.
Thanks to new technologies, all of us can be tracked to a degree and in a fashion that is
both unprecedented in human history and nearly comprehensive in its potential totality. When our
fridges, baby monitors, video conferencing facilities, smart TVs, and even brains are all
networked to the outside world, natural restraints that we once took for
granted no longer serve the same function. We now face an entirely new challenge from the
material context, thanks to the changing nature of technology. Implanted technologies have the
potential to pinpoint details even down to a biological level with
a precision that borders on precognition. This great leap forward in remote control
raises the prospect of severe abuse of power and near-totalitarian control. The application
of restraint measures to the design and functioning of both private and public sectors
will thus be critical to preserving liberty and security.
This is Daniel Deudney. One way to think about why restraint in this way is so important is to just
recognize the extent to which civilization is a series of restraints
that have over time emerged to solve various problems that have arisen from human exploitations
of technology. We can think of it this way, that over the longer term, there's a cornucopia of increasingly potent double-edged technological
swords: technological advances, which are coming at ever-increasing rates when one looks at the
longer horizon of human history. You know, there's been this last century or so, which has just been
explosive with regard to technological capabilities.
And in all cases, we're basically getting human enablement.
Science-based technology is enabling humans to do more things.
Many of the things that it enables are things we want, but many of the things that it enables
are things we don't want. And our capacity
to continue to advance and to continue to make technology serve us rather than technology as a
source of disasters and oppressions depends upon how we regulate and restrain the technologies. In every aspect of our life,
we've got restraints on technology and we take them for granted. They're like
baked in. I mean, think about the automobile. The first automobiles did not
have mufflers, and so urban areas were quickly overwhelmed by enormous
amounts of noise.
And so citizen groups got together and demanded regulation.
And of course, the nascent automobile industry said, oh, you know, this is an impingement
on our freedom.
But over time, the automobile manufacturers started putting mufflers on automobiles in
the factory. And the problem just went away. And no one thinks
about it anymore. We don't have to organize ourselves to control urban automobile noise.
So when we say technological progress, what we mean is not just that technology can do stuff we want, it also means that we have the ability to make
sure the technology doesn't do stuff we don't want. Part 4. Mechanisms for Change.
The first place to start is by reviewing the type and effectiveness of existing restraint mechanisms around governments.
While companies can abuse power too and collect a lot of sensitive, fine-grained, and highly revealing data about us,
which they can in turn share with governments, only a government can take away a person's liberty by force.
The security arms of the state, the police, armed forces, and other security agencies have a monopoly on violence, one of the definitions of sovereign statehood.
They have lethal means at their disposal, can arrest people and lock them up, and in some
jurisdictions can even end their lives through capital punishment.
In response to emergencies, governments can also take all sorts of exceptional measures that
infringe on liberties, including declaring martial law or simply suspending constitutionally
protected rights that we take for granted, as we discovered with the COVID pandemic.
The first task of our reset should be to evaluate the effectiveness of the restraint mechanisms we have inherited. Do we need to
supplement them with new resources, capabilities, and authorities? One simple rule of thumb that may
help guide us is as follows. Restraint should increase proportionately
to the intrusiveness of the practices in question. The more invasive a technology is,
the more likely it lends itself to abuse of power, and so the stronger and more elaborate
the restraint should be. Consider location tracking via cellular and telecommunications data.
Most everyone carries around a network device with them all the time that pings cell towers
and local telco networks on a continuous basis and comes outfitted as standard with GPS and Bluetooth
beacons. Most all of us have dozens of apps that routinely grab location history and data too,
and use them primarily for advertising purposes. Prior to COVID-19, these data could be accessed
by law enforcement, military, and intelligence agencies, but under widely different conditions.
In some countries, certain of those agencies might require a warrant or a production
order, while in others they might simply walk into the headquarters of telecommunications
companies and demand them. In a world where we reflexively look to big tech for solutions,
it's not surprising that Google and Apple have teamed up to develop a protocol for anonymized
contact tracing through smartphone apps. Like others who have already weighed in on the issue, I believe that however
much these prove to be useful in public health emergencies, the safeguards around them must be
exceptionally strong too. Some basic restraint mechanisms should include strict limits on data retention,
clear limitations on use, and restrictions on access to ensure that the data are not
illegitimately redeployed for other reasons, like catching chicken wing thieves and jaywalkers,
or monitoring human rights defenders. Similarly, strong restraints should be applied to the use
of commercial spyware and hacking tools by government security agencies, which are among
the most intrusive and, as we at Citizen Lab have demonstrated in our research, highly prone to
abuse. States purchasing spyware are at liberty to abuse it with limited or no transparency or regulation.
Companies that manufacture and sell it have unbridled freedom to rake in revenues by the tens of millions, largely without fear of criminal liability or concern for how their technology impacts human rights. The net result? Harassment, blackmail, and even murder of
countless innocent civilians worldwide. Abuses and built-in discrimination around the use of
some of these technologies today, in policing, immigration, and criminal justice practices,
are already well documented. The prospects for even greater harms down the
road are impossibly large and daunting to contemplate.
Applying restraints to what governments can do is only part of the solution. We live in
an age in which gigantic corporations, and especially technology giants, dominate the
social and political landscape.
The powers of unbridled surveillance capitalism are truly awesome, and when combined with state authority, are potentially totalitarian. Shaping our desires to persuade us to consume
this or that product is disturbing enough, but the prospect of corporations and states colluding
in broader population control is downright dystopian. Just ask a Tibetan or Uyghur.
In addition to the risks of abuse of power related to fine-grain remote control technologies,
there is another reason to impose restraints on surveillance capitalism.
The engine at the heart of the business model, which privileges sensational, extreme,
and emotional content, amplifies our baser instincts, creates irresistible opportunities for malfeasance, and helps pollute the public sphere. It also continuously accelerates our consumption of data. More is
better, faster too. But endlessly accelerating consumption of data on the part of both consumers
and firms mining our behavior taxes the planet's finite resources, draws volumes of fossil fuel
powered energy, and contributes to one of humanity's most
pressing existential risks. It certainly doesn't solve it. Introducing friction and other restraints
on surveillance capitalism can help improve the quality of our public discourse while tempering
the insatiable hunger for more data, faster networks, and disposable gadgets. One thing is for sure,
business as usual can no longer be tolerated. Decisions that have major effects on the public's
sharing and consumption of information should not happen in the shadows or behind a proprietary
algorithm. To be sure, there are balances to be struck around free expression, content moderation, and abuse prevention.
There are very real risks that mandatory or poorly constructed measures could be perverted as an instrument of authoritarian control, abused by despots and autocrats to enforce their rule.
The key will be to make sure that social media platforms manage content in ways that are transparent, limited, proportional, and in compliance with internationally recognized
human rights. These standards may not be possible in all jurisdictions right now,
but they should be an imperative for those that consider themselves liberal democracies.
This is Meredith Whittaker.
I think this is a great place to start examining how we might push back on these systems.
I am also someone who believes we need to examine the roots of liberalism with critique,
understand that small r republicanism, at least in the US context where I'm from,
was concomitant with settler colonialism and slavery and some of these racial projects that
are still with us today, and be very discerning about which of those principles we carry over
and which of those we reinvent.
I think there is also an omnipresent question here about who the we is, who gets to determine
what a limitation is, what transparency is, how to interpret a human rights principle.
And this is something that I think points to the requirement for robust social movements and small d democracy that can continue
to foment and push back against the urge for, again, that small council of elders with a view
from nowhere making these determinations on behalf
of everyone else. It's John Naughton here. I think that Ron is right that we need to have a comprehensive
approach to this. And one of the great things about the lectures, I think, is that they constitute one
of the first attempts I've seen to try and convey the comprehensiveness of the problem
that we face as a whole. But if we go back to social media, what they have done is they
have transformed humanity's media ecosystem. Now the word media is interesting. It's the
plural of medium, and medium has actually two meanings. One is the conventional meaning, that's to
say, a medium is a communication channel. But to a biologist, a medium is something
quite different. It's a mixture of substances in a Petri dish in which organisms grow. And
if you change the medium as a biologist, then different kinds of organisms grow. And our information ecosystem,
you could think of it as the medium in which human culture grows. We've changed that medium.
We've changed the medium in our global Petri dish. And therefore, we shouldn't be surprised
that some strange organisms are now thriving in it. Organisms that in the end might be really toxic, for example, for democracy.
So Ron is right. We need to think on that kind of global scale.
Part 5. The Road Forward. Where global governance in general goes, so too does governance of the internet and social media.
Discussions to develop norms of appropriate state behavior in cyberspace, while laudable on one
level, seem entirely theoretical at the current time against the practical reality of massive
investments by states in offensive hacking, superpower policing, mass surveillance,
and influence operations. We should not expect international institutions to be anything other
than reflections of self-interested and power-seeking sovereign states as long as
restraints on the abuse of power are not deeply entrenched in domestic spheres. Only once restraints, divisions,
and separations are established in individual republics can they then be extended internationally,
first to other like-minded states and then gradually to others. Liberal democratic systems
of government can ensure that social media platforms and other technology giants are
subjected to common standards of governance so that they cannot play jurisdictions against each
other. Only with such a united front can they develop a truly robust response to the state-centric
model of social media and internet governance being propagated by authoritarian countries like
Russia and China. The painful truths outlined in these talks paint a very bleak picture.
They also present a troubling forecast for the future of the human condition.
It seems undeniable now that the disturbing worldwide descent into neo-fascism, tribal
politics, unbridled kleptocracy, along with the accompanying spread of ignorance and prejudice
we have witnessed in recent years, is at least in part because the social media environment,
as presently constituted under the regime of surveillance capitalism, created conditions
that allowed such practices to thrive and flourish. Personal data surveillance and
authoritarian state controls present a perfect fit: seemingly endless lucrative business opportunities
that undermine public accountability and facilitate despotic rule.
These negative externalities may be amplified by the surge in demand for social media during the COVID pandemic, the enhanced power of the platforms that went along with it, and the
unprecedented emergency measures that tapped into those platforms' surveillance potential.
On top of that, our insatiable lust for data and disposable devices
is silently taxing resources, sucking up vast amounts of energy and thus contributing to,
rather than helping to mitigate, the climate crisis. While the COVID pandemic has given some
reprieve to carbon dioxide emissions, thanks to a short reduction in airline and other fossil fuel
powered transportation, that reprieve will eventually pass. However, the sudden embrace
of digital network technologies will not, and may in fact deepen. Combined, both real and virtual
consumption could increasingly strain natural resources,
draw from dirty energy sources, drive up emissions, and contribute to waste.
As climate scientists warn, continuing down that path will lead to collective ruin.
Our precious apps will mean little when humans are either wiped out altogether or consigned to a Hobbesian state of nature,
dispersed in small tribes struggling against each other for the scarce resources needed for survival.
It is not unrealistic to imagine a time when the internet and all of its associated infrastructure
will be reduced to rusting artifacts, submersed in rising oceans, or covered over by tangled weeds,
if consumption practices continue apace. That would be one way to mitigate social media's
negative consequences, but obviously not worth the price.
Much of what I've been talking about in this series is sadly not surprising. These truths are widely recognized
by a growing community of experts. But the time has now come to move beyond diagnosis and start
the hard work on solutions. We must squarely and comprehensively address the intertwined pathologies
of social media and surveillance capitalism, starting with that
device you hold in your hand. Fortunately, we have a recipe, a set of principles that can help
guide us on this task. We do not even need to invent something new. Humans have faced enormous
political challenges thrown up by new material situations before, and there is a long tradition of practical theorizing
that can be adapted to our own unique circumstances. It is time to reset, to start over from
first principles, and to work towards the construction and stewardship of a communications
ecosystem that serves rather than diminishes human well-being. We need to realistically
acknowledge that there are major hurdles to overcome and deeply entrenched and powerful
interests that will work in opposition and not always by the rules. A comprehensive strategy
of long-term reform is therefore required, extending from the personal to the political, from the local to the
global. We must begin now with practical and manageable small steps simultaneously undertaken
by many of us spread across the planet. The COVID emergency reminds us of our shared fate.
We have a once-in-a-lifetime opportunity to reset. We can reclaim the internet
for civil society. The principle of restraint should be our guide. You've been listening to a special encore selection
from Ron Deibert's 2020 Massey Lectures
called Reset, Reclaiming the Internet for Civil Society.
Along the way, you also heard from Astra Taylor,
Tamsin Shaw, Meredith Whittaker,
Misha Glenny, Daniel Deudney, and John Naughton.
That series was produced for Ideas by Philip Coulter.
This episode was produced by Pauline Holdsworth
with production assistance
from Annie Bender. It's part of a series of conversations and archive lectures we've put
together to mark the 60th anniversary of Massey College, one of our partners in the Massey
Lectures. Thanks to Massey College and former principal Nathalie Des Rosiers.
Ideas is a podcast and a broadcast, so if you liked the episode you just heard, check out our vast archive at cbc.ca slash ideas, where you can find more than 300 of our past episodes.
Technical production, Danielle Duval.
Our web producer is Lisa Ayuso.
Acting senior producer, Lisa Godfrey.
Greg Kelly is the executive producer of Ideas.
And I'm Nahlah Ayed. For more CBC Podcasts, go to cbc.ca slash podcasts.