The Agenda with Steve Paikin (Audio) - Is Data Collection Undermining Human Rights?
Episode Date: May 21, 2025. In "We, the Data: Human Rights in the Digital Age," author Wendy H. Wong makes the case that the collection and tracking of our data by Big Tech comes at a cost to our humanity. She's a professor of political science and principal's research chair at the University of British Columbia, and her book won the 2024 Balsillie Prize for Public Policy. She joins Steve Paikin to discuss the link between data and human rights.
Transcript
Hey, OnPoly people, it's John Michael McGrath.
Join Steve Paikin and me for a special live taping of the OnPoly podcast at the Isabel
Bader Theatre in Toronto on May 28th at 6.30 p.m.
Visit onpoly-live.eventbrite.ca for tickets.
In the book We, the Data: Human Rights in the Digital Age, author Wendy H.
Wong makes the case that the collection and tracking of our data by big tech comes
at a cost to our humanity.
She's a professor of political science
and principal's research chair at the University
of British Columbia and her book recently won
the 2024 Balsillie Prize for Public Policy.
And she joins us on the line now from Kelowna,
British Columbia. And Wendy, I guess,
congratulations are in order for starters.
So well done and thanks for joining us.
Thank you.
I want to ask you off the top here, you keep using a term referring to data, which I haven't heard before.
And so I needed a little explanation of what it means.
You say data are sticky.
What does that mean?
I really mean like gum at the bottom of your shoe, you know how it gets stuck and really
crammed in there because we often don't know that the gum has been stepped on until it's too late.
So this is kind of how I think about data about people as well, which is that we're living our
lives, we're doing our thing with our devices and trying to coordinate our activities. And this whole
time we are creating data, and those data are about things that are everyday.
They're not necessarily extraordinary.
Those data can endure for a long time.
They're used across all kinds of different
other analytical databases.
And it's a process that we are involved in
but we're not the only ones involved, right?
There are these companies most of the time
which are actually interested in collecting those data.
So I talk about the stickiness also as co-creation,
So I talk about the stickiness also as co-creation because it's not just about what we individually
choose to do.
Now you said data and I said data.
Do you know which one is right?
I think we're both right.
Well let's just pack up and go home right now then.
I think we solved it.
We fixed it.
No, we do have more things to discuss here.
Datafication, datification, however you want to pronounce it,
that's another term you use.
What does that mean?
It really just means how our lives are really becoming
very much both analog and digital.
So it's really hard to separate the difference
between what you're doing physically
versus what's being recorded or analyzed about you digitally.
And so it's sort of thinking about life as both digital and physical.
The other way to really define it as others have done is to say that a lot of human behaviors
have become digital data, right, stored in a database somewhere for analysis by a computer
or human being.
Let's do an excerpt from the book here.
And this is right off the top where you talk about Ring,
this thing where you can look into your phone
and see a camera that takes a picture
of your front door or whatever.
And here we go.
Sheldon, you wanna bring this graphic up here?
The video doorbell, it seemed,
was the answer to the new problems
created by online commerce
and the solution to answering the doorbell
when multiple packages arrive throughout the day.
But video doorbells are not the only digital devices made for our convenience or increased productivity.
There are also smartphones, smart thermostats, smart TVs, smart speakers, smart refrigerators, you name it.
These smart devices form the ecosystem of the Internet of Things
and have fundamentally changed many aspects of our
lives and perhaps even who we are.
Okay, I got to ask you about that last line.
They've changed who we are.
Meaning what?
Meaning, you know, ever since humans figured out how to write things down, we've
kept records about people and events for a very long time.
So this is not a new technology.
But what digital devices,
so the ones that you just read out of the excerpt,
those are all kinds of things that are in our everyday
that are tracking our behaviors,
that are monitoring what we're doing
and also operating as a result of the sort of analysis
that is provided by the data that are created.
So this way that we can track people's activities,
we can create data about what we're doing
and what we're thinking on a second by second basis,
this is new.
This is fundamentally a big shift
from how we used to live our lives,
not just comparing centuries ago,
but I'm thinking even 20 years ago
before the advent of what's called big data, right?
Where we had all this information
from all these new sensors we were putting into our devices, whether that's phones, cameras, refrigerators,
cars, you name it, any sort of digital technology that's out there has really become a data
collection device as well.
Well, you go broader.
You say it's actually come under the category of human rights now, not necessarily the kind
of human rights we think of as in, you know,
the right not to suffer harm or the right not to be discriminated against, but a different
kind of human rights.
What do you mean by that?
A lot of the conversation, I think, around AI and especially around data have really
revolved around this idea of privacy, which is one of many human rights, and it's an important
human right for sure.
But I think when we limit our conversation to particular human rights, like, you know,
like privacy or like freedom of expression, what we're doing is really limiting the way
that we think about how the practice of data collection and datafication has
really changed the way we live.
And so I sort of zoom back.
I like to think about,
well, what it is that human rights were created for.
And so in the 1940s, after World War II,
the world came up with this list of rights,
the Universal Declaration of Human Rights,
and the values they were based on
were the values of autonomy, dignity, equality,
and what they called brotherhood at the time,
which I think in 2025 we'd call
something like community, right?
So what it means to live as an individual
among other human beings while protecting
or trying to balance these various values
that really underlie the things that we care about,
like privacy, like freedom of expression,
like freedom from discrimination, as you said.
But are we vigilant enough,
are we energized enough as a community
to care about this or perhaps more likely
too complacent to worry about it
because we just love the convenience and fun of it all?
You know, I think everyone's different.
But one of the things I think is really difficult
is when you say, oh, this is a privacy issue, let's say,
AI is a privacy issue, people often think,
well, I'm not doing anything wrong,
so people can have the data, right?
And I think that really is a conversation stopper.
I think it's more that the ideas behind autonomy, dignity,
equality, and community,
the reason why I focus on these big ideas,
is because we're all affected by some change in our lives. Like I often talk to people who are really worried
about whether they have choice or not. Can I actually opt out of datafication?
The short answer is not really and not easily because it's not just about
what you're doing and what you're consenting to but what others are
consenting to. What companies and governments are doing as a matter of business
or as a matter of government.
They're using these digital systems as well.
So it's really hard, you know, to exercise one's autonomy, for example, in the digital
age.
I'm going to come at this again, though, because, you know, I've interviewed a lot of people
on this program, experts such as yourself, and, you know, I get the sense they're trying
to energize the public to care a lot about this because
they are concerned about the infringements on our privacy and the loss of human rights
as a result of what's going on right now.
And yet the public just doesn't seem to care.
They love the fun and the convenience and are prepared to give up their privacy to do
so.
Are you frustrated by that?
I am not frustrated per se.
I think people have not been told about this trade-off, right?
I think sometimes there's a way that people say, oh, well, we've traded off our privacy for convenience.
And I think if we'd actually known that was happening, you know, looking back a decade or a decade and a half ago,
I think people would have responded differently.
I think now it's a little late in the evolution of technological devices
that are very much embedded in our lives.
Again, this is why datafication is such a prominent theme in society.
But I also think that it's more important now to try to come up with rules
that actually make sense for people,
as opposed to lining up behind, you know, existing lines of argument that I think have been very important,
but have reached resistance. I mean, there are people who simply think that privacy is dead.
I actually don't think that, but I think privacy has changed. I think the importance of privacy is still there.
We still want to preserve a space for our own thoughts and our own abilities to make
decisions freely.
I think that's very important, but then we should be talking about it from my view, from
a point of view of autonomy, of equality.
Does everyone get the same choices?
Are we making choices for ourselves or a broader space, a broader community of people?
You almost seem to be hinting at a kind of a class war here.
Am I reading that right?
No, no, no, no.
I mean, I don't think of it as a class war.
I definitely think that AI, as many people have found, magnified differences between
people.
But let me just give you a current example.
So, you know, it seems like every time I talk about this, there's something new that happens
that I can really talk through.
So, 23andMe declared bankruptcy about a month or a month and a half ago.
And a lot of people, rightfully so, were really worried about being in that
database now that the company is going under and who knows what could happen to
those data once the final sale or breakup of that company
happens.
But that's a case I talk about a lot in the book, prior to this announcement of bankruptcy, precisely because if you've entered
that database, if you've done the spit test, you've not only entered it for yourself, but for
everyone related to you, because DNA are inherently shared. They are, you know, a co-created form of
data that we maybe don't think about in those words.
But certainly everybody understands family resemblances. And I know that most people
entered that data set maybe not to find out about diseases per se or genetic disorders,
but to find out who they might be related to and where their genetic heritage comes from. That's
fun. But it has also now come with this sort of conundrum,
this price that we probably didn't think about
before we chose to do the spit test.
That is a concerning example,
but let me give you the other side of the coin.
And again, we'll go back to Ring,
the example we talked about off the top,
where you can look at your cell phone
and there's a camera at your front door
and you can presumably feel a little safer
about who's on your property
or who's leaving what at your front door in exchange for giving up some privacy rights.
Now that seems to be a trade that a lot of people are prepared to make.
You okay with that?
It's important to point out whose privacy we're talking about or whose data, right?
So yes, if you are the Ring doorbell owner, you have, you know, presumably
thought about what could happen if
you're recording things at your front door. But then we have to ask, does everyone who delivers a
package to your door consent? Does everyone who visits you in front of that camera consent? And I
think even more, or sorry, less explicitly, it's the people who walk in front of your house, right? Everyone who passes by that camera, including cars and people and dogs and whatever, children,
they're all going to be part of a digital data set now accessible through your app.
You know, we cannot actually make that decision for other people.
And I think that is something that has come up now.
I know this has come up for some people living in, you know, strata,
because people living in the strata don't want, or do not feel it's right,
to have that kind of addition to one's house without their consent.
And so on the one hand, it's about the property owner, the Ring doorbell owner.
On the other hand, it's about the community in which that person or that household lives.
You know, this is hardly a new subject.
I mean, it was almost 80 years ago that George Orwell warned us about this in 1984,
the surveillance state and all of that business.
But, you know, I guess for a lot of people, it feels right now like we are neck deep in this.
And it's almost past the point of no return to reverse any of this.
Do you agree?
So given my background in looking at human rights activists and what they do
in the world, then also looking at various social movements, there's never a
good time for norm change, for social change, for political change.
And so you could ask the folks who started the human rights regime, you know,
the one that I referred to in the 1940s:
was it too late?
I mean, we just experienced the Holocaust.
There were lots and lots of things that
governments were doing to people that we think of now as unacceptable or as torture that back then
were part of political activity. So in that sense, no. I think the realization is that it's not
just about government surveilling. I think it's about the fact that we actually have companies who are doing the work of the data collection,
companies that are investing resources into developing AI systems that could better evaluate and analyze those data.
It's not about the state versus individuals and communities now.
I think it's also about how companies and their products have really enmeshed themselves in our lives, for good and for what I would say is
at least worrying, and something that we should be talking about more
explicitly rather than saying oh this is just about privacy or this is an AI
issue. I think it's really a data issue.
Sure, but if you're on the internet,
you're presumably there forever, and I presume you believe we have the right to be forgotten.
The question is, can we get from there to here?
I mean, yes, people do presume that, and I think the right to be forgotten is an important right.
I think the impetus behind that right is correct,
which is that we should have a choice as to whether we are digitized or not. But the right to be forgotten is exercised after something,
after data have already been created. And that's my number one sort of caveat
around that right, which is that I think the step should actually come before the
data are in existence to be forgotten or to be erased. We should
actually be very mindful. We, meaning people,
and we, meaning governments, should be doing things to set guardrails for what types of data
ought never to be collected, or ought to be collected and then properly disposed of, if that's possible.
I hint in the book that that might not be possible. Data are really hard to track.
But right now I do think that, yeah, it's really important to think about whether data should exist or not, what types of human data should exist, and then set the
proper guardrails around protecting our rights. Like when is it that the existence of data
actually threatens our ability to exercise our rights that already exist?
Well, to follow up on that, we're trying to figure out who the bad guy is here. And if you read 1984,
it's the surveillance state. But today, is the surveillance state the biggest threat to our privacy
and our human rights, in your view?
I mean, you know, if you were to rewrite George
Orwell's book, I think there are multiple parties to be concerned with. And a lot of times we call
that other party, besides the state, big tech. And I think that
describes a certain set of companies that are very powerful, they drive the global economy,
and they really underwrite the way that we live our lives today in terms of
providing just the basic technologies that enable all
kinds of human interactions and exchanges. So it's not that
anyone's bad per se, but I think
that as a political scientist, one of the things that I worry
about is a disproportionate or unequal distribution of power in society. When
one or more actors become so much more powerful than other groups or individuals
in society, well, that's what human rights were set up for: to guard against the
excesses of the state. And I think now we need to think about how we might want to
preserve some of those key values that human
rights were trying to protect when now we have big tech digital companies also making incursions
into the way that humans experience their lives.
Wendy, let's go back to first principles here for a second.
And I'm sure you remember an occasion where I think it was Mark Zuckerberg who was testifying
before a Senate committee in Washington, and an octogenarian senator looked at him and clearly didn't
know anything about Facebook, because he said, "This is free, right? How do you
possibly make any money when this is free?" I think most people understand now
how they make money despite the fact that it's free. But let's go there, let's
follow up: how do they make so much money out of knowing, for example,
that I like to go to the local ice cream store at eight o'clock on a Wednesday night for a cone
before heading home?
I mean, there's a huge data market. And this is why, you know, it's not just
what we experience. It's not just these devices I focus a lot of attention on in the book.
It's also just the way we're living. I mean, people create products, they create services,
they improve what they're doing based on what they can ascertain from the patterns of behavior
that people leave behind through these digital data traces. And so I think it's that, you
know, we wonder how much money there is, but there's an enormous amount of money in that, right, in the sale and exchange of data. And so one of the things that
I think people really struggle with is if you regulate data, what's going to happen to these
very powerful, very rich companies that have made a lot of money in the exchange of data?
And my rejoinder to that is always that these are very clever people who have come up with innovative
products that have fundamentally changed how we live. And if we give them more
regulations or more restrictions, I don't have so much doubt
about their capacities to work through, you know, different types of
restrictions on the use of data. I think you're asking exactly the right
question. And at the same time, it's hard to imagine that data
could be so valuable, but they are, right?
They are.
People have called data the new oil, for example.
We also tend to think about data as detritus or exhaust.
So we have this very bifurcated and conflictual relationship to data.
Do you think Europe is dramatically further along on the path of protecting its citizens from all of this
than we are here in North America? I think they've made headway. I think that they have really tried
to define the potential harms, what they're calling the risks in the EU AI Act, around technologies like AI. What I don't see as much movement on
is to think more about how potentially
it's not just companies that are bad guys
in this situation, as we talked about.
Governments can also abuse citizen data.
Governments can also use AI systems to discriminate
or harm citizens as well.
And so what I see coming out of Europe, on the one hand,
is encouraging.
I think the AI Act is great.
The Digital Markets Act and Digital Services Act,
which cover how big tech companies operate within Europe,
are a step forward.
However, with the way that they think about data through the often
discussed General Data Protection Regulation, GDPR,
I think one of the issues is that the data have to be identifiable. In order to delete or port
your data, which is an important set of rights in that legislation, you have to be able to find those data. Yet we know that a lot of times
data become more useful not just at the point of collection, the initial stage, but you combine
and merge different data sets together. So where does that go? What happens to the de-identified
data that is not immediately traceable to a single individual, for example? How do we think about
that? How do we regulate that?
That is not possible under the current European model.
Well, let me bring the story home for a second.
We don't have to tell anybody who lives in northern Ontario or rural Ontario
about what it's like not to have this service when they want it.
And we're spending as taxpayers hundreds of millions of dollars
to try to bring broadband to places that don't have it.
How do we ensure that those citizens have the same access to the service that people
who live in the cities take for granted while at the same time protecting their rights?
That's a tough one.
You know, I think you're absolutely right in speaking to the inequality in terms of accessing critical
technologies like broadband internet or high-speed internet.
So on the one hand, that's an equality issue. But it's a
different kind of issue, as you point out from the ones that I
talk about in the book. So this is why I focus a lot on this
idea of literacy in the book, thinking about how citizens and groups within society and governments should be creating policies around ensuring that citizens are literate in the digital age.
Now, in the book, I talk about data literacy because I'm very focused on how data fundamentally change human life.
But others talk about AI literacy or more broadly digital literacy.
The ideas are very similar in that right now,
there are very few people who are experts on this stuff.
They're the ones making the technologies
and drawing the inferences
from the vast pools of data they've collected.
The rest of us are a mix of somewhat literate to,
I would say, not literate or illiterate
in digital technologies and in data.
Which is not to say that, you know, I want to cast any aspersions on people, but it is to say that our systems are not prepared to teach us about the implications,
uses, and potential downfalls of having so much data about people. And also I would add the benefits, right? So it's not all negative. But I do think that our
ways of thinking about literacy, and what it means to be competent and understand what's going on in society, are locked into, you know, the analog way to think about things, which is reading, writing,
and arithmetic.
Even if data or data literacy is a right, most education begins for most people in their early years of life.
So how do we ensure that all generations have access
to data literacy?
Yeah, this is, again, a big question.
So a lot of people are working on the K through 12, right?
Ensuring that kids are entering their adulthood
equipped for the rest of their lives with these skills.
And I think that that's commendable and that's absolutely necessary.
But most of us aren't kids.
And so we need a way to deliver this kind of information, to sort of make data
less scary to most people. I think there's a tendency to think data
are numbers or math, which is true.
But there are also ways to understand how data work, why data are created,
how data themselves come into being. And that could be transmitted. In my ideal world,
policymakers would come to realize that we have institutions that have already passed on literacy
skills quite effectively. These are public libraries. Public libraries are really well
positioned. They're embedded in our communities.
People use them all the time, not just for books
and video, but also just to hang out.
They're accessible.
They're in all these different communities
for the reason that they are accessible to the public.
And I should add that librarians are uniquely positioned
in our society because they are what I call
the original data stewards.
They're the ones who take all this information that humanity has created and create order
around them.
We can go to a library and find anything we want because librarians have gone through
the task, perhaps sometimes thankless, of organizing all that information into coherent,
accessible and easily and readily available resources
for the community.
Terrific.
We want to thank Wendy H. Wong, the author of We the Data, or it could be We the Data.
You say tomato and I say tomato.
No, never mind.
Let's call the whole thing off, Wendy, and thank you for coming on our program tonight.
We're really grateful.
Thanks, Steve.
It was such a pleasure.