Tech Won't Save Us - How Tech Distracts Us From the Bigger Picture w/ Bianca Wylie
Episode Date: April 30, 2020
Paris Marx is joined by Bianca Wylie to talk about the response to COVID-19, how governments' emphasis on tech solutions ignores (and potentially entrenches) social inequalities, and how we might take control of technology to ensure it works for the public good.
Bianca Wylie is the co-founder of Digital Public and Tech Reset Canada, and a Senior Fellow at the Centre for International Governance Innovation. Follow Bianca on Twitter as @biancawylie.
After listening to the interview, consider reading Chris Gilliard on luxury surveillance, Jay Pitter on forgotten densities, and Nora Loreto on long-term care facilities.
Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter.
Support the show
Transcript
There's something beautiful about being optimistic with technologies.
There's something really gross about being ahistorical with technology, right?
And that's what is bothersome right now.
Hello and welcome to Tech Won't Save Us, a podcast that wonders why the government seems
to be more reliant on tech giants with every crisis that passes. I'm your host Paris Marx
and today I'll be talking to Bianca Wylie. Bianca is the co-founder of Digital Public
and Tech Reset Canada and she's also a senior fellow at the Centre for International Governance
Innovation.
You can follow her on Twitter, where her handle is at Bianca Wylie. We talked about the problem
with how governments are responding to COVID-19 by centering tech companies and tech solutions
instead of investing more in public services and ensuring that we have a robust social structure
in place to help people who are falling through the cracks. I really think you're going to
like this conversation, so make sure to listen the whole way through. And obviously, if you like it,
when it's all said and done, please leave a five-star review on Apple Podcasts. That really
helps when the podcast is new and only launched in the past few weeks. Thanks so much and enjoy
the interview. Bianca Wylie, welcome to Tech Won't
Save Us. Hello, thanks for having me. A few weeks ago, the government of Canada announced that
they had made a deal with Amazon to kind of manage the logistics of delivering their emergency
supplies around the country. So when you read that story and sort of the very few details that
have come out since, what has kind of been your reaction?
Yeah, a couple of points.
I think the first one was reminiscent actually
of something that happened with Sidewalk Toronto as well,
when it was announced.
The prime minister announced it and sort of announced,
we're doing a deal with Amazon Canada
in the middle of this emergency
and sort of looked like, just be happy now because I said that,
right? Like it was sort of this, like that there's somehow like a normative good associated with that
announcement, which is exactly the way I remember the sidewalk Toronto thing. It was sort of like,
when, say, a cat brings you a dead animal kind of thing, like, hey, look what I've got. And then,
you know, it would be perceived differently depending on where you'd be situated to the
announcement. And so that was the first thing:
the announcement, without detail, was supposed to be something that
was good news, but it had no details in it.
It was just we're working with Amazon Canada, be happy.
So I find that interesting because I think it's that power of Amazon Canada, sort of the idea that the brand is somehow now
something that, you know, gets political endorsement from the federal government of Canada,
which was on the heels of the Amazon incident with Chris Smalls, where they had a, you know,
there were leaked notes about the executives at Amazon,
including their general counsel, smearing a worker. And it's one of those moments where you're like,
wow, so that happens, that's making news. And then I turned to our federal government,
and that's the brand that they're expecting us to be happy that we're doing some sort of pandemic reaction with.
Right. It's wild. It's wild to me. It's wild because it's just suck and blow.
Right. It's like, well, you're holding two completely conflicting moments.
And so those are two things.
And I think the third thing for me was that there was a sort of immediate flurry of why aren't we using Canada Post?
And that's because they hadn't released enough details around what the thing was, because later,
in subsequent publications that day, they explained that they were going to be working with Canada Post and
others. But it was almost like they sort of create this confusion by not being clear. But then what
happens is because the flurry goes around sort of the
logistics and its relationship to this sort of what people think of when they think of Amazon,
then the thing that nobody's talking about is, okay, hold on. You're saying they're doing this
deal at cost. You're saying you're using their online store. You're saying nothing about what
they're getting access to that has all kinds of intangible value
Right, so I saw this thread, I wish I could find it, on Twitter a couple weeks ago, which was about
the United States Postal Service, and somewhere in the middle of it was this really specific example
of, like, the amount of information and, like, logistics, both information that is held by
the institution, you know, there's a lot of institutional knowledge there.
That's just one strand of what is known when you get to look into and over systems.
Right. So that's just one. Never mind, what is Amazon Canada now privy to throughout Canada's response to this?
Like, what is it? What is it getting access to? What information and knowledge is it getting access to?
And it's being framed as like we're doing you a favor, Canada.
Right?
So, I mean, I think these are the pieces where, just between the rhetoric and the narrative
and, like, what's missing, and then the lack of clarity on what it is.
There's no history of the tender.
I can't get a number.
I mean, it could go either way.
It could be a number.
It could be like almost nothing. And then they get to say, oh, look, we're doing this, you know,
the sort of idea of doing something at cost. So there's just there's a lot in there that's missing.
And, you know, again, it just sort of fades into the background because no one is sharing
details. And the last thing I'll say here, there was an article about it,
which, and it's common, basically said that the details of that tender were covered under the Privacy Act. So you can't get access to what it is. So here we are, just be happy and believe that it's good for us. Like, this is, again, not the way of a government that's talking about being open. You know, for me, these are always the same issues with governments.
It's like, you're always making hard decisions.
Explain what you're doing.
Show your work.
Like if it's defensible, no problem.
But don't just do the like, well, we're just leaving all this stuff out.
So those are my pieces there.
There's really unfortunately not enough. I'm going to keep trying to find stuff out about what that contract is. All that I know
is that it was, you know, through the COVID-19 response sort of action that the government is
undertaking. So there's an entire suite of tenders related to this. And I think there's definitely
like open contracting and others are watching this because how these tenders are going down
in the middle of crisis is something that definitely needs more attention.
I'm not surprised at all that these contracts are kind of covered by the Privacy Act, right?
Because we see the same thing when, say, like Uber makes deals with municipalities and stuff
like that, right? All of a sudden, we can't know anything about it because they're dealing with
this private company, blah, blah,
blah, right?
Right.
So it's really frustrating. And it presents these larger questions as we consider, because as you say, this is just
one part of a larger effort by governments to respond to the COVID-19 pandemic, right?
And increasingly, I feel like we're seeing this response shifting toward kind of like
a technological framing, especially when we're considering things like contact tracing and
how to keep track of where people are going.
There's this growing effort to kind of frame it in a way where these are things that are
going to be handled by our smartphones.
Technology can solve these problems for us.
And they often leave out these larger details or seem not to consider these larger issues of
the broader effects of these technological solutions to the COVID-19 problem.
So when it comes to the efforts of tech companies and governments to use technology to respond to COVID-19 with contact tracing or in other ways that you've noticed, what are you seeing with that and what concerns do you have with the way that's playing out?
Yeah, and well framed, because there are a lot of concerns, and a lot of them really have to do with the lack of political imagination that is being shown here.
So I'll start by saying that there's this persistent social problem we have,
we being, you know, I would say people who aren't in control of how governments are defining the use of technology in society,
or companies, and that's happening, you know, they're doing that together,
where the public response is always framed as sort of mitigating or defending against the worst
version of a thing. And, you know, so with the contact tracing app, right, someone, and a
lot of technologists too, and a lot of, you know, policy and legal professionals as well, sort of decide that there's an inevitability to it.
And then go in with that frame.
And I don't think it comes from malice.
I think it's very well intentioned.
I get it.
But what happens when you put so much force into that frame is, to me, this is how you solidify the status quo. So I'm finding it very frustrating that
you're having some beautiful, encouraging, and I mean, amidst a time of dire crisis, like this is
a, there is so much pain right now every day. But in that there is this sort of discussion of,
you know, there's a portal, this is Arundhati Roy's, you know, like this sort of, what could,
what could the next version of things look like? And it's not like before. And the only way to think more that way is to be defining and designing how we're using our technologies into the future. So it's frustrating to look at well-meaning people that their lack of political imagination is now defining my reality,
you know, because they're holding power. And I think this is like, for me, it always goes back
over and over and over again to the power of rhetoric and narrative, but also how do we
dislodge the status quo? And I don't mean to say that as a pejorative. I, and I know some people
do. I'm saying it as like more of the same coming from more of the same with more of the same frame and more of the same power.
You're getting more of the same. Like, this is all you're going to get. Right.
And so I feel like sometimes it's more the opportunity cost of a lot of these things. Right.
It's not always to say that the thing you get and this is a smart city question as well.
It's not to say the thing you get is necessarily the worst thing in the world. It's not that.
It's that you then lose all of the opportunity to do 17, 29,000 other, you know, however many
variants of different, you know, things you could do. And that loss is what hurts, right? And so I
think this sort of inevitability, like this is the thing,
like this, this also this focus, like with the technology of the contact tracing app,
it's one of so many different ways you could use technology to respond to people, right?
Like, let's think about the fact that for decades, I mean, where I live in Toronto,
Nasma Ahmed at Digital Justice Lab has always been talking about access to the Internet, access to the Internet.
Right. So you got everybody at home now.
What are the good things that could be done?
What are the equity conversations to have about access to digital services?
And how would we be replicating what's currently a divide by bringing in some of these technologies?
Like it just replicates what's here now. So I think the challenge for a lot of us who are maybe like critical of some technologies is also talking, you know, just realizing how we're
not talking about all the different ways we could be using tech to support needs to get in and
support people to make sure people get money they need easily. Like this is a digital service moment for governments.
Like there's so many ways to support each other with technology.
So the idea that we're like drowning ourselves in time on this one,
like one little version of the technology.
Efficacy isn't even the conversation we have.
Like this is the same problem in the smart city.
No one's even talking about what the thing's even doing. The whole conversation is about, you know, minimizing privacy problems.
And I'm not saying that's not one of the things, but it's like, wow, how do you get so far down
this really, really narrow path? And then what are all the things you don't talk about? Right.
And so there's a lot to say in there. I mean, I think my heart broke the other day.
Like there's things that I don't have great mental tools to deal with.
And the one that happened to me yesterday, I have two children.
I had a little moment, like many people at home with kids are having. It's challenging, right?
You're at home with your kids.
You're trying to do all the things like you were doing before in some cases.
And then you start to think about children that
are in homes where there's violence in the home. And now the one little window
of time where they were maybe out of that home, with, you know, people who care about them
at schools or other institutions, is gone. And so where are we spending our money on the shelter capacity and, like, the spaces for
people to just, like, tomorrow morning be better? Like, how do we turn the volume
down on these extremely obvious, you know, problems that we know how to resolve? And then
we're turning the volume way up on these kinds of, you know, technologies or
solutions for things that no one, not even the people who back those apps, is saying they're going to
solve on their own. Like, that's not even the idea, but just the political energy and
attention and the headlines and the debate about that is completely eclipsing, where we should be
focusing a lot of our energy and time to like help people tomorrow morning, right? Like those are things we know. And so I would just, you know, end that point on saying
that the scarcity of our politics right now is so dangerous, because it makes everybody want to
just have that narrative that well, well, we should do this thing, because they know that
they don't have the resources to do the things we should be doing. Or technically, you know,
technically, that's not true. But from the, you know, let's keep things in the status quo,
you're in a scarcity mindset of saying, well, all of that stuff's super expensive, like
public health infrastructure, testing, like all of the things that we should have.
So you understand that political inclination to some sort of like technology that makes people
feel like something's happening. But that's terrible, right?
Like that's scary and that's what's playing out right now.
So there's a lot of different bits and pieces in there.
I have another thought about sort of the term luxury surveillance, which Chris Gilliard
has talked about a lot.
Sure.
He's hyper visible on Twitter.
And he wrote about this idea of luxury surveillance with smart city
technology, you know, like the idea that you would welcome more surveillance technology, because
why not, right? Like if it means controlled access to your home, or, you know,
surveillance that would be used on others, but not yourself. So surveillance in that sort of
administrative term, you might say, well, these are features
that I enjoy because I'm safe from their harms.
Um, I feel like that's, you could take that concept and just pull it a little bit into
this sort of app situation where you have people being like, well, I understand there's
a little bit of this issue, but you know, I'm okay with it.
I'm going to download it.
Right.
Like, and I know who says things like that.
And then you think about people who are way more at risk if their phone becomes the sensor that could be used to define where they're allowed to go by the state, by police. We're in emergency
measures times as well. Right. So this gets all the way into military potential for like,
if you really want to empower these devices that way, you have to think about what it means in,
you know, in your daily life. And I think a lot of the voices that you hear that are saying,
yeah, okay, well, I'll do it. No problem. It's because it's luxury surveillance to them,
because they know when they walk into the hospital, they don't get questioned about whether
well, maybe you're having a panic attack. I don't know if you saw that story the other day.
Black woman in America trying to get help, trying to get tested, and is not getting access
to health care multiple times.
And unfortunately, she died.
And it's heartbreaking.
So these technologies do not land in everybody's life the same way.
And I think that's the piece that also gets lost
is you sort of get the sort of luxury surveillance people
being the ones who are out saying,
no, it's fine, it's anonymous, it's decentralized,
it's X, it's Y, but that's not,
like, so even if you're okay, you really,
and I think this is this moment in our politics
to understand, like, how do you make sure that you don't just think about your context with these things? And I know that's hard.
Like, I'm not trying to say that I'm always great at it. I'm the first one to say it's not the way
I wake up in my life, it's like feeling my life through everybody else. But it's work, you know. So
I think sometimes we're not sure what our political work is. That's part of our political work is you got to show up as though it's not your luxury use. It's someone
else's like completely different, dangerous, dangerous, you know, life circumstance that
this stuff can enable. And I think that that has been too contained a space where it's like
civil liberties people, you know, there's, there's obviously
people who've been fighting those fights forever, but how do you turn those fights mainstream?
You know, like, how do you make it so that that's the kind of argument that lots of people show up
with and say, yeah, this might be fine for me, but it's not fine for my neighbor. And so, you know
what, I'm not in, and I'm going to use my voice for that, right? Like that's, that's part of our
work with these sort of political technologies. And it's not easy. So I'm just, I'm not trying to say like, oh, we should all just do this. And
it's obvious. But it's definitely, I think that it's instructive as to how we could be thinking
more. And I think Chris's term there, like that sort of idea of a luxury, luxury surveillance
really helps you think about, wait a second, like, how does this land? Right? Is this something I'm
comfortable with? Because I'm safe? Right? Like, I think it's helped me. So I think it was cool to
see it kind of move in my mind from the smart city context to the health context, and it still
applies. What you're saying kind of reminds me of one of the slogans from the Bernie Sanders
campaign when they were saying, like, fight for someone you don't know, right? Instead of just
caring about yourself. And I feel like that's so important. And when we think about the inequities
in cities, when we think about how there was this narrative in the early days of COVID-19,
that kind of like everyone has like an equal risk from the virus, right? But we know very clearly
that that risk is not equally distributed. There are essential
workers who are at much greater risk of contracting the virus. But the early numbers that we're seeing
out of the United States also show that it's racialized people who are being affected and
poor people who are contracting the virus and dying of the virus at a much greater
rate than higher income and whiter people. And so I feel like focusing on these tech solutions
kind of ignores these inequities that already exist. And again, like as you were talking about
with like, do people even like have the smartphone to take advantage of this service and
like, who is it actually going to benefit if it's made available? And I feel like we're not, or
at least the people making these decisions don't seem to be, asking these questions, because again,
it does seem like by making an app available for contact tracing or what have you, it's potentially benefiting the people who are already at lower risk of contracting the virus.
Obviously, there have already been these concerns that rich people were getting tested when other people were not and were kind of paying for access to tests. And there was a story the other day that
there's even this like wealthy community down in San Francisco of these like tech elites.
And they not only paid to get themselves all tested, but then they like use their connections
to get one of the universities to set up like a medical study in their community. So they would
keep getting, like, tested, right? So they get this
like really privileged access to not only health care but, like, the tests and everything else that
everyone else really needs right now, and that should be available for everyone, not just the
people who can pay for it. Like I said before, it's just wild to see how this all plays out.
And I think your point about how the technology allows political leaders to kind of not so
much focus on the political questions and like the questions of public services and
are we funding our health services properly?
Are we funding our social services properly?
Are we funding homes for domestic violence, homes for like, you know, children who are
in danger, getting homeless people off the streets and
providing them with homes, right? Like all of these things don't seem to be getting as much
attention, because now we're going to solve it with this new app.
Exactly. And you touched on a couple of things in there. And I think another piece to highlight,
Jay Pitter, and you know, she's in Toronto, she's working all over the place in
different cities in North America and elsewhere.
She wrote a piece that uses the term forgotten densities.
And I think I know you got people around who are urbanists listening.
This is a fantastic piece because it really talks about, it was sort of in response to, I won't try to summarize this piece, but just the term, to say there's density where, you know, whether it's
like factory dorms, prisons, like super dense neighborhoods where the lodgings are super dense,
you know, rooming houses, like so many different versions of this, and what the consequences are
within these forgotten densities for the people living them. And so I'm taking that term because
it's so helpful. Like I think it's another example of what comes next, and of reorganizing talks about cities and urbanism around ideas like this, because this is where it is, right? Like this is where it is. And on the long-term care situation, there's also Nora Loreto.
She's been tracking across Canada the incidence of the COVID stuff in seniors' homes and other density, right?
And so the thing is, knowing how much is happening within these known densities, right?
And some of them are institutional, whether it's care homes, whether it's, I've seen lots of outbreaks popping up in, like, homes that care
for the disabled, places where, like, you know, where this problem is hitting people, like,
you know, and like you've also said, forgotten densities, like we know where this is hitting
people in cities, and we know where they are, like, it's known. And so not putting your response directly to that is, it's just, it's immoral,
frankly, because the idea of sort of coming up with an app when you already know your problem
definition is not that. It's not the why, it's not everybody at the same time with the same problem,
right? So there's something there that's also quite politically,
you know, unfortunate to see that, like,
it's clear who is getting hit harder with this stuff. And it's also clear that the history of elevating
and putting, like, testing, right to your point,
putting the tests to those communities
as priority people for testing and containment
of this outbreak.
That's not happening. And it's like the opposite of that is happening.
So when you introduce a technology that, even in its best-actor scenario, is
just telling you where you should be testing, and you're not even doing that properly, right?
Like there's just so many dominoes there. There's something beautiful about being optimistic with technologies.
There's something really gross about being ahistorical with technology.
Right. And that's what is bothersome right now, is watching a whole bunch of people be like, well, maybe for the first time ever, we're going to use this thing in a way that, like, raises people up, and we go right to where the need is highest, and
we prioritize, you know, the people that are suffering the most. Like, that's not how this has gone before. Like, now it's going to go that way? It's not there.
And so I also think that this idea about the institutional layer of this is such an important
one to flag because governments and institutions and government
as an institution, like knowing that, you know, the types of housing that are at higher risk,
the types of care facilities that are at higher risk, the types of shelters, you know, where
homeless people are staying and spending their time because that's their option right now, right?
Like those are the spaces where you would go in and go quickly because, you know, you have an
institutional relationship to those things in a way that you don't have the same one in a liberal
democracy with every person. Like, there's much more complexity in the relationship between the state
and the individual in some ways than there is between these sort of institution-to-institution
responsibilities. And I think that's interesting, because it's just watching governments, like, go to the most edge scenario to focus their energy, when it's like, you have a direct channel with your money to these institutions, like, you fund them.
How are you not following those pre-existing sort of chains of responsibility and logistics to like make sure you're helping those people? Right.
So it's like all of those things add up together, to me, to this real negligence story. And it is not just now. I think everybody who,
you know, knows any recent history, like, we have been setting up for this for decades.
So it's sort of like all those things coming together at once, you know, from a,
what the state should be doing and could be doing to what it's, what it's doing on a daily basis
right now, like it is not showing up in its best version of itself, for sure.
Certainly not. So often when we turn to tech solutions as the solutions to our problems,
really, instead of actually looking at the deeper roots of the problems and the more political
aspects of the problems, what ends up happening is that we don't solve the
problem that we're seeking to solve. We actually further kind of entrench that inequity that is
kind of at the root of the problem in the first place, right? And I think your focus on the
forgotten densities is so important because it does illustrate that there's this broader conversation happening about
how there's this concern on the right that dense places are where you are going to have a greater
chance of contracting the virus. And then there are urbanists pushing back and saying, no,
the dense places are where we have the least possibility of kind of like dying from the virus, right? They're actually
the safest if we do it right. But that kind of polarization of the debate on it kind of misses
the nuances of what's going on there, which that article on the forgotten densities really
pulls out really well by showing how it can kind of affect like public housing that has been
neglected and all these sorts of things, right? I think that's a fantastic article and I'll make sure to put it in the show notes so
people can go and check it out. Beautiful. But I feel like that brings us into this
conversation of what the future of the city should be and how we should be focusing on the future of
the city. And obviously this has been, you know, a mutual concern for us, right? And you have done a
lot of work on Sidewalk Toronto and
pointing out the flaws of that kind of smart neighbourhood project that's being proposed
in Toronto. And I know this has been like an ongoing kind of fight for, what, two plus years
now, I guess. So can you just kind of give us an idea of, firstly, what the Sidewalk Toronto
project is like briefly, and then what are the biggest
issues that you're seeing there? The quick way to describe what it is, I think I just take it back
to the beginning in 2017, where Waterfront Toronto, which is a tri-governmental public agency,
for lack of a better word, it's a public corporation, but they're in charge of a whole
whack of land on Toronto's waterfront. And they're charged through
the three levels of government in Canada with its redevelopment, and also to have an economic
development plan for it. So not just, let's develop this real estate as sort of a shared public asset,
but also, let's think about how we revitalize this, or vitalize it, because it's,
you know, fallow land, a lot of it now, how do we turn it into a place where there's, you know,
where there's an economic piece to it too?
I raise this because they actually have a mandate for economic development,
which gets missed a lot.
The major reason to explain why I'm starting with Waterfront Toronto,
who is the partner in this work, is it's not the city of Toronto.
So this is a project that's being done through a real estate corporation.
And Sidewalk Labs, which is a
sister company to Google, and both of them are subsidiaries of Alphabet. And so in 2017,
Waterfront Toronto put out a tender and said, hey, there's this 12 acres here. We'd like to
get some proposals on how to develop it for the neighborhood of the future. And we want it to be
many things, sustainable, economic development, we want to have innovation, you know, on and on.
All the key words.
All the key words.
All the words are there.
This comes full circle to a problem with narrative.
So all the words are there.
And it was announced at the end of 2017 that Sidewalk Labs had won this tender and they were going to design the neighborhood of the future on Toronto's waterfront and build it from the internet up.
What they were saying there was, there were sort of images of where there would be data collection
and also images of all the different kinds of like non data driven technologies, like tall
timber is something they're talking about, you know, building a lot, they talk a lot about tall
timber. So long story short, the idea was, let's, you know, get this company to design this neighborhood of the future. They have
a 50 million dollar US budget to go through that process, and it has been, you know, now that was
October 2017, so yeah, more than two years. It was one of these situations where you have a
whole bunch of people responding to a government asking a corporation
to design a neighborhood. And sort of just, the opening salvo of that was sort of, again,
it's that normative thing with Amazon, like, hey, we're gonna have a company, and it's a subsidiary
of Alphabet, design a neighborhood of the future. Is everyone excited? Good. Okay,
let's go. And like, completely drove over the question of hold on, did the people of Toronto
sign off on this tender, that that was a thing we wanted to have? Like, hold on. And that question
has never been resolved. But the big point to make here, and we've been through years of many
drawings, many plans, many discussions about
how you use the data, how you don't, da, da, da, it's not about this, it's about that. Like,
it's just, it's hard from a linear perspective to bring you up to today. But long story short,
it did not land in the way I think everybody thought it might. So there wasn't just this
general, yay, there was like, hold on. And I think in the point of time with the tech lash,
there was a factor there of a growing unease with, you know, tech companies moving into all the things, you know, it's not just, it's not just urban planning, it's health, it's like, you can
feel the the implications of these big, powerful companies, sort of moving into public life in ways
that, to my mind, none of us ever have asked for. And it's
just sort of happening because, you know, it's not being challenged enough by the state, to my mind.
So long story short, there's been a lot going on with that thing, and it's still going. And I can
tell you that the point in time where things really changed was October of last year, 2019.
The two parties signed a sort of,
we're going to keep this going agreement with each other
and that's basically where the media story took off
and diverged from reality.
The story that was making headlines
was Canada beats back big tech
or this thing is reined in now.
We've set rules.
It's fine.
Good job.
This is such a Canadian-style thing too,
the insecurity. Just like, oh no, we got this. We're fine. No problem. And the tech company being like, oh, we should have listened. You're right. Yeah. Let's listen. We love listening.
Like you watch it. It's a cliche. You're watching the thing play out and you're like, oh, my head.
And this is not to say, again, like with the health story, lots of well-intentioned people,
good urbanists, lots of like good ideas flowing around in there.
But the basic premise of, like, an Alphabet subsidiary designing where and how we live across multiple domains, and then procuring lots of other companies, it's like it was
just sort of this control loss that I think has been a problem since the beginning.
But the problem is that story happened.
And it's like everybody's sort of being like, OK, it's resolved.
Everything's fine over there.
The reality is it's exactly the same problem, frankly, from a governance perspective that it's been since 2017. And so this is really hard for me to watch because for me, one of the big arguments was
we don't have regulation to manage the idea of like this persistent data collection all
over the city.
Like you're turning the curb into a market.
Do we want to think about that before we've got someone else setting the rules for that?
Like this is public space.
These are public assets, like all kinds of them.
And now we're like mitigating someone else's product roadmap, because I'm not
sure anymore, you know, like, I don't even know if everybody realizes they're getting used for
product development for a company, which is what's happening. Like this is,
you have to look at this on levels. But long story short, there is no regulation that we as
Torontonians, or we as Canadians have put in place to drive something like this.
The city of Toronto is working on a digital infrastructure plan. It's been halted because
of COVID. So like, we don't have policy. We never had policy. We still don't have policy.
But now nobody cares anymore is what it seems like. Like now it's sort of like, oh, let's
pretend to see if they're going to keep going with this deal. It's like, of course they are.
They're writing it together at this point.
And so it's kind of wild from a rhetorical perspective to watch that like once everybody thinks it's fine
and it gets, and this is something that Zuboff wrote about
and she's got a really good cycle of words for this.
It's like, it's a real fatigue.
It's like a dispossession.
Like you kind of get to this point of people being like,
okay, well, you know what?
Like it's not going away. So like, what's the point? And there's lots of other things going on.
And so this idea, I'll just, you know, end here, that regulatory capture, like if you have a lot
of money and time, man, you can do it very slowly. And if you do it slowly enough, the scrutiny goes,
because right now the scrutiny is gone. And we're in the exact same situation we
were at the beginning, frankly, from a governance perspective. The public is not driving it.
There's contracts being figured out behind closed doors. And there's no public directive here. It's
just we're just mitigating what these two partners have decided they want to do. Like it's a
fundamentally anti-democratic situation. But for me, as someone watching it, it's kind of like a wild experience to go through,
like, okay, so everybody cared before. We're in exactly the same spot. And no one cares anymore,
because that thing happened that didn't resolve this problem. Like, that's not good. But that's
like, that's bringing you right up to, you know, the nose against the glass
for me, which is that Waterfront Toronto says that they were going to set up government task forces
and no one's answering me as to what they are.
And so it's pretty interesting to see how this goes down.
Because we should be saying how we want data and digital governance to work in our cities.
We should not be mitigating someone else's business plan.
But that's what we're doing right now.
And Toronto's doing it and acting like it's all good.
So it doesn't bode well, but I know culturally Toronto is also a perfect place for this because
no one wants to really make a move one way or the other from taking control of this thing.
And I just think everybody's tired of it. So you
know who will win out in a situation like that. And you've got all these political leaders who've
thrown their capital behind it. You look at this structurally, and it's very difficult
to make peace with the fact that the consent of the people was never required for this deal. Like,
I think you got a lot of people that are in a liability that that place was not set up for. Like, I'm still shocked that
lawyers that are paid with public funds are signing off on what's happening in Toronto.
Like, this is still shocking me. But sometimes I think I'm super naive. So here we are.
It's so frustrating. Like, obviously, I have been watching this more from afar,
like not on the front lines of it like you have been, and it's been so frustrating just to see this process, like, continue, and for
there to be all of these massive issues with it, and it just, like, doesn't seem to matter.
It's absolutely wild, you know? Yeah, well, we've come full circle to inevitability.
It's like right what we were talking about
at the beginning, right?
Like, I think it's this idea
that somehow you make it complicated enough,
long enough, you put enough experts in there
that somehow saying no almost feels like facile.
You know, like, it's kind of like, well,
and this is what the management consulting companies
have done over decades as well.
Like, there's just so much stuff going on over there that you're like, well, that can't possibly be a bad idea. And it's like, yeah, it is. But from an argumentation perspective, it almost feels too simple to say that. But I think, and I'm trying to pull us out of Sidewalk to say this is a frequent issue with all of this technology, is that, like, we don't look at
efficacy, and we don't look at whether it's doing its thing or not, you know, like, how,
how is that something that's allowed to be skipped over? Like, what is this thing even doing? Like,
all we're doing is mitigating issues we know about, we're not even asking, like, what it's
like, what is this actually achieving for all of us that we want, which is the way we should be
talking more about our technologies?
So I just wanted to say, it's that inevitability. The thing you're describing back to me, with my feeling, is that thing.
And that is the thing that I think we need to hone in on, that it's not inevitable, that we have other ways to build technology.
As a technologist, this inevitability kills me because while we could be doing amazing things, right? Like that's to put us on the rear guard,
defensive mitigation mode,
instead of all of our energy going to like
building what we want, that's a real letdown.
And I wanted to mention something to you
because of how much politics is part of, you know,
this podcast and other things is like,
I don't have a grand nostalgia for the state either.
Like nobody's acting right, you know?
Like, it's not something that we then revert to just, well, states do things too.
So I think this third way that we need to invent together is so important.
And it is happening through leadership of people in communities on the ground,
just like making things and doing things and proving them out
without the support of the
state either, frankly. Right. And I think that's important to acknowledge and to like applaud,
because sometimes what I get worried with some of the rhetoric around the state stuff, it's like,
I don't want some of these big companies entrenched into the state because the state
hasn't been acting right either. You know, like I just I think that's that's an important thing
to keep
real in our politics too. So like that third way is challenging, but it's going to take like
much more people power outside of some of these formal structures right now.
I think that is, is like a fantastic point, right? We need to pay attention to how these
technologies are being developed. And that doesn't simply mean kind of retrenching and looking back to a way that
structures worked in the past, right? And I know that we could go on and talk about sidewalk labs
for hours, I'm sure. But like, obviously, we also need to have a vision for what the future of tech
in the public interest looks like. And I think you started to give an idea for how that
might work right there. One of the ideas that I have been really excited by is this notion of
kind of being inspired by public broadcasting and kind of like having this public alternative to
the tech giants, right? But not simply approaching that in kind of this
state institution kind of way, but updating it and modernizing it for the 21st century
and ensuring that we kind of have teams of technologists that are rooted in communities
that are around the country that are responding to the technological needs of
those communities and helping them learn how to use tech tools, like developing technology in a
much more decentralized, democratized way. So that's a kind of an idea that's been exciting me.
But I know that this is an issue that you spend a whole lot of time thinking about and talking about and arguing for.
So what are the sorts of things that you're thinking about, about how we could kind of turn this around?
And instead of always relying on these multinational oligopolies to actually take control of this ourselves, make it democratic and make tech work for us.
So we're both thinking a lot about this.
And I'm going to share something that my colleague at Digital Public, Sean McDonald, said, and it just sticks with me every day.
And I might not get it exactly right, but it's basically, and we've talked about this before, but also here, how so much of research has been sort of suctioned into the companies,
and so little of it left on the table for the public interest, you know, in terms of intellectual
property, all of that. Like, we have a major, major, major inequity there, right? And
that's fascinating to me, because I always think
about math, that, you know, like, computer science and math are not inherently
capitalist. Not at all. There's nothing about it that necessitates its activity be commercial.
Having said that, the extent to which the infrastructures that we have right now
are owned and operated by commercial entities, like we're pretty far down that path. And so I
think one of the areas where we really need to think about our power is the governance of these
infrastructures in terms of how many of our critical infrastructures are commercially owned
and operated right now, whether it's hardware, you know, and a lot of the software that's run
through a lot of our institutions, that the opportunity to get more control over use is in its governance rather than in the idea
always of ownership. And again, this actually goes back to Sean. Some of Sean's thinking that's
helped me a lot was getting off the idea of ownership and getting into the idea of how it's
used. Because we can do a lot more in our governance
and to bring a democratic lens to the use of things,
which gets us off of needing to own them.
Like, I think there's something in there
that's really, really important
that across the board in terms of saying,
well, I don't need to own that infrastructure
to define how it's used, right?
So I think this is where there's opportunity
in both the regulatory spaces and in
governance, whether it's at a local, regional, whatever level to make rules. And I think hyper,
hyper local rules around some of how our technologies are used are quite a good idea
and quite possible. And it's even why, like, the democratization of things like standards, which are historically heavily, heavily corporate driven, but flooding the tables of, like, standards spaces with people that aren't commercial actors but are civil society actors, and saying, like, this software has to work this way or governments aren't buying it, full stop, and starting to use the power
of purchase, frankly.
There is so much public institutional money that has gone into a lot of this stuff that
we should be saying how it has to work.
And so I think that's quite encouraging because not that those tables are accessible and not
that people get paid to show up and sit there for three hours.
It's not to say we go from here to tomorrow morning, no problem. But there's an option. And that option
is process. And that, for me, is very encouraging, because of the commercial deficit
thing of, like, well, what are we going to do? Are we going to invest billions of dollars in public
computing? I'd like to, don't get me wrong. I'd like to buy all kinds of stuff and put it under public, you know, public control and use,
but let's be pragmatic and strategic right now and say,
is that what governments are doing? Like, I look at the US government, and it's that with
lots of other countries too, like they're
not making those infrastructure investments.
So then our next best thing, not to say we stop lobbying for them ever.
And we're all, you know, in our own ways, going to keep pushing that one.
But it's very important to know that this sort of defining how these things can be used,
which, just to go back to Sidewalk, is why it's so wrong, is that you should say how that stuff works.
And then people build tech to your spec. Like, we all need to know we have the power to define
how all these things work. It doesn't mean we have to build it ourselves. And so we need the
confidence publicly. And really, it comes to confidence, like letting people be able to say
in English what they want, and then having a translation sort of set of work to take that
and make that sort of a technical outcome that can be achieved. That's the kind of stuff I think
we can do. And I think it's important to look at
also like, between centralized and decentralized, there's also something really nice about thinking
of like devolved governance, like you don't want to have to be bespoke doing everything one off,
because I think it really makes it hard on people. And I think a lot of the open source community and
other like, there's some great alternative products to things, but you can't use them if you don't know how this stuff works.
You know, like, it's, and so there's not enough investment at some of the levels of, like, user interface stuff or, like, just general, like, accessibility to some of these tools.
How much community time can you ask for people to invest in this stuff, right?
I don't think it's the same across the board.
So I think there's a few different approaches in there that like of that stuff that I'm offering up. I think there's
lots more than what I'm suggesting. But I do think that this is why the talk about politics and the
talk about democracy and like, I go on and I am like team government and team that man, we got
this big institutional machine. It has been a violent actor. It's been a terror,
but it's also something that we shouldn't be throwing away
because we can make it better tomorrow.
And that's the work that I think we can do
through these institutions.
And I wish the public institutions would be showing up
and throwing their weight around a bit more.
I mean, we often talk about labor.
Look at the power of the public sector unions.
Like, oh, there's so much latent power
sitting there. It sort of reminds me of tenured academics where it's like, you guys have the
cover. Like, you're it. You're it right now. Like, there's not that many more people that are as safe
to be as political as you could be. Like, come on out. And so I think that's like, those are a few
different places where there's so much opportunity. Librarians, like, they pull a lot of weight. But there's room to sort of create these spaces, because otherwise it goes with the regulatory capture, and with the, you know, these companies haranguing and
threatening, well, I'm just not going to set up shop in your country if you're going to make
this hard for me, you know, like, I'm going to take my thing elsewhere. And so there's not a
lot of incentive for that regulatory stuff to just come from a political actor, it's got to come from
more public pressure as well.
And I think all those things together, there's opportunity there, but it's a slog, like for sure.
And look at how many other priorities we have, right? I think that's something we always think about is which fights you're supposed to be fighting. And there's a lot of ways to take that
access to the internet fight as the primary and see how much good could come from that. Exactly
to your point,
right? Because once you have that access, you start to get into different ways to use it.
I'd like to share this thing that someone at the Mozilla Foundation shared with me,
and it has stuck with me too, for a long time. A lot of the stats you hear about access to the
internet is about access through mobile phones. And this is differing across countries, but much like literacy, you know, literacy is reading and writing. And
like, I never think about literacy as writing, I always think about it as reading in my little
shorthand, right? When I hear literacy, boom, I'm thinking about reading, not writing. And so you
think about the same with computers, like it's not just access to what you
get, it's building it yourself. And you're not doing software development on a mobile phone,
for the most part. So like, how do you even make sure that you're setting people up to be builders
and makers and authors, right? And it's not that everybody should be a software developer,
or that everybody needs to learn how to code at all. But it's certainly that you have a much wider array of stuff that you can do with the Internet if you have access to be, like, a maker and, you know, someone who's able to use it as a creator, not just as a recipient.
Because I think even that idea of that like recipient piece is part of the inevitability. And like when I talk to people, so many people I know are like, well, I don't really like what's going on, but I'm complicit in it as a consumer. And I don't know what else to do.
You know, like, that's, that's got to be the most common thing I hear from people.
So the only way I think to start to bend on some of that inevitability is to make sure that the
things you talk about in terms of like, what people can do themselves, are also more available
to the people who know how to do them and who know how to do them offline.
Like you look at, and it's so politically challenging to talk about this properly. So I'm,
you know, forgive me if I say this wrong, but like, there have been people who have been thriving,
despite every single thing our society has done to them. And they still show up with joy, with love,
with care, with support. And so like, to take that kind of knowledge and bring it into other spaces is
this incredible opportunity for lots of us to learn from that. You know,
like it's, it's, you don't want to make something,
you don't want to make it sound like, Oh,
that's an opportunity because it's terrible.
It's saying like people have built a lot of their own networks around things
in, in the offline world.
And so like,
what does that look like to bring more of that into digital spaces? And what can the rest of
us learn from that? I never ever like to use the word resilience, because I was this wonderful
poster from New Orleans, and the woman's name is escaping me right now. But it's like, don't call
me resilient. It just means you can do more to me. It's so good. Like that word always makes me feel
uncomfortable, because I think that's such a perfect way to describe it. But the reality is and it just means you can do more to me. It's so good. Like that word always makes me feel uncomfortable
because I think that's such a perfect way to describe it.
But the reality is that let's shove resilience aside.
It's pure knowledge of like how to make things work
for a community despite terrible circumstances.
And so that's the kind of stuff that to me
needs to find a space more in our technologies.
And I think there's so much there.
And if people are willing to share, I mean, no one has to.
And not everybody's party to everybody's stuff, which is a whole other conversation.
But I hope you know what I'm sort of gesturing at there, that that's where I see hope.
And it's like we can fall behind some of that knowledge and learning.
We're not starting at zero with all of these things.
So yeah, we take our sociology of a lot of this in history
and move it into a technology lens.
And I think, man, we have so much to do.
That's just right there.
But we got to break out of this,
everybody else's inevitability problem,
which is hard when you go to the media narratives.
So I hate to bash on journalism
because they're doing a lot of good work,
but they also repeat a lot of stuff that is problematic. So thanks for making a space for us to not do that. And obviously, thanks for all of your work. You know, I think that that leaves us
in a really good place, right? Like the challenges ahead are large, right? They're great. But there's also a lot of opportunity if we can kind of build those collective structures and kind of come together to not just demand that our governments do better, but to start to kind of build alternative structures and to exert our power in whatever way that we can. Right.
Yeah. Yeah. And I love to say, like, we are the government. We're the government. Like,
we need to understand that it's us. It's not like, oh, you're bad over there. Like,
it's like, everybody has a different level of responsibility to what
we are as the government. I mean, it's not all the same at all, but there's a lot of people that are holding a lot of power, and it's just sort of latent
and just sitting there and just holding space status quo kind of stuff. So I want us to take
responsibility for that machine, like more of us and the people who are very close to like being
able to push it differently. I'd like us to feel more ownership of that. I know people who feel
zero relationship to that and that's completely reasonable as well.
But yeah, that's the thing.
We are the government,
so we are bearing its consequences.
And that's not great,
but also opportunity to your point.
So it's both.
Completely agree.
Bianca Wylie, thank you so much.
It's been fantastic speaking with you today.
Likewise.
Thanks a ton.
Talk to you soon.
Bianca Wylie is the co-founder of Digital Public and Tech Reset Canada.
She's also a senior fellow at the Center for International Governance Innovation.
You can follow her on Twitter as at Bianca Wylie.
If you like this conversation, please leave a five-star review on Apple Podcasts. That would really help us out.
You can also follow the podcast on Twitter as at Tech Won't Save Us. You can follow me, Paris Marx, as at Paris Marx.
And thanks so much for listening.