Tech Won't Save Us - Canada’s Digital Contact-Tracing Experiment w/ Bianca Wylie
Episode Date: June 27, 2022

To kick off a new monthly bonus series on tech in Canada, Paris Marx is joined by Bianca Wylie to discuss Canada's COVID Alert app, the problems with the digital contact-tracing experiment, and why we need a public post-mortem so lessons are learned for next time.

Bianca Wylie is a partner at Digital Public, a co-founder of Tech Reset Canada, and a senior fellow at the Centre for International Governance Innovation. Follow Bianca on Twitter at @biancawylie.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.

Find out more about Harbinger Media Network at harbingermedianetwork.com.

Also mentioned in this episode:

This series on Canadian tech is made in partnership with Passage, a left-wing publication in Canada. Passage published an edited transcript of the conversation.

On June 17, 2022, Health Canada decommissioned the COVID Alert app.

In April, Bianca wrote that the government needed to shut down the COVID Alert app because it wasn't working (if it ever had). She also began writing a series on the app on her Medium blog that month.

In July 2020, Bianca and her colleague Sean McDonald were already asking questions about the app and the planning around it.

Other digital contact-tracing apps were launched in many other countries, including Australia, France, and Iceland, with poor results.

Support the show
Transcript
The fundamental frame is we as a society and a culture
need to be at a point in time
where not doing technology is an option.
Hello and welcome to Tech Won't Save Us, or should I say Tech Won't Save Canada.
You might be wondering why an episode of the podcast is dropping today instead of on the usual Thursday when the episodes come out. Well, that's because this isn't the usual weekly
episode. This is a bonus episode for a new series that I'm doing on tech and the tech industry and
critical tech analysis in Canada. So trying to dig into some of those issues that are particular
to Canada itself instead of the usual discussions that I have that are more international focused
or, you know, focused on the United States and what is happening down there. And so I'm really
excited to be able to explore this. And obviously I can do it because the membership drive that I held in April for the
podcast second birthday hit its stretch goal. And so I was able to do a bonus series and I've really
wanted to look at what was happening in Canada. So that's what this is all about. I'm also partnering
with Passage, a left-wing publication here in Canada on this series. And obviously the episode
itself will still be
posted here on the feed where anyone who subscribes can listen for free. But then I'll also be writing
an article for Passage based on the episode so that these in-depth conversations about tech
issues in Canada can reach even more people. Now you might be wondering, okay, if this is a series
about Canada and I'm not in Canada, why should I listen to it? Well, even though these episodes will be about things that are happening in Canada,
I think that there will be a lot of relevance to international listeners.
Like this episode, I'll be talking to Bianca Wylie, who is a partner at Digital Public,
a co-founder of Tech Reset Canada, and she's also a senior fellow at the Centre for International
Governance Innovation. Now, in this episode, we talk
about COVID Alert, which is one of those digital contact tracing apps that was launched in Canada
by the federal government in partnership with many of the provincial governments around the country
with the idea that people would be able to download it to their phones, it would keep track of the other phones they came near, and then if one of those people tested positive for COVID, they would be able to get a notification allowing them to go get a test.
So for people who are international listeners, I think many people will be familiar with
these apps, might have also had them launched in their own countries, like in Australia,
like in France, and will be familiar with the fact that they didn't often work as they were
being sold to us in the beginning. And at the time this app came out in Canada in July of 2020,
you know, a number of months into the pandemic, there were already other countries that were
experimenting with them. And we were already getting early indications that they were not
working as they were supposed to. Yet the Canadian government and provincial governments within Canada
pushed forward with this approach
and really kind of adopted this idea
that we could rely on this technology
to help us deal with the pandemic
while placing less focus on kind of the human infrastructure,
the human contact tracers
who are really essential for that process.
And we get into that in this conversation. Just before recording this episode with Bianca, Health Canada
announced that it was finally going to shut down the app. And that's really important because we've
known for a long time that it hasn't been working the way it was promised, but then it continued to
just exist, even as PCR or lab testing became far less available in Canada after the beginning of the
Omicron wave. And so the app was not functioning and certainly couldn't have been said to have
been functioning properly in the past number of months. And so Health Canada finally shut it down,
which is a positive thing. But the way that Health Canada and the advisory council that was overseeing
the app have framed it is very worrying. And
Bianca explains why. Because as we are examining these kind of tech solutions to health problems
on a broader level, we need to recognize when they are not actually delivering on what they
are promising. And at this point, with this COVID Alert app, what we need and what Bianca is calling
for is a public postmortem on the app to
see what went wrong, to see what was right maybe, so that we can learn for next time and so that
the government can be held to account for this technological solution that it put out there
and that it promoted to the public as a serious way that we could help address this public health
crisis. So I was so happy to have Bianca back on the show. She was on one of
the first episodes that I did way, way back in 2020. And to discuss this issue that is, I think,
really important, and that is probably not getting the attention that it deserves. And before we get
into this conversation with Bianca, the podcast and this bonus series are only possible because
of support from listeners like you.
So if you enjoy the podcast, if you're happy to hear that I'm doing this bonus series looking at what's going on in Canada, your support is essential.
So please consider going to patreon.com slash techwontsaveus and becoming a supporter so I can keep doing this work and making it free for everybody.
So with that said, thanks so much and enjoy this bonus conversation.
Bianca, welcome back to Tech Won't Save Us for this special series on tech issues in Canada.
Super nice to be here, Paris. Thank you.
I'm really excited to chat with you again. Obviously, you've been on the show before to
talk about Sidewalk Toronto and that whole situation where Google tried to plant its
footprint down in the city of Toronto and was
eventually scared away by residents and activists like yourself. A very hopeful event and one I
think that many listeners of the show were following and certainly were happy to see.
But now we want to talk about a different issue today. And, you know, for the past two and a half
years, we've been in this pandemic. It has certainly evolved over the course of that timeline. The government in Canada and many governments around the world have taken different actions over the course of that period to, you know, try to protect the public from this health crisis that we all faced. And that, you know, kind of came out of nowhere. That was a surprise to us, right? And that has really changed so much about the ways that we live. And one of those actions that they took in Canada was to launch an app
called the COVID Alert app, which was a digital contact tracing app. Many people, I think,
will be familiar with the idea of these apps, even if they weren't in their own countries.
And certainly Canadians will know what the COVID Alert app was. It was launched in most
provinces and territories, but not all.
But most Canadians would have had some kind of interaction with it.
Now, this was launched in July of 2020, rolled out to most of those provinces and territories by October.
And just recently, June 17th of 2022, the government and Health Canada, which, you know, kind of oversaw the app, finally shut it down and decommissioned
it. So what did you make of that moment when the government finally came out and said, okay,
it's time to shut down this app. It's done. Do you think that happened at the right time? Or
was it too late? So it's significant that it happened. And I'll speak to the timing,
but it is significant that it happened because they could have got
away with not decommissioning it.
The governments could have got away with that.
So I do want to start by saying it's significant that that happened.
And seeing a government move through that cycle of launching something, maintaining
it, and then shutting it down means that there is a lot to learn from that.
So it's great that it happened.
In terms of timing, I believe it was late. There are different points we can look at where it likely could have been decommissioned. So it could have been done sooner. But we can talk about that. But I think
that not only is it great that it happened, it's great for a government to be held accountable
and to hold itself
accountable to do it. And also to understand that that's a step in learning in the future.
You can do that. That's part of a trust building exercise. So I think it's really significant that
it happened. We can talk about the details of it, but I want to really stress that that's,
it's significant. It happened and it's great that it happened.
Absolutely. One of the things that came up in
your writing was that a number of months ago, you know, you've been writing about this app since
it was initially announced, identifying potential issues with it. But one of the things that you
were writing about recently was how the app kind of came back to your attention recently,
when a friend of yours mentioned it to you, that it had still stuck around, and, you know, was not really right for the context anymore, and I guess we can question whether it was ever right for the context.
But I guess, what did you realize in that moment? And how did it return to your attention?
Yeah. So when this friend of mine said that they and their young family had contracted COVID,
they wanted to report that somewhere. And I think that was an important piece
of this story because we've stopped collecting data about COVID, which that opens a whole other
set of questions and issues, right? But when they realized that they were trying to report it
through the COVID Alert app, and then they realized that they didn't have a PCR test result, which was required for them to report or log their case.
And then realizing, wait, they can't even get PCR tests anymore because the testing capacity had been reduced.
The general public could no longer get a PCR test as of January and February of 2022. And so that struck me, because when he shared that, I thought: if you can't get a PCR test anymore, that is the one core requirement of that app. So how is the app still up? Because
we can talk about the app and how it wasn't working before we had testing capacity, which is true. But if there would be
one item that at the beginning of launching something like this, where you would say,
if this part breaks, we're done. I think it would have been reasonable to have had that
as something that would have caused the government to shut it down right there. You know what?
The core requirement is not available to the public.
So we have hit an issue that we should have, and it would have been easy to understand this from
the beginning. If you don't have that, you have to shut it down. Because what happens after that
is this sort of availability of an app, and there's no clear story around the status of it. But that's what tipped me off was when I
realized, wait a second, you can't even use this anymore as it was designed. If that wasn't a
trigger for the government to pull it down of its own accord, that should not have required any
public flagging of the issue. That should have been something that in a good policy framework,
you would say, oh, this event has occurred. Now we have to take it down because it's not functioning as it was
designed. That's very basic product management. I wonder if this gets into the point about the
advisory council that you've written about as well, how there was an advisory council established
to oversee this app to make sure that, you know, it was functioning properly, that it was being held to account, right, as it was unleashed on the public in the middle of this
public health crisis. But then about a year afterward, as you've written in your work,
the advisory council, its term expired, as I understand it. And then there was not that
accountability and the app still continued. You know, what is the concern with something like that, where this app is out in the public,
people are trusting it to deliver a certain function during this crisis that we're all
facing, and then the body that is supposed to be holding it accountable is no longer
doing that?
Yeah, this is a really important point because when we think about trust, which we see it
come up again and again, and it's happening right now
with law that is being tabled around the use of technologies in Canada around privacy and
artificial intelligence and consumer protection. There's this constant narrative about trust,
right? Is that, well, we have to have trust and if people trust the tech or trust the government. So let's think about trust in a public health crisis, okay?
We need to think for a minute here. We've got Health Canada, the one that has launched this app and is accountable for it, right?
And if we go back to the beginning,
not only was Health Canada launching this app,
which in a pandemic, we have to trust very deeply in what Health Canada tells us to do.
We also had the Privacy Commissioner at the federal level, information and privacy commissioners at the provincial level, and an advisory council, all stood up, all watching over and saying, OK, we are the oversight.
And we know that this has to be effective from a public
health perspective in order for this experiment to make sense. And we need to use the right
language here. This was an experiment done within Canada during a public health crisis.
That is also something to pin for a moment, and we'll return to it, because this was experimental.
There was no information other than
information that these apps didn't work about how this was or wasn't going to work. And that's very
important. So let's go back to July of 2020. When you have a government, you have the prime minister,
you have the premier telling people you have the public health, telling people, download this app to protect
yourself, protect the people you love. And when people are scared and you are told by the public
health authorities to do this, to protect people you care for, it is reasonable that people would
listen to that message. What we need to understand is that when that message was shared, there was zero evidence that what was being said was true. But when you start something like this, you can understand good intentions: a public health authority saying, do this to protect people, because if this works, it's something we should try. Okay, so let's extend a little bit of that trust and faith at the beginning.
By the fall of the same year, it was clear that not only did we not know what this app was doing,
but there were significant issues within Ontario, which is where the bulk of the activity happened,
with basic function around getting notifications into the app. And I don't want to get mired in it.
But what I want to say is that by the fall of 2020, we had governments still saying to use the app
when it was clear at that point in time, we're not in the same rush as we were at launch,
that they did not have information to say that it was working or not working.
Because when we say working or not working, that doesn't mean did you download it successfully? That doesn't mean did you upload
a one-time key, which was part of the workflow. What it means is, what is it doing to impact the
pandemic? What is it doing to protect people from coronavirus? Like that was not information that the governments had.
And from the fall until just now, they were continuing to tell people to download this app,
even when it was at a very new level and a deep level of being broken in 2022. So the language I
was using around this was that this was misinformation. Because I think it's very
dangerous for public health in Health Canada to be telling people to download this app when
its efficacy was unknown. But also, what does it mean for people to think it's working?
What does it mean for people who have downloaded this app to say, okay, well, I've got this app,
it'll notify me if there's an issue. The government did not describe the change in behavior of this app, particularly in 2022,
when the testing capacity was gone. So we had months. Now, have we done research into whether
or not people had a false sense of protection from using the app? Not that I see in any of
the reports so far. So I think this
shows us not only was there a confidence in the government to assert something that they did not
have information about at the beginning, and as the information grew, they still didn't come out
and explain with public numbers what the efficacy was. I'm not saying it's zero, but what I am saying
is we didn't have information at all.
And only just now, at the end of this process, when it's been decommissioned, is there a little bit more data available. So I think, from a narrative and trust perspective, this stuff is really
dangerous for democracy, because it's not about the tech, or did it work or did it not; it's that the public health agency during a pandemic, in crisis, can continue to encourage and repeat a message that does not line up with what was happening in reality and the information they had. That's deeply disturbing.
And that to me has not been addressed in any of what I've seen in the final reporting so far.
So long answer, but that's sort of where this all is in terms of trust and messaging issues.
I think you lay out so many serious concerns in that answer, right? In describing what actually
happened with this app. I want to return to what we've been hearing as this app is decommissioned
in just a second. But before I get to that, I want to extend what you're saying with that question, right? Because, you know, what people were told was you download this app
and it will notify you if you've been in contact or if you've had an exposure with someone with
COVID-19 so that then you can get tested and you can know whether you might have the virus, right?
And so then you won't spread it or, you know, you can take the health measures that are necessary in order to try to protect yourself, right? I wonder,
did this app then, in promoting it in this way, distract from the real work that needed to be
done? You know, placing a technology and an app in the place of, you know, the real work of human
contact tracers in the public health system that actually needed to be
done in that moment. And so then what is the kind of implication of seeing apps and these
technologies as a solution without examining that other side of it? Paris, this question is so
important. And let's imagine that we put the same energy into N95 mask communications. Imagine, multi-million dollar launch, comms, endorsements,
people out saying do this. If you hold these two things beside each other in terms of efficacy,
impact, all of it, it is deeply, deeply disturbing. Because what we did, and this is why we have to keep thinking about the word experiment: we knew, throughout the course of this, as our knowledge grew, what worked. What we have to do as a society with public health is invest in the things that work as a priority. If we go back to the beginning, it's part of why this app was so misaligned with the truth on the ground. We had spreads happening in long-term care homes. We had them
happening in prison. We had them happening at work sites, which by definition helped us understand
what supports do people need? They need to be able to stay home. They need to be able to be
supported in having personal protective equipment, all of
these pieces. Those were not prioritized. And instead, there was this technology introduced.
And I think the reason this is devastating is because the public health communications,
it's difficult. And what we need to know is that we need to be able to have repeatable messages.
We need to be clear.
What is it? What works? It's not just one thing. I think we can also agree that the vaccine discourse
really got binary in terms of, you know, not helping us understand the breadth of what we
could be doing to support and help each other. And so to take all those measures that we could
have been teaching and learning
about together and to fail to assign importance to them and to communications, I mean, to hold
that up against the idea that we prioritized or introduced the concept of this app as something
that was so important for us to all do for each other. I mean, that messaging really belonged with a whole bunch of
other interventions. And I think what's really difficult about that is that the governments know
to say we never said it was a silver bullet. It's one thing amongst others, you know, like,
it's not a panacea, like, of course, it was framed that way. But really, it's distracting when we know we shouldn't be experimenting.
And I know people want to.
I understand the idea.
Maybe there's something here that might work.
And this gets into ethics questions that we don't have good public capacity to get into,
which is, well, is one life not worth it?
You get into some really difficult discussions that I don't
think we're particularly well positioned to have about these things. But when you work with policy,
you have to understand trade-offs. You really have to understand there's only so many things
you can do, so many messages you can hold at once. And so putting this up to this level,
the timeline, you know, you think about what we could have done and should
have done differently. It really had an impact on how we understood what was happening. And not everybody, but a lot of people, knew right away, you know, like, I think the public has a pretty
solid instinct on some of this, where the lessons that the government saying they learned from this
are different from what I think the public might say they've learned from seeing this app happen. But yeah, those are some things that come
to mind when you raise this, because we need to open up to that level of conversation, which is
to say, you know what, if we have a number of priority messages that we should be sharing,
where does this fit in that scheme? And what are the harms of maybe going down this road when we
haven't put full investment into the other pieces?
So it's a really important point. Really important.
No, absolutely. I completely agree.
I do want to come back to, you know, obviously we've had this decommissioning now on June the 17th.
As we record this, it's just the day after.
So there hasn't been too much processing time. But what are you seeing in the statements from Health Canada and in the final advisory council report that is standing out to you in terms of how it's being framed? The high-level frame I'm looking at right now is really the idea that this was a good idea.
It's good. We did it. And then here are the minor tweaks and adjustments.
And I think that's just as a starting point, an incredibly incorrect read because now that we have efficacy information, which was also available prior to
the pandemic, I mean, there were other applications that had been used in other countries and other
situations. To me, this high level frame, you come out of this and you own it and you say,
this didn't work. This didn't work. So you can see that the capacity for the government to
decommission it is there, but on terms that actually tell a different story than the one
that we really should be talking about, which is this didn't work. These are some of the reasons
why. So let's look at those. It's important that we make space for all opinions here around like, okay, well, but
let's say someone really feels this is something that we should do in the future.
Let's look at the pieces of it that on a technical level didn't work.
But it's just as important for us in our society and our culture to hold space for the idea
that these types of things don't belong in our public health responses in the future.
And I think given what we have
needed to learn about public health, it is a collective undertaking. If there is one message
you hear over and over from across the board in public health, this is a collective undertaking.
And if we hold that truth, which is very important, against the kind of technology that this is, which is an individualistic, on-your-phone
app, I think it's reasonable that we should say this kind of tech does not align with the general
precepts of public health. It just doesn't. It doesn't mean that we don't look at different
technologies, but it is reasonable to me in 2022 to say these types of things do not work.
Here are a lot of reasons why.
If we want to consider mitigating them and thinking about how we might look at this discussion the next time it rolls around, that's fine.
But I just want to say, as a general frame, this report does not say: this doesn't work, and we are not going to proceed with more of these in the future. If I were to see that high-level lessons learned, that's when my trust starts to increase in government. What this report is, is a back
filling. It's a way to tell a story that makes it okay that this was done. It points out bits
and pieces that enable it to be
done again in the future and strengthened, but it does not put on the table that this is not what
we should be doing. So we can get mired in a lot of the details, but I think the fundamental frame
is we as a society and a culture need to be at a point in time where not doing technology is an option. It really is. And very specific kinds of it.
And that's not an anti-technology stance or, you know, that feeling that way doesn't mean you're
against technology. I think what it really does is it respects the idea of context.
So what this means is: this is context. This was a crisis, we had limited resources; how were we going to use them? That's a very democratic policy understanding. And I just
have one more thing I want to say about what I've noted in this report. And this points us back to
British Columbia and Alberta, both were provinces that did not pick up on this app. Alberta went
its own way and did an app that
was different, but British Columbia didn't. And part of the rationale that came out over time
was they didn't want to burden their public health teams with this additional work. And they knew that
they were already stretched, and this is noted in these reports. And, you know, it struck me that I wanted to say that I am happy the public health communities within, say, the province of Ontario, even though it went against, you know, what was supposed to happen to support the app, prioritized the work that they knew they had to do first and foremost.
And I think that helps us remember who really is at the heart of making things work or not. It's us. It's us as
workers and it's us as residents and people who are in this crisis together. And that stuck out
to me too, was that this was burdening a set of workers in public health with something that
wasn't functioning. And I think that's important. And I don't want us to miss it because there's a
lot of
other stuff going on in these reports as we, you know, do this debrief and think about it for
lessons learned. But it really stood out to me that there was a province that knew, no, we have
enough going on. We're not adding this. And I think that's interesting that there was that authority
to say, no, we're not participating. And that's functioning in Canada. Like that's interesting to me, but also that the public health workers, they should be the ones if they don't
feel vested in making this function, which was happening from the fall of 2020, they were not
fully vested in solving some of the technical problems. That's a major flag. They know,
you know, so I think that was sticking out to me as
I was reading the report. It's easy to get mired in the numbers and the downloads and what did we
do and what happened? And is it our fault because not enough people participated, which is a little
bit of the narrative that gets pushed when it says, oh, we should enhance communications next
time around. But let's never forget the people at the heart of the crisis response and their relationship to these technologies. And I think there's a big
message in that one too. Absolutely. You won't find any debate on this podcast that there might
be times where we shouldn't be looking to tech solutions when human solutions work better.
And just to kind of pick up on what you're saying there by contrasting the provinces,
one of the things that stood out to me as I think about my experience during the pandemic is that, you know, I was in
Newfoundland and Labrador, which was part of a group of provinces that took a different approach
in, yes, we did have the COVID alert app, but we shut down our borders for much of the early part
of the pandemic in order to stop the virus from being able to come in. And there was, you know,
I don't know what the media was like and what the discussion was like so much in the media in
Ontario and Quebec on like a more local level. I only followed the national part of that. But over
here, it was very well recognized that the human contract tracers were essential to keeping us safe
and to finding and identifying those cases that did breach the
kind of border restrictions and then ensuring those people isolated so that you know we could
continue to have very few cases in the community and so when i was thinking about the app and the
covert alert messaging what always came back to me is even though the government said like we have
this app and it's signed on with the federal government here in the province, I think it was important that that
recognition was there. And I don't know if it existed the same way outside of, you know, our
few provinces in the Atlantic part of the country. So in Ontario, I'm in Toronto, you know, Ontario
was the first province to launch the app. And there was a conversation at the beginning around what does it mean to receive a notification on your phone instead of getting a phone call from a person?
And I think it's important for us to think about the efficacy, firstly, like as you've been mentioning here in terms of like manual contact tracing.
And this being something that couldn't perform the same because of the privacy
and all of the safeguards that were built into this app, it also made it so you couldn't replicate
what this was based on, which is people knowing who they were in contact with and being able to
get in touch with them. But let's think beyond the efficacy, and this being less so: what it means to get those notifications when you're by yourself in a crisis. And I keep coming back to this, this emotional place that we've been in, continue to
be in, exhausted, grieving, so many things. And the idea that we should be supplanting or switching
how we work together and support each other in a really hard moment. And I think like that,
that was the piece of this, like looking at prior
process and then looking at what we're trying to be pointed toward. So not just the reduction in
the labor and the quality, the experience of that and what it means to have someone hold your hand
when you're getting scary news that may then knock, you know, have these knock-on effects where
you need a wraparound of someone to tell you, okay, here's what you do on this front.
Here's the kind of supports you can have over here.
Oh, there's children in your home.
Here's the thing about this.
Like you think about what it means to be supported by another person
who's trained to do that work.
It's so important.
So I think losing that piece of it,
which is different and related to efficacy
is really dangerous.
And I think like the sort of lack
of what happened in Ontario,
at least understanding, when we talk about research that wasn't done, how many people stressed our testing capacity because they got those notifications. Like, we need to know: testing capacity was a requirement at the beginning, but what did it mean to flood the province of Ontario with people who were getting these notifications, which were
based on science that had already evolved from the time, you know, it was understood what distancing
vaccines, masks, transmissions, like there were so many things that had shifted and it's sending
these notifications out. Firstly, there's a human impact when you get that notification.
But on a systems level, what did it mean that you had people who were taking those tests
that for the most part might not have needed to take those tests?
And who are those people that were getting that notification versus, you know, there's
a large population.
We know the history of what's happened in Ontario and across Canada.
We know this is not an equitable
situation of who's suffering the most from COVID. So I think like you see how every time you open
a little door and look at what this shift or addition or evolution of how we, you know,
provide services, it just has myriad negative impacts for those that were already previously not in a position of strength
to get good community support. So, you know, it just, I think maybe the story that's difficult
to tell here is what isn't written down, what wasn't researched, what is unknown, because when
the government frames everything from how contact tracing would now be,
I think they would try to say, oh, we're not trying to change it entirely. We're just trying to
supplement it. Right. So when they say things like that, it's like, no, hold on, wait,
there's a whole bunch of questions we're not asking because now we're following down that road.
And the last thing I want to say on this, and it's something about the influence and impact of industry, of Innovation, Science and Economic Development Canada. Like, the people involved in pushing this were really pushing it from an economic development perspective. And so while we know our whole
response to the pandemic has been in trying to keep the economy going, no matter the costs,
we also need to add this into that frame, which is this appeared on the scene because Google and
Apple had an interest in showing up and saying, hey, we're in a lot of people's phones.
What do we have to do here? How do we participate? That didn't come from any of us. That didn't come
from a history of working. And then you've got, you know, an Economic Development Council. That's
who oversaw our Advisory Council. It's, you know, again, an easy thing for the government to skip over because
you've got Health Canada talking, but who's acting? Where did this come from? What does it
mean that Shopify is there saying, hey, want some code? What does that mean for procurement?
So we just have people donate things and then because they're free, we're going to use them
for an experiment in the middle of a pandemic.
Like, honestly, some of these pieces, they don't snap into each other very easily. But I wouldn't, you know, it doesn't feel right not to acknowledge that the pressure to do this and the ongoing
pressure to do this again is coming from industry. It is. And of course, those within public health and others
have an open mind to everything. It's not to say good things don't come out of industry.
That's not the message. But really, we need to go back to foundational basics here on public
health response and make sure we're starting from there, including how we do testing, what our
capacity is, how we manage it, how we're
doing contact tracing. Paris, we don't have data anymore about what's going on just at all. And I
mean, if we think about what we've wiped out and now we're saying, but let's do more of this
anonymous electronic notification alerts, like it's really difficult to hold these things together
in any way that's coherent. I think there are so many good points there, but I think what you're
saying about this app and this approach being driven by industry is really important because
I think it plays into a larger question that is beyond this conversation about how technology and
tech solutions in healthcare were pushed a lot during the pandemic. And I think that is going to have
repercussions that we're going to have to deal with post-pandemic that I think our governments certainly are not ready to seriously grapple with, especially as there are issues with healthcare
funding and there are pressures to privatize and all of these kinds of things that we're dealing
with. But those go beyond this conversation. So I don't want us to get wrapped up in those,
even though I know we could talk about it for ages. You know, we've been talking about how
Health Canada and how the advisory council have framed the app now as it's being decommissioned.
In your work, you've been saying that what we really need is a public postmortem of the app
that really dives into the impacts that it has had. What should the focus of that postmortem look like?
And what is the risk if it doesn't happen?
Yeah, so we could think about a policy framework
that is based on a life cycle of a technology.
I think it would be so helpful for us to think about this framework
well beyond like this public health intervention and technology and just generally in government technology and public technology.
Because it helps us look at what happens with the product, which is you think about it, you design it, you launch it, you have to maintain it.
And often you decommission or sunset it.
This is very normal. So you cannot
just set a policy to launch something because once it's launched, you got a whole lot of other
operational things that you need to keep on top of. So I'm introducing that as the concept that
if we think about policy as a life cycle, because technology does not just go away. We know this, right? But that's,
it's important to stop and say that good intentions, you know, well, let's just try,
needs to also have, let's just try and plan for if it doesn't work. So at a very high level,
thinking about a life cycle and thinking about those public benchmarks, which are important. So let's say there are probably two or three really critical components of what the framework you need to have in place looks like before you launch an app like this, right? The first is, do we do this or not. I know it was going fast. I know people wanted to try something. I know they were
hopeful. That's all fine. But you still cannot be ahistorical. You have to say, okay, hold on.
What's happened so far? These are the potential risks. Let's look at the trade-offs and have a
very clear way to have that conversation. Because instead of that, what happened with this app was as long as it's privacy preserving,
let's go.
That cannot be repeated because efficacy was not discussed.
And efficacy absolutely has to be there.
And efficacy and privacy have a relationship to each other, particularly if we're going
to talk about public health, where, and I need to be very clear
in this, I don't believe this type of an approach in any circumstance is a good idea, these personalized
apps. That's me, one person, really, who cares about that? So then let's say, if you're going
to get into the conversations about these apps, you have to say, well, look, for this to actually
work, your privacy approach may not be the one that you would do again.
You may need to have more information. Then when you start to get into the realities of how this
may or may not work, you can have a proper trade-offs discussion. So let's just say,
step one here, you have to have a, do we do it? Do we not? And that includes, do we build it?
Do we buy it? How do we manage the procurement?
Because we need to hold on to accountability. We cannot just ingest technologies because they were
donated to us, right? So we completely skipped a gating step of procurement, which in an emergency
gets worse. So we've seen a lot of stories about what happens in an emergency, you know, for tendering. So step one, big box around it, is the "do we do this, do we not" analysis, not "if it's privacy preserving, we're fine." So let's never replicate that "if it's privacy preserving, we're fine." It's a factor in a
bigger matrix, right? So that is step one of this lifecycle. Then before you get out of there, you say, under what conditions would
we pull it down? How do we know it's working? And how do we, and this is critical, this is sort of
the, what's the plan for post-launch? You have to publish your benchmarks for when you will shut it
down or change it or adapt it before you launch it. Because if you don't do it before you launch it, you will always
have a story to tell as to why you should just keep trying or changing or evolving or adapting.
This is called path dependency. It's very difficult if you don't set that up at the beginning
to stick to it. So that's this really important part of the first thing, which is establishing.
And I know it is hard. Please don't think I'm saying it's easy to say the numbers. You have to though. What does it mean when it's working? What do we do if it's
not? How do we pull it down? How do we adapt it? What we saw with this, and this is all before
we've launched, okay? I've got two more shorter pieces. But another thing that we've seen in these
reports, there were two adaptations the governments wanted
to do. One of them was to add QR codes, which we saw in other countries, which would say,
if you go to a cafe or you go to a restaurant, rather than people copying your information down,
that you could scan a QR code. That was something that was on the table. That didn't happen,
but they were trying to do it. The question there would have been, well, what if it's only
voluntary, right? So that would have made it difficult. The other thing they were
looking into was wearables, which was to say, how can we hand out lanyards, bracelets, low cost
options if people don't have a cell phone? And I'm reading this going, okay. And there's examples of it. And they talked about using notification lanyards or wristbands. Air Canada did a pilot, and all of this was happening. And so they were really
taking this down another track, which even gets into different hardware. Like there's a whole
other market with wearables, right? I share those two examples because there was no path to figure
out if those were reasonable things to be adjusting toward. There was no governance
documentation for that. So that's the big chunk of it is the work you do before you launch something.
It has to say, do we do this? Do we not? What are the trade-offs? It cannot be only focused on
privacy. And it has to lay out that life cycle of how do we make decisions about shutting it down
and maintaining it? What do we do when
the public health guidance changes? What happens when people are vaccinated? What happens when
masks are introduced? Like there was no, there was no governance around any of those changes.
So big chunk of the work has to happen before it's launched. Following launch, should the decision
be made to launch something, you get into maintenance mode and
tracking. And this is where it is so critical to have this be public, public benchmarks, public
publishing of all of the things and not the data that confused us, which is what happened this time.
The numbers we were given were how many people downloaded the app and how many people uploaded a one-time key.
Neither of those numbers tell us what the app did for our COVID response.
They just say whether people downloaded tech.
I feel silly looking back at it that I didn't pop to the like, wait a second, this doesn't
even tell us anything.
So just to say, it's very easy for us to be given data and to feel like we're following
along.
But what we were given didn't help us at all understand efficacy.
So in maintenance mode, critical that we know what those benchmarks were before launch so that we can follow along and then say, hey, it's not working.
Hey, all of us are in this together.
Government, public health.
We see we wanted to try it.
We see it's not working.
Pull it down. That has to be public. We had these privacy commissioners, advisory councils, they were supposedly looking at information we weren't seeing,
then there was an audit done, that's still not public from Health Canada on if it was working
or not. So in the middle there is critical publishing benchmarks, holding accountable on whether it's working or shut it down. Like, I'm glad to end on
that note, because that did happen here. We have very capable people working in technology in the
government. And I think this was a difficult thing to watch, knowing that there was really good
people who as technologists would know this thing should not be here anymore. Right? So at least
it's this proof,
hey, we can decommission something.
It can happen.
The world doesn't fall apart because we said we try something and it failed.
No problem.
And technically we can support that.
But politically, that's where we need to have
the expectation that that can happen.
And I think that held up this being decommissioned
for a very long time, the political consequence,
which is so
different than the technical, right? But I think those are like, those are some key components
of what we would need as a framework from before we would do anything next time. And people will
have a good thing to say back here, which is, we were going fast. That's fine. But this is important enough that let's spend the time on that in between, not on ratcheting up how to, you know, do the same thing again. So that's sort of the what-we-need-for-next-time as a starting point. There's lots of other things I'm sure people can add and expand. And the only other thing, which is separate, is to do a proper postmortem. Because just at a glance, a day after seeing what I've seen so far,
this story is not getting at what actually happened in the ways that we need to think
about efficacy and taking this stuff seriously. This is providing a story, the government is
sticking with its story from the beginning right to the end. And that is not helpful for us to work in real terms in public health with real
information. So that this so far looks like a political undertaking of a lessons learned.
I would suggest we do a proper public postmortem, driven by the public and sharing lessons
from other places in the world. Because some of these things in here are true. Yes, the provinces
needed to be held accountable. Yes, we were, you know, like there were problems there.
But you could have known those things without an app.
We know the provinces are in charge of health care delivery.
So like there's no surprises in here that, you know, that we shouldn't be working on always anyway.
So, yeah, I think that's the crux of like the what should we try to have in place, but also a real public postmortem,
because what we're going to get from the government is not going to be expansive.
And it's not going to get into those issues that we touched on, which aren't anywhere.
They're not written down because they don't track with what the rhetoric is here.
You know that this is a good thing.
We just need to try a bit harder next time.
Absolutely not.
I'm not saying that it shouldn't be on the table.
But if we want to have a proper
conversation about it, we have to be able to say, this didn't work. So one track is not doing these
kinds of technologies again. And if you want to say, we want to try it again, then that's when we
go down that road of, well, what do we put in front? What's the framework? What's the maintenance?
And what's the shutdown plan? I think those are such essential points that you've laid out for us to actually get an
understanding of how these technologies and these technological experiments, especially when coming
from government, should be approached and should be understood. And I think your final point there
about the politics of it really leads me into my final question that kind of broadens us out from
the COVID alert discussion to a broader look at,
you know, how the government approaches technology. And that's basically, you know,
what does this whole experiment, this whole saga, tell us about tech governance in Canada,
and how the government approaches technology? So a couple of things. Firstly, what we know
is that government accountability in a democracy in Canada,
we have Westminster accountability, which means you have to find someone who's in charge of the
decisions that they can be held accountable, right? And what we also know from talking to
different public servants over the course of, you know, the COVID alert experience, but more broadly, is that we don't have a comfort in
saying something didn't work politically, which is why it's such a big deal that this app was
decommissioned. Because in long historical terms, what you need to do with technology
is fundamentally misaligned with Westminster accountabilities,
which is really deep and complicated. And I don't mean it like philosophically deep and complicated.
I mean, in some really gnarly public administration terms, what we know about technology and what we
know about democratic accountability, they do not map to each other. They don't. And so we have to
be honest about that. Because I
had some public servants say, you know, in the last few months, they're like, of course,
they're not going to pull it down. They can't admit failure, that would actually be counter,
like, they're not incentivized to acknowledge what has happened. And yet they did. So that's
encouraging. But also, there's an entire public administration, democracy and government situation
here, which is when you launch something, you plan one way, all the accountabilities go a certain way. There's a federal provincial piece to it. So I just want to say that what this says about our government to me with technology stuff is ideologically, it is pro technology. It is pro technology as an economy. It is a way that it wants to be seen. I think this entire app,
you could look at it as a public relations campaign to say, hey, look at us. We're doing
new innovative things, which captures imaginations around the world. This is a long, big problem.
But I think the way that it intersects with actually holding governments accountable
is something that we need to like hive off and get out of the technical privacy conversation
and get into the like, no, there's actually a misalignment here. Like if you're going to be pro technology,
you're never going to be able to align with what good technology standards look like.
So we have to pick what we're doing there. I think secondly, just more broadly, the idea that you can
perform with technology, right? Where you can say, well, we're doing this app and it's good for you and
protect you and your family. Or we've got ArriveCAN, which is this mandatory app, which is a big problem and different than the COVID Alert one, which was voluntary, and that was a good part of what was happening with COVID Alert: a mandatory app at the border. And we are somehow trying to normalize the idea that major events like travel or entry
and exit to a country, much like public health response in a pandemic, that engaging with an
app on your phone is an appropriate way to manage such an interaction. And we have to keep an eye
on that because this is the normalization and creating norms around things
that are not actually beneficial to us. They're not. They're not. And if we don't challenge that,
you can see how the ratchet happens. So I think we can see the enthusiasm for technology. We can
see it being applied in different ways and trying to, you know, just to keep ratcheting it up and normalizing. And I think if we think about how we don't have a place to challenge these things in
terms other than go with it, or you're against the technology, it's like, no, no, no. We have
ways to invest in people and we have ways to invest in our services to pull off the same, you know, outcomes in terms of like coming in and out of a country or supporting people during a public health crisis.
Like they do not require apps. They just don't. And I think it's if I think about this too long, it gets you know, it's pretty dark. The stuff isn't good. It's just not good technology,
even if it's deployed properly or accurately or, you know, or it functions. And so we need to
figure out how to move the priority investments into what we know works. But that's a long fight.
You know, I think people have been trying to fight for that in so many different ways for so long. And we just have to realize that we need the shared language to challenge this without getting mired in the privacy or efficacy or whatever else. It's just, even if you think these things are good and they work, that do we protect the space to say, even if they do work, we don't
want that to be the way our, you know, that this country operates, we just don't want it, it's a
choice. And we need to figure out how we, you know, continually engage both sides of it. Because I
just feel like with a lot of the tech stuff, if you're not into it, it's almost like, well, it's
a normative good, how could you not want to do X, Y, Z? And so many of us experienced this, but closing that gap continues to be the long work,
but I think that's it. I'm happy that you closed with that point because I think it really gets to,
you know, something that this podcast is all about and that this series is about in looking at
technology in Canada and, you know, how we approach technology and how the government
thinks about technology. So I think it's a really essential point and I'm really happy that you made it.
Bianca, great to talk to you as always and to discuss such an important issue that didn't get
the attention that it deserved just because we were in this public health crisis where we just
needed to take on whatever might work. And now we have the opportunity to at least look back at that and learn the lessons.
And I think the work that you're doing
in making us focus on the COVID Alert app
and what happened is really key for that.
So thank you so much for taking the time.
Thank you for that.
Thanks so much for having me.
It's great to talk to you as always.
Bianca Wylie is a partner at Digital Public and a co-founder of Tech Reset Canada. You can follow her on Twitter at Bianca Wylie.
You can follow me at Paris Marx and you can follow the show at Tech Won't Save Us.
Tech Won't Save Us is produced by Eric Wickham and is part of the Harbinger Media Network.
If you want to support the work that goes into making the show every week,
you can go to patreon.com slash tech won't save us and become a supporter.
Thanks for listening.