Tech Won't Save Us - Surveillance Won’t Protect Students w/ Chris Gilliard

Episode Date: September 8, 2022

Paris Marx is joined by Chris Gilliard to discuss the push to expand surveillance technologies in schools during the pandemic and in response to school shootings, and why they're making life worse for students without addressing the problems they claim to solve.

Chris Gilliard is a Just Tech Fellow at the Social Science Research Council and a recurring columnist at Wired. Follow Chris on Twitter at @hypervisible.

Tech Won't Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon. The podcast is produced by Eric Wickham and part of the Harbinger Media Network.

Also mentioned in this episode:

Chris recently wrote about why school surveillance won't protect kids from shootings.
Chris and David Golumbia wrote about luxury surveillance for Real Life.
Pia Ceres wrote about how students' school devices are still tracking what they do on them.
Amazon is launching a new show called "Ring Nation" to make Ring surveillance videos seem less invasive.
Studies by the Center for Democracy and Technology have found negative effects from surveillance on student expression, including increasing their contact with police.
After nine members of Axon's AI ethics board resigned, plans for a taser drone in schools seem to still be inching forward.
Todd Feathers reported on how school monitoring tools could flag searches for sexual and reproductive health resources.
Pasco County in Florida deployed a predictive policing system targeting children.

Some books mentioned: David Noble's Progress Without People and Forces of Production, and Dan Greene's The Promise of Access: Technology, Inequality, and the Political Economy of Hope.

Support the show

Transcript
Starting point is 00:00:00 The failures of surveillance are always met with calls for more and deeper surveillance, more data, more cameras, improved AI. In an environment where people say we have to do something, it seems as if it's a solution. Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx. And this week, returning guest is Chris Gilliard. He was on the show in the past to talk about digital redlining and why you shouldn't be buying surveillance gadgets as gifts for people, especially as these technologies move into our homes and many other aspects of our life in a really worrying way. And certainly I think that only becomes more obvious as Amazon is now
Starting point is 00:00:58 pushing a Ring reality TV show to use these videos from people's surveillance cameras in order to do like an America's Funniest Home Videos, but much more creepy and surveillance-oriented, and trying to push this really terrible idea of what society should be and how these things should act. And so Chris is a Just Tech Fellow at the Social Science Research Council and a recurring columnist at WIRED. In this week's episode, we talk about how there is this new push to expand the reach of surveillance technologies and really punitive technologies in general within schools, and how there are two ways that this is really being pushed. And the first of those is that early on in the pandemic, in, you know, the early lockdown period when a lot of schools were closed and students were often learning online rather than in class, there was a big push to add new forms of surveillance software onto students' laptops and devices so that teachers and schools could monitor what they were doing on those devices. And now, you know, kids are long since back in classrooms, but many of those
Starting point is 00:02:12 systems have not gone away and in some senses have been even further entrenched and expanded in the time since. And then the other piece of this is, you know, of course, we all know about the devastating effect of school shootings in the United States. And again, they just keep happening. And we keep having these terrible stories in the news about what happens at these schools. And so because there is so little action on gun control, there are always attempts to solve this problem through other means. And one of those is, of course, through technology. And so especially after the most recent shooting in Uvalde, Texas, there is a new push to expand the scope of surveillance and really almost like prison,
Starting point is 00:02:59 in their degree of control technologies onto schools and onto students. And of course, this is part of a long line of technologies and various measures that have been taken to make the school more of a prison-like environment without actually dealing with the core problem, which of course is guns. And so I think that this is a really fascinating conversation, a really important conversation, especially in this moment as many kids are going back to school. And this is something that should really be top of mind, you know, as we consider what is happening with schools in this moment where things are going from here. So I was really happy to have Chris back on the program. And I think you're really going to like this conversation. As always, if you like this episode, leave a five star review on Apple
Starting point is 00:03:42 Podcasts or Spotify. You can also share it on social media or with any friends or colleagues who you think would learn from it. And if you want to support the work that goes into making the show every week, you can go to patreon.com slash techwontsaveus and become a supporter. Thanks for listening and enjoy this week's conversation. Chris, welcome back to Tech Won't Save Us. Oh, thank you very much for having me. It's always an absolute pleasure.
Starting point is 00:04:04 Now, I always love having you on the show, of course. And, you know, I feel like the topic that we're discussing today is kind of in line with some of the things that we've talked about in the past, right? Our first conversation was on your work on digital redlining to a large degree. And part of that had to deal with, you know, the filtering that was happening at a college that you were teaching at when students were trying to access things that were online and do their research, right? And then more recently, at the end of last year, we talked about all of these surveillance devices that are being pushed on us increasingly, and in particular, in our homes, and really what that means for how we understand the home and what that actually means for our communities more broadly and things like
Starting point is 00:04:43 that, right? And recently, you wrote a column for Wired looking at the push to increase the surveillance in schools. Right. And what is driving that? And I think that this is a really important topic, especially in this moment where there is such a big discussion about schools when schools are so impacted by the aftermath of the pandemic. We're still talking about the impact of school shootings.
Starting point is 00:05:03 And really, this is ongoing for several decades now, that conversation. It's certainly not a new thing. And so I guess I wanted to start with a broader question, and then we can dig into many of the specifics that you've talked about, both in your piece and more broadly on Twitter. Why is this topic of school surveillance, if it's not an obvious question or an obvious answer, why are you so concerned about this topic? Why is this something that we should be paying more attention to and really wanting to push back on? Well, I think it's not obvious. It's a pretty layered question, and hopefully I could provide a layered answer. I think one of the things I try to get people to think about when we think about this is the degree to just how important it is. I mean, it's a pillar of
Starting point is 00:05:46 a functioning society, whether or not you can protect the most vulnerable, the youngest, you know, children. We're in a lot of ways failing at that. I think there are a lot of ways that people are hoping to address it. And one of the things that happens, I think, is that barring an answer or barring the political will to do things that we know might work, that other societies, other countries have done, people are grasping at anything they think will result in safer schools and safer children. And so what we see is a lot of kind of tech solutionism applied to these issues because companies, I mean, I'll be fair to companies in a way I'm usually not when I'll say that there are a lot of people looking for solutions. There are also some people who are just in it for the cash grab, and they often promise things about their technology and its ability to ferret out
Starting point is 00:06:41 danger or possible school shooters or to predict certain kinds of behavior and things like that. They make all kinds of wild promises about what their technology can and can't do. And institutions, you know, schools are desperate for solutions. Because, again, it applies to an essential element of society, which is keeping children safe, people will do almost anything to do that. And they'll climb on to anything that seems like a solution in order to move forward. It makes perfect sense, right? Like, obviously, people are seeing, you know, in particular, when we're talking about the mass shootings at schools, people are seeing these children die. And at least for some people, there's, you know, a really kind of inherent reaction that we should want to keep these children safe, that we should not want to have this happen in schools. now? Because obviously the most recent shooting in Texas was not something that was entirely new.
Starting point is 00:07:45 There's a long history of this going back to Columbine, and I'm sure even before in the United States. So how has that conversation about dealing with this in schools actually evolved over that time period? Has it always been this kind of response? Well, that's an interesting question, and I'm not sure I know fully the answer to this. I do know that in many ways, as someone who does a lot of research looking at surveillance and the history of surveillance, that the way this has evolved is the way that a lot of technologies in the last 20, 30, 40 years or solutions, in quotation marks, have evolved, which is the notion that more surveillance is somehow going to increase the level of safety. As we've had more technologies available at somewhat cheaper prices and things like that, more cameras, more things that people are calling artificial intelligence, more machine learning, more systems that can be installed on young people's computers. As these things have become
Starting point is 00:08:46 more available, they've been pitched more and more as some kind of solution to this problem. But again, to go back to that history of surveillance, there's a very longstanding assertion by advocates of surveillance that more surveillance equals safety. That has not borne out. I mean, there's a lot of examples we can use, whether we want to look at the prevalence of CCTV in the UK or body cams, police body cams in the US or the level of surveillance in the United States, after all. It doesn't equal more safety, but people often feel like it does. And I think that gets to part of your question, which is why these systems have become so
Starting point is 00:09:34 prevalent in schools in the United States. And not only in the United States, but that's where most of my focus typically is. Absolutely. No, I think it's an important point. And I think you're completely spot on. Like when I was reading your piece, I just kind of thought about some of the things that have been added to schools or that I've read about, you know, being added to schools in the past decade or two in response to this desire to want to try to stop school shootings or find a way to stop them, that, you know, have they really been successful? I don't think so. Things like ID cards or adding on-site security people, whether that's police or someone who's just hired to be security at
Starting point is 00:10:17 the school, probably someone who has a gun as well, because of course, one of the slogans of the NRA is that you need a good guy with a gun, right? The metal detectors in schools in order to try to find someone bringing a gun into the school, clear plastic backpacks that some of the students have to wear so you can see the contents of the backpack, security cameras, of course, as you say, with CCTV being rolled out. And I'm sure there are many other examples. And, you know, I don't know if you have any thoughts on that or anything that you'd want to add, but it does seem to be like there's this long history of trying to add these different layers of security or policing or surveillance to the school, but really that doesn't address the inherent, the fundamental problem. Yeah, absolutely. We saw and we continue
Starting point is 00:11:01 to see that after some of the more recent shootings, there was an increase in talk about hardening schools using military language in terms of adding, I guess, the way to think about it or the way I think about it. I don't want to use this term, but I need to talk about it in the ways that people are talking about in order to kind of tease it out, right? Making it a more difficult target when they talk about hardening. And the ways that a lot of these people mean essentially means turning schools into prisons or military installations. And, you know, again, that gets to the thing that I mentioned early on, which is, how shall I say this? It's a very sad indicator for where we are as a society that the proposed solution to making schools safer is to essentially turn them into prisons or into military installations. It also makes you incredibly sad, right? Like,
Starting point is 00:12:01 just to think about, you know, the school is supposed to be, at least if we imagine it to be, you know, this place that is welcoming of students where they can, you know, learn to explore different ideas, learn to be creative, like, you know, explore their different passions and figure out what they're interested in in the world and learn about the world, right? And then to think about that kind of a space that should be playful and creative and fun and engaging and enriching. And then to think it increasingly being turned into, as you say, like a kind of prison atmosphere where students are constantly being surveilled, where they need to feel worried about what they share, even on their devices or whatnot. And someone at the school might see that and see it as an example of them being a potential threat to the school. It just seems completely counterintuitive to what we want a school to be. And then I also wonder, when we think about how this affects different
Starting point is 00:12:56 people, is this something that is just happening to people in public schools, but then wealthy people who go to private schools and things wouldn't have to deal with something like this? Yeah. I mean, so Center for Democracy and Technology just did a really interesting study about school surveillance systems, and especially some of the software that's put on kids' computers and things like that. And one of the things they noted is that I think something like 50% of the students who they talked to said that they did not feel free to express themselves freely and openly when they knew that they were being watched. This number increased when we're talking about queer and trans students or people who are in some ways exploring their gender or their sexual identity. And so, the sort of bedrock of how we think about schools and, you know, what the most kind of fertile ground is in order for people to learn. One of the first things that you have to take care of is people feeling safe.
Starting point is 00:13:58 And by hardening these schools and by increasing the level of surveillance, I think it doesn't do one thing and it does another thing that people are trying to not do, I think, in that there's no evidence that these things actually make schools safer. The independent studies that I've seen suggest that they do not. In fact, it makes certain segments of the population of schools less safe empirically. But in addition to not doing that, not doing the thing it's supposed to do, by adding this constant surveillance, you're making people feel less safe, not more safe. And so it not only doesn't do the thing, the most important thing, and the thing that boosters of these technologies and these practices and behaviors propose that it does. It not only does that, but in some ways it makes the
Starting point is 00:14:52 situation worse because people do not feel, young people do not feel free to express themselves and do not feel safe at a place where that's one of the most important things that you can have if you want to have learning and growth. Yeah, it's so concerning, right? Because it's the complete opposite of what you actually want, how you want the students to feel and what you want the school to be. Obviously, there are renewed proposals for the expansion of this kind of surveillance in schools, and particularly after the recent shooting in Texas. Can you walk us through what the vision for this kind of hyper-surveilled school that is supposed to be super safe as a
Starting point is 00:15:31 result looks like? What other measures do these people want to see added to schools and that they claim are going to make these schools safe? Well, if I could paint a picture, it's going to be very extreme, or it's going to sound very extreme to someone who doesn't follow along with this tech and these narratives. But essentially, it would look like a prison or a military installation. There might be guards out front. There would be metal detectors. There'd be cameras everywhere. But not only to get in, but there'd be cameras inside everywhere. There might
Starting point is 00:16:06 be microphones that detect sounds. These cameras would maybe feed into some fusion center. They'd be equipped with facial recognition. There would be some kind of what people call artificial intelligence studying people's movements, their biometrics, their gait, in order to predict some kind of threat. There might be some additional layer of some kind of drone, or it could be a robot dog that would seek to intervene in the case of a shooting. But also, there'd be software. There would be, you know, as I mentioned, these systems would probably be equipped with some kind of threat detection. But it's also software and systems that go on students' computers that look at their emails and their messages and their social media and their social media, and the little notes they type in Google Docs,
Starting point is 00:17:08 and, and, and, and their browsing habits. And so when I paint this picture, it might seem as if I'm being hyperbolic, but many of these things already exist and are widely deployed at schools. And many of these things have been proposed, incredibly proposed by companies who sell this stuff. And what I mean by credibly is that these companies often have the pull, the influence to make these claims into reality. And so an example I would give is that the CEO of Axon, briefly, I might add, okay, so this was pulled off the table after wide backlash for now. But the CEO of Axon proposed the idea of equipping schools with drones that were armed with tasers. I should back up. If people are not familiar with Axon,
Starting point is 00:18:00 they're most known for being the company that makes a lot of police body cams and produces the Taser. a potential school shooter or a school shooter that could be released from their housing, fly to the site of the threat and distract and disable the potential shooter. He even constructed, composed a graphic novel with this future that he envisioned. There was wide backlash, including much of the civilian board of Axon resigning because he had proposed it to them, and many of them told him it was a bad idea, and he moved forward with his plans anyway. But when this became public, there was widespread backlash, and he eventually kind of pulled that off the table for now, right? I don't know that that is permanent. But one of the things he said, I think, and I'm paraphrasing, one of the things that the Axon CEO said, and I think it's really important, so this is not a direct quote,
Starting point is 00:19:17 but he said something along the lines of, I felt like we needed to do something. And I think this is really important because I'm going to take him at his word for a moment in that he's looking at this situation, which we all agree each time it happens is an atrocity. So he's taking a look at the situation and feeling like he needs to do something. And part of the other thing that he said is that the gun issue was kind of a non-starter. And so, I mean, bluntly, part of the reason it happens here in ways that it doesn't happen, and here I mean the US, part of the reason it happens here in ways that it doesn't happen as often or as in the most gruesome ways is the ready availability of guns. And so even the CEO of Axon sees the gun issue as something kind of intractable in ways that he'll reach to what many people see as a dystopian fantasy
Starting point is 00:20:19 in order to try to alleviate a thing, which everyone agrees is an atrocity and needs to be stopped. It's a very gruesome picture overall to think about sending our children to these places in order to learn, to send them to these hardened institutions where they're constantly surveilled. But then they're also constantly surveilled. One thing I left out, they'll also be surveilled depending on who they are and whether or not they can afford their own device. They'll also be constantly surveilled at home.
Starting point is 00:20:52 And this might even extend to other family members who may use that device because they don't have their own. And this, yeah, I'll leave it at there for now. We could get into some more stuff, but yeah, I'll leave it at that for now. of gun violence in schools, it's inherently a political issue, right? It's an issue that requires a political solution because as you say, it's happening as much as the NRA and people like that would like to say differently because there are so many guns and because guns are so easy to access in the United States, right? That's really the driving force behind all of this gun violence. And so you need a political solution in order to deal with that problem.
Starting point is 00:21:47 But because there's not the political willingness to do that, these other solutions, techno solutions in particular, especially more recently, step into that void and say the politics isn't going to work or you don't need the political solution, maybe if you're someone on the right wing. And instead, we're going to solve this by saying, all we need to do is put these new technologies into the school and that will make everything safe and that will solve the problem. Yeah, absolutely. I mean, it's right there at the name of the podcast, right? The tech will not save us in this instance, in most others for that matter. And again, I think there's wide agreement on this.
Starting point is 00:22:25 If there's a thing that would make children safer, most people agree across a very wide spectrum that it's a thing that we should do. But there is no evidence that these things actually do make children safer. And that is the sort of dichotomy about this situation, is that people are clamoring for solutions. Tech companies are offering solutions. There's no proof that they actually work as solutions. And again, they make people less safe, particularly some of the most marginalized and vulnerable populations within a set of students. It seems like in that case, like those solutions then, whether it's adding an armed guard or whether it's putting in metal detectors or whether it's adopting these security cameras
Starting point is 00:23:13 or these AI systems or what have you, not only, you know, offer a tech solution or some other policing solution in place of a political solution, but also give politicians and other groups kind of the cover to say, look, we don't need to change the gun laws or the Second Amendment or what have you, because, you know, we have these other ways to solve the problem. And every time one of these, you know, supposed solutions is shown to not work. There's just a new one waiting in the wings to be implemented instead to further this kind of dystopian trajectory of schools so that you never have to actually look to the political problem and the root of the problem. Yeah, absolutely. I mean, I've made this observation many times in that a thing that
Starting point is 00:24:00 you can consistently notice about surveillance is that each time it fails, what's offered is that more deep and detailed surveillance is needed, right? So it's never that the tech can't do a particular thing, that it is inherently not going to do that thing. It's that we need more of it. The failures of surveillance are always met with calls for more and deeper surveillance, more data, more cameras, improved AI. In an environment where people say we have to do something, it seems as if it's a solution. What I've run into, what I even ran into when I was writing that Wired piece is the excuse or the refrain of it's better than nothing. And I actually don't know that that's true. I actually don't think that it is true.
Starting point is 00:24:52 It's not better than nothing because it appears or it offers the illusion of a solution. But again, it does a thing that makes certain students less safe. I mean, one of the things I referenced at Center for Democracy and Technology study, one of the things that they found in their research is that I think 70% of the time that these systems were used for discipline, not for safety. And so this is a thing, again, that we see kind of when we think about surveillance creep in these systems, right? That the claim is that they're there to make students in the institution more safe. But one of the things they're actually used for is to perform like their more carceral function, which is like cracking down on students.
Starting point is 00:25:40 Yeah, I think there's two really important points there right first of all that especially in a moment like after a mass shooting i think it's difficult to be the person who says no we shouldn't have more surveillance in the schools because kind of the accepted wisdom is that this is going to make things safer even though the evidence often shows otherwise and then these systems are presented in a way such that they are supposed to be targeted at the shooters, right? The people the shooters are rare, it's hard to predict them, but are actually turned against the students themselves. And so they are subject to this more carceral, more punitive system. And the school becomes a place that's not about learning and being yourself and discovering things, but where you feel less safe and surveilled and that you can't be yourself. Yeah, absolutely. And it's only ever going to be this way, right?
Starting point is 00:26:46 I mean, and I know that there's probably an air of kind of tech determinism, or it may sound that way when I say this, but I frame it a different way. I've said in the past that kind of surveillance always finds its level. And what I mean by that is that the nature of surveillance, right, not the technology, but the nature of the practice of surveillance is that it is an attempt to exert control. And so we get control through discipline or the claim, they are instances of a particular set of ideologies about control. They are going to be used in ways that attempt to control students. And what that looks like often is discipline. Again, that is often, you know, when I say discipline, people might think that what I'm talking about are things that lead to violence in schools.
Starting point is 00:27:51 But it might be something like dress code. It might be something like eating in the hallway. It might be something, you know, it might be talking loud. You know, all these things that are an attempt to control students that often have very little to do with their actual safety when they're at school. It's such an important point. I want to pivot a little bit to something else that you mentioned, because we've been talking, I think, about the physical infrastructure of the school to a large degree, right? And maybe software that's implemented within that physical infrastructure. But you also talked about the real surveillance of the students themselves in the devices that they receive, right? And I think that this became something that was particularly noted by people or really came to people's attention in a significant way, especially in the first kind of stage
Starting point is 00:28:40 of the pandemic when, you know, there were lockdowns to some degree, at least, and many students were not in school. And so they were given devices to be able to continue their school work and their learning at home. But those devices had, you know, software installed on them that allowed the school or the teacher to be able to see, you know, everything that the student was doing on the computer. So can you talk to us a little bit about that and what it actually looks like and what it means for the students to have these devices that can see everything they're doing? Yeah. I mean, there's a wide range of technologies that fall under that category. And I should note too, that we've also seen this, that the last
Starting point is 00:29:20 couple of years has also shown us an uptick in the use of that for people in the workplace as well. And I don't think this is an accident. I mean, they're parallel for a lot of reasons. But these systems, they possess kind of a wide range of capabilities. Seeing everything that a student is looking at if they're browsing, the ability to turn on a student's camera on or off, the ability to shut down a browser if a student is looking at things that the instructor doesn't think they should be looking at, the ability to monitor the student's social media and messages and who they're communicating with. And along with that, often these systems are either tied to either third parties who are poorly trained, individuals who are monitoring children,
Starting point is 00:30:15 and are not in any way trained or experts at this, but also machine learning systems that attempt to look at the ways that teens talk and communicate and figure out whether or not those patterns or those communications pose a particular danger. And I think audiences who listen to your podcast will be well aware of some of the difficulties in machine learning, trying to predict things. I think there's an added layer of difficulty. I would say a degree of impossibility when we're talking about young people who are constantly shifting the ways that they speak and talk and communicate partially because they want to do those things outside of the watchful eye of adults. But also, I mean, frankly, outside of the watchful eye of machine learning systems and platforms. And so, there poses a great potential for false positives. But also, we got to think a little bit
Starting point is 00:31:22 about what it means to be young, what kind of experiences you're going through, what kind of growth you're going through, what kind of experimenting you might be doing, and think about how that doesn't match up with how certain parties do or do not think about safety. And so, I'll be very explicit when I say this. Young people are in the process of figuring out their sexuality, their gender. They're often looking for information on sexual health. They often might confide in people about those things. They might confide in their peers about those things.
Starting point is 00:32:00 There are large pockets of populations who don't want students to have access to this information and think it is being able to get information about those things, being able to experiment, to wonder, to confide. And these systems potentially rob students of all of those things. You know, Todd Feathers did an interesting piece in the markup about a month ago. And he asked, he reached out to some of these companies and asked them if they, in their sort of a heat list or list of blacklisted terms and things like that, if they included things about, say, sexual health. Most of the companies denied it. I don't mention the company's names because I'm not absolutely convinced that they're telling the truth. And as we've seen with the overturn of Roe,
Starting point is 00:33:06 that what a particular company is doing at this moment doesn't necessarily speak to what they will do when the laws change. And so I think it's very likely that one of these systems will do is start looking at those terms too and notifying parents, notifying teachers, notifying police when students exchange certain terms. I think in a lot of cases that can be very dangerous. international listeners who might not be more familiar when you're referring to Roe, that's the overturning of abortion rights at the federal level in the United States, which likely means and has already, you know, started to mean that access to abortion, that the right to abortion has been rolled back, has been criminalized in some U.S. states. And so that could very clearly impact some of these students who might be looking for abortion information on these devices that would be surveilled in this way with these softwares. And you also mentioned, I believe it was in your piece, maybe it was a different piece that I read on Wired, that there was a
Starting point is 00:34:16 student who was also outed to their parents as a result of these softwares. And, you know, in some cases that might be perfectly fine for the parents might be cool with it, but in other cases that could be a real problem. Yeah. I mean, this is, it's not an uncommon occurrence. This is another thing that came up repeatedly in that study I referenced from Center for Democracy and Technology, that there is often, I don't have the stat in front of me, but these systems, these surveillance systems that are placed on student devices have a very high potential to out students, right? There is a very high potential for involuntary disclosure of people's gender and sexuality in ways that
Starting point is 00:35:00 students did not consent to. One of the other things there that stood out was that in many cases, these notifications are not even sent to the parents, but are actually sent directly to the police and reported to the police. And that can obviously have some very concerning, very harmful interactions once the police get involved. Yeah, absolutely. And we know there have been multiple, multiple studies that talk about the ways that there's a higher danger to certain populations when police are involved or school resource officers. By a large margin, Black girls are kind of over-policed in schools, but that also applies in Black girls most severely and most often. This also applies to Black boys, to gay and queer and trans teens, right, and young people.
Starting point is 00:35:57 And so, again, it's really important to note that not only do these systems not do the things that they claim to do, but they also pose a significant risk and create significantly more risk for students who are already marginalized in some sense. The other thing I would add, and this is getting away from students for a second, but I don't want to understate the connection between this and kind of employee monitoring. I bring up this example quite often, but one of the biggest companies that sells this kind of technology as part of their promotional materials, what they bragged is that they could anticipate, predict, and prevent organizing within the population of teachers. Now, they later deleted this, right? And so there's a reason I don't say which company it is.
Starting point is 00:36:59 But I can put it in the show notes, right? It's on record. I can certainly add in the show notes, right? It's on record. I can certainly add that link, yeah. Right? They openly articulated that one of the things they could be used for is to prevent union activity amongst teachers. And so, when I talk about surveillance systems, I try to always point out that while it's true, these systems are often going to be used to control and surveil particular populations, you know, and that the harms are going to fall the earliest and the most often on the
Starting point is 00:37:40 most marginalized. This is true. But the other thing that's true that I always encourage people to think about is the ways in which these systems will be eventually turned on everyone. And so, if we think about the ways that it's going to harm students, and I mean, they are currently harming students. So, we can think about that. But I also encourage instructors and professors and administrators to think about the ways that these systems will also be turned on them as well. Yeah, it goes back to your writing with David Columbia on luxury surveillance, right? And who
Starting point is 00:38:19 is actually subject to this surveillance and who thinks they're not until it actually turns on them as well, right? Which it very often does. And I'm happy you brought up the point about the teachers as well. Like, shocking to hear that a company would, I don't know, maybe it's actually not so shocking that they would come out and actively say they want to stop you in organizing and will offer you a tool to do that. But like, especially when you think about the context of teachers in the United States, where, you know, I would say in many cases, they're already treated quite poorly, especially in a lot of public school systems, very underpaid, in many cases, not unionized, have a lot of increasingly kind of draconian restrictions placed on what they can say to students or talk to students about without possibly seeking or being subject to some kind of punishment or even prosecution. I don't know. I've been seeing some reporting lately that in some states they're having trouble even finding teachers now. And it just seems not surprising seeing how things are advancing. Yeah. And I think it's always relevant and important to think about surveillance systems as an attempt to exert control, you know. And I think, I don't think, you know, as a parent, right, but like, just in general, I think it's wrong-headed to look at young people and think what your goal is, is that you should control them. I mean, not even speaking as a parent, but just kind of like having been a young person at one point myself, I think that the idea that you're
Starting point is 00:39:52 going to control them or that you should control them as a means of sort of keeping them safe or making sure they develop well and properly, I think is the wrong way to go about it. But there's all sorts of things that come with that too. So that because they're systems of control, that it's not going to just stop with students, that these mechanisms are also going to be an attempt to crack down on the workforce and control them in ways, even outside of sort of like things like curriculum, right? But control their ability to have an influence on their workplace and things like that. So yeah, it's pretty dismal. Yeah, absolutely. It also brings to mind, like, you know, I remember reading what David Noble
Starting point is 00:40:38 would write about technology and the development of technology and how it was so focused on like the development of technology was kind of shaped by both the need to make profits for companies, but also the need to increase control in particular of the workforce, right? And so when you can see the ways that these technologies are moving, like it's almost natural that that is ultimately how these things progress, right? Because that is kind of built into the incentives behind the development of this tech. Yeah, yeah, absolutely. I think this extends to a thing we haven't talked about, which is remote proctoring systems. That these are also, I think, whenever we talk about school surveillance, that these are methods and technologies that need to be included in that discussion, because they do that thing too.
Starting point is 00:41:25 They're also an attempt to exert control. And as I try to note whenever I talk about them, that they're not, the end goal is not just that they'll only be trained on students, right? And by trained on, what I mean is focused on. The end goal is not just that they'll only focus on students, but that they will result in being focused on instructors, teachers, professors as well in an attempt to control them. Again, whether that's, I think people may be familiar with the viral example from sometime in the pandemic where a school put out a contract that said you had to prove you were not taking care of your child while you were also teaching. Now, I'm forgetting some of the specifics, but essentially the school was saying that you couldn't both be teaching online and have
Starting point is 00:42:16 your child in the room. And I mean, so we've seen all kinds of instances of this, and I think people tend to think of them as isolated incidences rather than the canary in the coal mine, so to speak. That I think when we see these things, we need to recognize that, again, they may start on a particular population who is more vulnerable, who has less ability to object, but they are eventually coming for almost everyone. And yeah, I so wish that more people would recognize that. Hearing you describe that also kind of forces me to think back to like the promises that were made about these technologies back in the day, right? Like they were going to be personal liberation and we're going to be about freeing the individual
Starting point is 00:43:04 and all this kind of stuff. And, you know, it was all going to be fantastic. and we're going to be about freeing the individual and all this kind of stuff. And, you know, it was all going to be fantastic. We were going to have this great utopia offered by these digital technologies and the Internet. And then just to increasingly actually see how these things are implemented in our lives and the great divide between that future that was sold to us in order to kind of get us to buy into this privatized vision of the internet and digital technology, and then to see how our society is increasingly shaped by this surveillance, by these monopolistic forces that have taken advantage of it, it leaves me quite angry and frustrated.
Starting point is 00:43:39 Yeah, I mean, one of the CEOs of one of the proctoring companies said, when he was quoted in a piece, he said, we're the cops, we're the police. And so this is like a very perfect distillation of how they think about their product and the carceral nature of it. And so it's disappointing to me as well. But I mean, as someone, you know, I know that you've studied the history of this stuff very well. And so, right, I mean, we saw this coming. It's just that it's been very hard to sort of be the person, you know,
Starting point is 00:44:19 or part of the group of people who say to people, hey, this is what's going to happen. Hey, this is what's going to happen. Hey, this is what's going to happen. And then it happens. And I don't want to be reduced to, and I told you so, but yeah, I mean, much of this is exactly playing out in the ways that people had predicted and have been predicting for 50 years. Yeah, no, absolutely. And it brings to mind like Dan Green's work as well. And, you know, he was on the show last year talking about his book and just how at the moment that the internet is rolling out in the United States, there's also this gutting of the welfare programs and this expansion of the carceral state alongside of it. Right. And, you know, I guess you can just
Starting point is 00:45:00 kind of see that built into what we're talking about now and these kind of trajectories and things just continuing, right? And we still see it today. I think there's a final point on what we're talking about with these technologies that I want to mention before we kind of start to close off our conversation. And once again, it's really the classism that is built into this, right? As you know, we've already kind of talked about, but just to really cement the point for the listeners, there are certain students who are very reliant on these devices, and even their families are reliant on these devices that the schools are giving them, and everything they do is surveilled. And even if they plug their phones into the computers, what happens on their phones can
Starting point is 00:45:40 also be kind of picked up by these systems and seen by the schools. And then on the other hand, you have wealthier students. We've already said how private schools might not be subject to this to the same degree. But then, you know, if you're more well off, you're likely not using these school provided devices once you leave the school. And so you're also not being subject to this surveillance in the same way. And so it very much has a differential impact. Some people are being impacted much more than others because of their class position. Yeah, absolutely. There was a story that ran in The Guardian about three years ago, and it was about a group of parents in, I think, the Virginia DC area who decided that they were
Starting point is 00:46:21 going to petition the school and thus tech companies to, at the end of the year, delete all the data they had on children. And it was DC, Virginia area. So you can imagine lots of federal employees, probably people from federal law enforcement, judges, politicians, things like that. And they were able to go to the school and say, do this, and go to the tech companies and say, do this. And the school and the tech companies did that. They deleted the information that they had on these children. And what you're just referring to, that these students are having massive amounts of their data extracted, used for who knows what, maintained in some cloud in perpetuity. And so there is such a chasm between those two groups of people and the ways that these products and processes are used and deployed against them.
Starting point is 00:47:27 I appreciate you outlining so many different aspects of this for the listeners and to really show how harmful it is to have these systems continue to roll out. As you said, these are not just things that are ideas, but are being installed in many schools in the United States. And I'm sure other parts of the world are starting to grab onto them as well, right? The United States will be the testing place and then others will follow suit. What do you think might be the next stage of this? Do you see this further entrenching? And what tech might they try to put into schools next, do you think? So, unfortunately, I see in some ways that it is going to get worse before it gets better.
Starting point is 00:48:13 And I would love to be wrong about this. I hope I'm wrong about this. One of the things we'll see is that as more and more people return to school in person or as we move to our next stage of dealing with the pandemic. And unfortunately, often that means ignoring it. But our next stage of dealing with it is that a lot of the tech that was ramped up and purchased during the pandemic is now going to be used for its more surveillance purposes. And when I say that, I don't mean disease surveillance. And so, an example is that for a time when people thought that it was essential to take people's temperature before they entered a building. And there are lots of
Starting point is 00:48:59 companies that were selling these kiosks and things like that to schools. So they consistently try to upsell the schools with not only temperature detection, but facial recognition. And so now that these devices are in schools, they can say, oh, well, you know, you don't have to put this thing in a closet, right? You spent all this money on it. and so now you can continue to use it because it has facial recognition. So we'll see a lot of that. We'll see that any pandemic technology or technology that was particularly deployed in an effort to stem the pandemic, we'll see that used for kind of its other surveillance purposes. I think as long as there is this persistent myth about the abilities of artificial intelligence and machine learning to predict things, as long as people are able to still perpetuate
Starting point is 00:49:58 that myth, we'll see these systems move further and further down the line of prediction technologies that claim they can identify a potential threat before that threat manifests. I think we'll see a lot more of that. The other thing is I think we'll see a lot more of petitioning and approaching social media companies for the messages and the communications of students, whether that be geofence warrants and things like that, or as we've seen really recently, the requesting or warrants to Facebook about messages that people have exchanged. I'm unfortunately often in the position of telling people why something's bad or how it's going to get worse. And again, I'd love to be wrong, but I think that we're going to see more of that in the near future.
Starting point is 00:50:52 Yeah, sadly, I think you're right. And I remember talk of how these technologies that were adopted to deal with COVID could then be later turned in a way that, you know, was not their original purpose, but allows them to be entrenched and continue to be used because this investment was already made. And, you know, unfortunately, this is one example of that. And I guess to close our conversation, maybe on a more hopeful note, if there is a more hopeful note to be found, do you see any kind of positive steps toward opposing these technologies? Or do you see good ways to try to stop this development? Or is it really until there's the willingness to have a political solution, we're constantly
Starting point is 00:51:35 going to be served with these techno fantasies of surveillance that are just going to keep making things worse and worse and worse? Well, I don't think any of this is inevitable. And one of the things I think is really important. So I was approached by a group of students who are working on this stuff. And I think young folks in direct opposition to prevailing myths about whether or not they care about privacy and how they use technology and stuff like that. Young folks are keenly invested and aware of how these systems work, how they don't do the things that they're supposed to do, how they make them less safe. And many of them are advocating and
Starting point is 00:52:19 organizing against them. I think that's really important because it speaks to the ways in which they're resisting and speaking out against things that are supposedly done in their name. And so, that is a place where I have tremendous hope, right? Is the ways in which young people are saying, no, no, this isn't for me and it doesn't work for me and it doesn't work anyway. I mean, it's unfortunate, right? I mean, I hate to pin hopes on young folks because that sort of inverts some of the... I mean, I want them... They have agency, which is a thing I'm glad they have, right? But it's unfortunate to pin hopes on them to create a better and more workable society, right? that should not be on them. But unfortunately, in some ways, it currently is.
Starting point is 00:53:08 And that is a place where I've seen a lot of action. And the other thing, one other thing I want to go back on, like, is the thing we didn't mention, I'm sure you've seen, and I wanted to make sure we got to, is like the ways in which these systems, like the Pasco County example, for instance, where they did their own version of predictive policing, but they were targeting children and looking at things like attendance and grades and whether or not there had been violence in the homes and using that to predict, in their words, which young folks were likely to become criminals. This is their language, not mine. And what they did with that information was to then harass those families in the hopes that they would move out of that area. The Tampa Times did a Pulitzer-winning series on this.
Starting point is 00:54:06 They got the Pasco County Sheriff's Department, got federal money, developed their own predictive policing algorithm, aimed it at students, and used it to harass students and their young folks and their families in the hopes of driving them out of the area. And I think it's important, again, when I mention these things, they're not one-offs, right? So, often we see kind of the worst episodes or the worst instances, and people are able to dismiss these things as one-off, whether it's remote proctoring system that can't see students, and so they're forced to shine bright lights in their faces. Students who fail the bar because of like wonky technology. Students who have some kind of forced disclosure about disability or gender. These are not one-offs, right? These are the ones that we hear about,
Starting point is 00:54:59 but the nature of these systems is that there are many, many, many more that we're not hearing about. And so when we hear about these, it's important not to think about them as isolated incidents, but as part of a pattern that as we have these systems more and more deployed against students will happen increasingly. I think it's a really important point for you to make. And even though there's that hopefulness to see the students pushing back against these things, it really shouldn't be on
Starting point is 00:55:30 them to have to, you know, lead this charge and lead this fight in order to push back against technologies and systems that are just inherently oppressive and harmful and that shouldn't exist in our society anyway. And so, Chris, I always love to have you on the show. I'm so happy that you came back on to discuss this with us. Thanks so much. Oh, it is, again, absolutely my pleasure. Thank you. Chris Gillyard is a Just Tech Fellow at the Social Science Research Council and a recurring columnist at Wired. You can follow him on Twitter at hypervisible. You can follow me at Paris Marks, and you can follow the on Twitter at hypervisible. You can follow me at
Starting point is 00:56:05 Paris Marks, and you can follow the show at Tech Won't Save Us. Tech Won't Save Us is produced by Eric Wickham and is part of the Harbinger Media Network. If you want to support the work that goes into making the show every week, you can go to patreon.com slash tech won't save us and become a supporter. Thanks for listening. Thank you.
