Tech Won't Save Us - Don’t Give Surveillance for Christmas w/ Chris Gilliard
Episode Date: November 25, 2021. Paris Marx is joined by Chris Gilliard to discuss the ethics of tech media recommending surveillance devices, aspects of “smart” technologies you might not have considered, and why we should think twice about surrounding ourselves with cameras and microphones. Chris Gilliard is a Visiting Research Fellow at the Harvard Kennedy School Shorenstein Center. Follow Chris on Twitter at @hypervisible. 🚨 T-shirts are now available! Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon. Find out more about Harbinger Media Network at harbingermedianetwork.com. Also mentioned in this episode: Chris wrote about the history of home surveillance, and the concept of luxury surveillance with David Golumbia. Caroline Haskins went in-depth on what Ring does to communities. Health apps and fitness trackers help rich people but don’t do much for poor people. The New York Times thinks the Theranos scandal soured the media on Silicon Valley. (We don’t buy it.) The US Infrastructure Bill will require monitoring systems for drunk driving in new vehicles as early as 2026. Amazon has worked to kill or undermine privacy legislation across 25 US states. In 2019, video circulated of a man talking to someone’s 8-year-old daughter after he hacked the Ring camera in her bedroom. In the US, wage theft matches all other property theft combined, but the media sensationalizes shoplifting while ignoring wage theft. Chris recommended people read Simone Browne’s Dark Matters: On the Surveillance of Blackness. The Wirecutter Union is asking people not to visit the website from November 25-29. Support the show
Transcript
Think about how much extraction's involved.
Don't give the gift of surveillance as a Christmas present.
Hello and welcome to Tech Won't Save Us. I'm your host, Paris Marx, and this week my guest is the fantastic Chris Gilliard. Chris is a visiting research fellow at the Harvard Kennedy
School Shorenstein Center. He's been profiled recently in the Washington Post and has had some
great pieces in Wired and Real Life that I'll include in the show notes that are relevant to today's conversation. You might recognize him most notably as @hypervisible on Twitter.
Last time Chris was on the program, we talked a lot about his concept of digital redlining,
what that means for cities as these technologies are increasingly kind of integrated into urban
life, but also what it means for education. In this conversation,
we're shifting it a little bit and instead focusing on the personal level and the community
level as these so-called smart technologies, probably better known as surveillance technologies,
proliferate through our lives, you know, with all of these Amazon home surveillance devices that
it's trying to sell, or with, you know, fitness trackers, or any number of other things.
And this is particularly a relevant conversation right now
because we are in this kind of hyper-consumerist moment
that happens leading up to Christmas.
And especially since this episode
will drop on US Thanksgiving.
And the following day, of course, on Friday is Black Friday, which is the
biggest or one of the biggest sales events of the year. That is really like the time to get your
Christmas shopping done ahead of December and the further consumerism that will happen through that
period. And I wanted to talk to Chris because, you know, every year around this time, what we see
is these kind of shopping lists, these listicles from tech
publications that recommend all of these gadgets that we should be buying, right? And among them
are the very things that even some of these same publications are warning us about throughout the
year. You know, how Ring has a ton of partnerships with police departments and how all of this
surveillance shifts how people think about their neighborhood, for example. And in the way that we talked about digital redlining last time Chris was
on the show, he's also written about another concept with David Golumbia, who you might
recognize from episode 67 back in July, where we talked about why Bitcoin is a right wing technology.
But they wrote about how some of us, you know, buy these surveillance devices, surround ourselves
with surveillance, because it's a convenience, whereas those same sorts of things can be
wielded or weaponized against other people in a very different way. And notably, while those
things might be convenient to wealthier and more well-off populations right now, that doesn't mean
that they will be forever. And so I guess, you know, the goal of this conversation is maybe to make
people think twice about what they buy at this time of year as gifts or for themselves at Black
Friday sales, and whether, you know, we should really be surrounding ourselves with all of this
quote unquote smart technology, or whether we should maybe look more toward, you know, the
analog alternatives, or whether we can think about these things in a different way.
So I was very happy to have Chris back on the show. I think you're really going to like this
conversation and, you know, think twice about what you buy at this time of year.
Before we get into this week's episode, I also wanted to note that workers at Wirecutter are
going on strike this weekend because management at the New York Times is refusing to go to the
table and reach a bargaining agreement with them. Wirecutter is, of course, a website that does a lot of product reviews and recommendations,
and they are asking people not to visit the website from November 25th to November 29th
in support of their strike. So just a heads up about that. Naturally, the New York Times,
which has a ton of money and is doing really well with their subscription program, should be treating these workers properly. Tech Won't Save Us is part of the
Harbinger Media Network, a group of left-wing podcasts that are made in Canada, and you can
find out more about the other shows in the network at harbingermedianetwork.com. If you like this
show, make sure to leave a five-star review on Apple Podcasts, and you can also share it on
social media or with any friends or colleagues who you think would learn from it. This episode of Tech Won't Save Us, like every episode, is free for everybody because listeners like you support the work that goes into making it every single week.
So if you listen to the show regularly, if you feel like you learn from these in-depth conversations that I have with experts every single week,
you can join supporters like Emily and Carrie Lynn by going to patreon.com and becoming a monthly contributor so I can dedicate the time that's necessary to put these shows together to do the research and to make them available for everybody regardless of whether they can pay.
Thanks so much and enjoy this week's conversation.
Chris, welcome back to Tech Won't Save Us.
Thank you so much. I'm really happy to be here. You know, I mean, I don't engage in Black Friday.
I mean, in the consumer-ridden Black Friday anyway.
I mean, I am Black on Fridays.
But yeah, it's a pleasure to be here.
Well, it's a pleasure to chat with you about it, and we'll get into it a little bit more.
You know, basically, what I was thinking, and my idea with this episode was every year,
you know, around this time of year, because this is like, you know, when really the consumerism
kicks into high gear for Christmas time, there are all these kind of lists that you see on
regular news sites, but obviously, you know, our interest is in technology.
So you see them in particular on the tech news sites that are promoting all of these
products that, you know, we can get as a deal now, or maybe that we should consider, like, this is the best version of the smart home gadget or whatever else that you should consider getting as a gift or getting while it's on sale. And it almost, like, shows the dependencies between the media and the companies that, like, manufacture all this hardware, and how, like, in this moment, you can really see how, when they're reviewing gadgets or when they're making these recommendations, any degree of, like, criticism or kind of critical thought goes out the window. And it's just, all of a sudden, like, here are all these great products that you should buy; don't think about, like, the potential negative consequences of them.
Yeah, I mean, I think
that's such a really interesting, um, and unfortunate disconnect. I mean, some of the journalistic outfits that do some of the most critical and important reporting on these things, whether that's the Washington Post or The Verge or Wired, also spend a lot of time
reviewing which is the best doorbell, which is the best voice assistant, you know, which is the best
gift of surveillance that you can give. And I mean, I think I understand the sort of economy of that.
And I think it was Jeff Fowler from the Washington Post who, a little over a year ago,
talked a bit about how, and I'm paraphrasing, right, but he talked a little bit about how he thought that the privacy and security concerns should be a part of reviews moving forward.
And unfortunately, I don't think a lot of places have adopted that, but I absolutely agree.
I mean, if they're going to talk about these things, right? So it's really unfortunate
to see on one day a blistering expose about all the ways that Alexa is violating people's privacy.
And then two days later to see, you know, Alexa promoted as the best voice assistant
to give for a gift. I wish that were addressed more and more consistently. But, you know,
hope springs eternal, I guess.
Yeah. Did you see this recent article in the New York Times where they basically said that
the Theranos scandal soured the media on Silicon Valley?
I did see that piece, yeah. So I don't know if you and I had an exchange about it,
or maybe it was Librarian Shipwreck on Twitter, but if we just look at a couple of recent examples,
both the metaverse, so to speak, and the billionaire space race, I think we can make
some fair claims that the tech lash and the souring of tech moguls is a little bit overstated.
So since Facebook changed their name to Meta and, you know, Microsoft came out and said they're building their own metaverse, there's been this enormous amount of coverage for a thing that, I mean, it kind of doesn't even exist.
But they have, and when I say they, I mean kind of tech titans, have the ability to almost speak something into existence in a way. And what I mean is not that it'll ever exist in some cases, but that
just by the fact that they've said it and they're billionaires or they run a trillion dollar company
or whatever, media often feels like they have to cover it as if it exists or that it's important
or even that it makes sense, right? So Bezos came out the other day and said,
well, one day, again,
I don't know what his exact analogy was,
but one day we'll visit Earth
like we visit the Grand Canyon
or something like that, right?
That we'll have colonized space
and that people will just visit Earth
as like a sort of touristy thing.
I don't believe that's ever going to happen,
and certainly no one who's alive right now will see that happen.
I mean, so it's absurd on so many levels, but it's treated as some kind of revelation because Bezos said it. And so, yeah, I think the reports of the tech lash are greatly exaggerated.
Yeah, I would absolutely agree with you, naturally. And, you know, Librarian Shipwreck is Zachary Loeb, who's been on the show before, if people want to go back and listen to that episode. Yeah, you know, I think it was definitely overstated. I think it's really tough to actually argue that the media or the tech media has, like, soured on Silicon Valley. Like, I think it just sounds ridiculous. And, like, I think this kind of Black Friday stuff really kind of puts it into perspective,
right?
Because as you say, on one hand, you can have these like really critical reports about bad
things that are happening at say, Facebook or Amazon or something like that, that really
shows like the consequences of these kinds of business models.
But then right next to that, you can have really uncritical repetitions of whatever kind of bullshit the companies are putting out.
And then now at this time of the year, alongside those types of articles, you get the listicles of
look at all these great deals you can buy, get your deal on Alexa, and all this sorts of stuff.
And so it shows that yes, there is this kind of critical coverage that exists, you know, no doubt
it's there. But, you know, there's still a lot of this really uncritical stuff and this stuff that
exists to kind of hype up and encourage readers to pay attention to and even to consume these very things that these companies are putting
out. And like the critical reporting that happens often doesn't then get kind of applied to those,
you know, shopping lists and things like that. Yeah, absolutely. I mean, so much of that stuff
is siloed. Yeah, I don't know how you can, again, on one day run an expose about Alexa or Ring Doorbell and then put it on your top 10 list of products to buy.
We need to get to a point, really, if anybody cares what I think, right?
We need to get to a point where the security and the privacy and, frankly, the societal implications of these technologies are not removed from the reviews. Because, obviously, like, whether or not a thing works is important.
But, like, what is its purpose?
And kind of, like, not only does it work for an individual consumer, but what is it doing to us is a question I think that needs to be asked.
I mean, I think for so long it has not been asked, and that's kind of explained some of the reasons
why we are where we are. But I'd love to see a stronger integration between that critical aspect
that's been so important and has revealed so much to us about these technologies and computational processes.
But I'd love to see where it's less divorced from the recommending of these things, particularly
at this time of the year.
Yeah, it's tough to get the affiliate revenue then, though, if you're saying, actually,
this thing I'm recommending that you're going to buy is probably not something that you
should buy.
It might have negative consequences.
Yeah, yeah. And I mean, you know, I think there are certainly some people who don't want that. But it's a little bit different with products that we've seen in the past, right? Like a toaster or refrigerator and things like that. Like, these don't necessarily, I mean, in some cases they do, but they don't necessarily have the same societal implications as, like, cameras facing outward and ingesting everything in their view.
Yeah, no, I completely agree. And, you know, obviously, since a lot of these listicles, these shopping lists, don't have this context, that's kind of what, you know, I wanted this podcast to be, and why I had you on. Because, you know, you always have
great opinions on all of these things. You know, every morning, I feel like you can jump on your
Twitter feed and like, you know, you get the updates for the tech news of the day that's
come out that morning, along with like these great, like snappy, critical perspectives on them.
And so, you know, obviously, I thought that would be great to kind of dig into
some of
these things that people would see on these shopping lists. And why maybe, you know, they
should think twice about buying them for themselves or other people. And so I feel like these kind of
smart gadgets have like proliferated over the past five or so years, you know, it kind of started
with the smart speaker. And, like, all of a sudden, like, you know, randomly, smart speakers, Alexas, Google Homes, were, like, popping up in people's homes, you know, without really thinking twice.
It's like, oh, this is just this convenient thing that I can say to turn on the music or set an alarm or whatever people use them for.
I don't know. And then it seemed like all of a sudden everything needed to have something smart
built into it. You needed to have a smart toaster and a smart fridge. And I saw the other day a
smart coffee cup and a smart toothbrush. And it's like, why does everything need to be smart?
Yeah. I mean, I'm on record as lamenting that the ground has been ceded in calling these things smart. I wish we had a snappier and more accurate term
to describe them, right?
Like a surveillance toaster or surveillance washing machine
doesn't kind of have that ring to it.
And I use that example because when I had to buy a washing machine,
this was about three years ago, actually,
there weren't any that weren't smart, right? There were no offerings that were not internet-connected. Like, obviously, I mean, every device and every company is now kind of a data company that also does X, you know, or a surveillance company that also does X. And I think, again, that's detrimental. But I think, even from a practical sense, it's one of the most detrimental things, because often they're insecure, or often the companies
that put them out don't run updates or customers don't run the updates that have been released.
And so, I mean, it's a very poor long range thinking for a lot of reasons, but again,
just from a practical standpoint of security to have these devices, dozens of devices in a home that do these things,
but ultimately really don't do their core function better.
So an old school buzzer or chime on my washing machine is just fine.
It doesn't need to text me and tell me that the load is done.
That has not improved my quality.
I mean, I didn't set it up, but it would not improve the quality of my life, I'm guessing.
I mean, I can't say with absolute accuracy, but I do not need my washing machine to text
me, nor my refrigerator, my toaster, my coffee cup, or any other thing that traditionally
hasn't done that thing.
Because, again, it doesn't perform its core function any better.
It doesn't wash clothes or dry clothes any more efficiently if it's connected to the internet.
And I would just say, you know, with all these devices, you know, potentially taking down the internet, don't tempt me with a good time, right? Maybe I want that. But, you know, I think what you're saying is totally true. Like, you know,
this is a bit of a different example, but I've been looking for an electric toothbrush recently.
And it was just shocking to me, like how many of them, and, you know, I've been reading these
like reviews and going through these articles of like the best toothbrushes. And it's just shocking,
like how many have Bluetooth? Like, why do, why does my toothbrush need Bluetooth? Why do I need to connect it to my phone?
I just want to like press the button and it like does its little buzz and I can like,
you know, brush my teeth. And then when I'm done, I turn it off, done. Like, I don't know why there
needs to be an app that tracks all of that. It just seems like completely unnecessary.
Yeah. Yeah. And I mean, we see it, I'm based in Detroit, and we see it with the automotive industry.
You know, I think it's really interesting the ways in which some of the stuff in the
infrastructure bills has come out about the ways in which future versions of cars are
going to have a lot more surveillance pointed at the driver, right?
Whether that's an attempt to curtail
impaired driving, as in, say, someone being under the influence of alcohol, but also to check
people's awareness and alertness as they drive, right? A lot more cars will be seeing this. And
in fact, I think it may be mandated at some point. The claims of some of the car companies,
as far as I can tell, is that most of this processing will be local, right? That it'll happen on the car, you know, that it'll stay with the car. I don't believe any of that, because if a car company has that information, or the car contains that information, pretty soon insurance companies are going to want it, law enforcement is going to want it, on and on and on. So, um, yeah, I mean, that whole, again, environment where every technology company, but now not only sort of explicit tech companies, right, but every company that makes a product, is like a surveillance company that also does X, you know, um, I think that's very dangerous, and not the road I think we should be going
down.
Yeah. Like, this is a bit of a, you know, side conversation, but I would just say, like, I'm so frustrated by, like, the degree to which a lot of this transportation technology is focused around the automobile, and, like, you know, ensuring that we keep driving our automobiles and just, like, upgrading them with technology. When I'm like, why can't we just ditch the automobile and, like, have great transit systems? And, like, then we just have our little transit pass, and, like, we're not being, like, you know, surveilled by all these cameras and sensors that are being added to the automobile. We don't have to worry about the insurers and, like, all this kind of stuff. I don't know. It's just frustrating to me as someone who looks a lot at transportation.
And I would also add to your point on security. I think we'll get into the actual products themselves in just a minute. But I would add to your point on security that I think it's also
like an environmental issue, right? I think that when you add more of these digital technologies,
more of this internet connectedness into so many of the products that we rely on every day. I think at the same time, you not only build in security vulnerabilities, but also like increase
the risk that this is something that's going to break down and can't be fixed as easily
in a shorter time period. Like, you know, I think there's a lot of people today who say,
you know, washing machines and refrigerators and stuff like that don't last nearly as long as they
used to, like in the past that their lifespans are much shorter at the same time as we've thrown all this technology into them.
And so I think when we're thinking about like what a sustainable society looks like,
is that really one where like everything we own is connected to the internet and has some kind
of connectivity? Or like, is that actually kind of going against the sustainability that we want
to practice?
Yeah, absolutely. I mean, I've mentioned this before, but my father was a washing machine and dishwasher repair person. And, you know, yeah, I mean, washing machines used to last for 30, 40 years. Refrigerators, freezers. They do not anymore. You know, I mean, again, part of that is because, you know, companies, well, I mean, I'm not going to do a great job of explaining, like, the economic model, but essentially, selling something once that lasts for 40 years, it's not sustainable on the part of the company. Like, it's better for society, but the company wants to do something a little bit different.
Yeah, that's why Apple wants you
to upgrade your smartphone every year, right?
Same sort of thing.
I get it.
I get it.
It's easy.
All right.
Well, you know, we've talked about,
I think some of the bigger picture
on these smart devices.
And I want to get into some specific, like, I guess, categories of them that you very, very frequently see on these lists, especially in the past few years.
And, you know, the first of those is these kind of home surveillance devices, I think most notably
the Ring camera. But, you know, there's been a lot more kind of built out from that other kind
of cameras and drones and things like that for the home. And so I wonder what you make of these kinds of devices, and what is some of the context that I think people should be having when they see these sorts of things on these lists,
or they consider putting them in their homes that they might not otherwise consider about what that
does to the people using them, but also kind of the surroundings, I guess.
Yeah. You know, I think there have been some noteworthy cases where
information's come out about the degree of information that these devices ingest,
or whether that's human beings actually listening to the recordings and things like that. But I'm not sure that the average person is keyed
into how much data these devices extract, how long it's kept, and what's done with it.
But I also think that there's a prevailing notion for a lot of people who buy these devices,
the nothing to hide belief system sort of still exists, right? Which is that if you're
not doing anything wrong, that who cares if Amazon's recording all your interactions? Who
cares if Google is watching you every time you come in and out of the house? You know, who cares
about those things? I mean, so that's patently untrue, right? Like, there's a lot of reasons
to care, even if you don't think you're doing anything wrong. You know, I think in the hands of different parties, again, whether that be a government that you don't necessarily agree with their policies, you know, your employer, your neighbor, in the case of automated license plate readers, like a nosy neighbor who's got a grudge against you. I think that there are ways that
these devices and the data they extract can and will be leveraged against people all along.
And so, again, it's often a maxim in surveillance and privacy studies that surveillance falls kind
of most and hardest, or like the pernicious and harmful effects
fall most and hardest on the most marginalized. And I absolutely believe that. I think that's true.
But the other thing is, I don't think that the harms stop there. I think it's important to
realize that, in some ways, a society that is covered in cameras and microphones is not good for anyone, even the most privileged. Um, I don't think we've gotten to a point where people recognize that. You know, again, my push would be to try and get people to understand that. But, I mean, there's just some really, you know, I don't necessarily like this word, but a better one's not coming to mind right now, just some really creepy and very invasive things that these devices do. So there's a story in Reuters just a few days ago about Amazon and the ways that they've recorded people in these really kind of intimate moments.
And I can't wrap my head around the idea that people think it's okay to have some giant conglomeration consistently recording their children in their most private and intimate moments sometimes, and working all kinds of computational
processes on these. And these are things that are going to persist over dozens and dozens of years
and decades. And again, doing who knows what, right, with this information, right? Whether
that's sort of trying to diagnose illness, diagnose disability, or assign disability.
Do all these things that are not necessarily going to be in the best interest of the customer
or the consumer, but are always going to work to enrich the company.
I don't think those are good things.
And I don't think we've fully recognized how that's going to play out.
And partially because I don't think the companies fully know how that's going to play out.
But that is part of the reason that they extract and hold on to all this data is so that they can continue to work with it in a variety of ways. Some they know and some they don't, right?
Because they're consistently innovating in that space.
Yeah, I think that's a really good point.
And I want to get to kind of the broader kind of what this means for the communities around
us kind of piece of that as well.
But I think when you bring up children and the recording of children, I think that is actually a really important point. You know, I don't have kids,
I don't know a whole lot about kids, I'll admit that. But I found out recently that a lot of
daycares have cameras that parents can log on to online throughout the day, and, like,
watch the children and the workers to like
surveil what's going on there. And I think it's also notable how, you know, naturally parents and
new parents are very stressed. They have a lot on their plate. They want convenience and things to
be done easier. And I feel like these companies can take advantage of that really easily. When
we think back to Amazon's earlier days, its desire to really get diapers.com so that it had that relationship to
parents so that it could get parents into Prime, into a subscription service so they would keep
using them. And then the idea being that they would keep using Amazon down the road as their
children grew up and they could take advantage of that relationship, but also how there are so many new technologies that have to do with like,
you know, baby monitors that are internet connected and that you can see from everywhere.
And like, there's this whole kind of technological system and like economy built around new parents,
because when you get people in at that stage, you know, you can hold on to them and keep having
things for them
as they go along. And so I think that's like a really important point to note there.
Yeah. And I mean, there's a whole quantified-baby thing. I mean, there's so much.
And, you know, to some degree, I think there's a history of saying, you know, adults can consent
to these things. And so, you know, how they use these devices and how the devices interact with them are okay.
But children, for the most part, are not able to consent in any reasonable way.
I think we've seen some of this pushback when Facebook, now Meta.
Facebook? I don't know of Facebook. What are you talking about?
You know, with them doing an Instagram for kids, we've seen some people recoil and some legislators
push back against that. We've seen a stronger push by certain parties to not post the life of
your kids on social media because they're not able to consent to that. And it's going to follow them in ways that they may not desire as they get older.
So in some regard, not ingesting kids into this system has been somewhat of a,
and a rule isn't the right word, right?
But it's been a guideline.
But particularly with voice assistants, like this has gone out the window.
We've seen, again, pushes by Amazon to make it seem cute to have Alexa read bedtime stories to your child.
Or to have these viral, look what happened when this kid asked Alexa, you know, is the tooth fairy real?
Or, you know, whatever.
Is the elf on the shelf really watching me?
You know, like, so there's been that push on the part of the companies.
And yeah, I mean, I do think there's a whole different set of rules or set of expectations when it comes to, and
set of dangers when it comes to surveilling kids.
I think, again, we've confused surveillance with safety.
And there's a strong belief system that cameras make people safer.
And again, that they make kids safer and adults.
I think in general, that is not true.
There are some cases. I mean,
certainly people will talk about cases where abuse has been discovered and things like that.
And I don't want to discount that. But I go back to my initial point, right? That I think the end
game that these companies would have us exist in is a space where every space, from our bedrooms to our bathrooms, to schools, to hospitals,
and everything in between is covered with microphones and cameras. I don't think that's
good for society. I think it mainly, it enriches the companies and certain individuals within those
companies, but not us as a whole. Yeah. You know, I naturally would completely agree with you on that point.
And, you know, what you're describing with, you know, kids using these technologies and
interacting with these technologies brings to mind that story from a few years ago. I think
it was an Amazon camera, but I might be wrong with that, where it was like hacked because the
security was so poor and you had like this person talking to someone's kids, like through the camera or through the
speaker or something like that.
Yeah.
And like, you know, so there's that kind of like super creepy stuff that can happen too.
And I think like Amazon's apparently improved its security, but like, what if you're using
a different camera or product that doesn't have that?
And I think what you were talking about there as well, where, you know, we're creating
the society that is surrounded by cameras, I think kind of gets to the bigger piece with
these, you know, all of these kinds of surveillance technologies that people are filling their
homes with.
And that kind of touches on some work that Caroline Haskins did when she was at Motherboard,
but also a recent Wired piece that you wrote, um, that kind of gets into what this means
on the
neighborhood level, right? What this means for, you know, the communities that people live in,
and also how people think about their communities. Because I think there was something that kind of
stood out to me in Caroline Haskins' work about how, you know, the Ring cameras and the neighborhood
app that is connected to it kind of builds on this kind of history of white people fleeing
to the neighborhoods, wanting to like protect their communities from black people largely,
and, you know, creating these kind of communities where they're trying to close them off to other
people. And I believe in the Haskins piece, she kind of talks about how people increasingly see
their homes like as a fortress when they're surrounded with these cameras, and that they're
trying to protect it, and that they're seeing all these things that are happening
in their communities that they never knew happened before, but have always been there,
but that they just didn't know it because there wasn't a camera there. And so it increases this
kind of fear and this kind of anxiety about what's happening around the home, and naturally leads to
these communities that just become like much more reactionary and fearful, and nothing has really changed. So can you talk a little bit about that kind of effect on the communities from having all of this surveillance technology put in there?

The first thing I'd say is that there's no real empirical evidence that suggests that having a Ring doorbell, for instance, makes you any safer.
And I, again, look to the, I mean, Caroline's done amazing work, as well as Alfred Ng and Dell Cameron.
And I think it was Alfred who talked to some law enforcement, and one of the things they said is that what it does is it escalates things that police normally may have
kind of, you know, let you file a report on but wouldn't have had to, like, follow up on, you know,
like, uh, egging cars or package theft or things like that. Um, it escalates it to the point that
they're devoting more time to that and less time to other things. So if you're of the mind that
police are out there doing some really important things, I would assert that investigating who egged your car or TP'd your
house or stole the package off your porch are not among those important things that they should be
doing. But I also think there's an interesting way in which not only do these systems point outward,
right, but they've also started now to point inward. So in addition to looking
out into the neighborhood and even often onto the street, you know, pinging people when someone
walks by or drives by, you know, in some cases with, again, like some HOAs are purchasing
automated license plate readers so that they can see who comes in and who drives in and out of the neighborhood. And we think about Nextdoor and Neighbors and Citizen, and these platforms that say that they're about, well, they say different things, right?
Citizen claims to be about crime.
Nextdoor and Neighbors seem to have kind of a different ethos, or at least, like, present a different ethos. I don't know that
that's actually the case, but I do think that these systems work to a great degree to
increase people's anxiety and fear and to amplify them and allow people to broadcast them in ways
that weren't available before. But the other thing is now, again, these things are pointed inside.
So at Amazon's recent product launch,
they have the drone that flies around the house.
They have Astro, which is a roving Alexa with a camera
and big eyes that are supposed to make it look cute.
It's friendly, Chris.
It can't be a threat.
Again, I'm so upset that people keep ceding cuteness and coolness to these objects, which are, you know, surveillance devices in your home. You know, and as an aside, when people are calling the Facebook Ray-Bans cool, right?
Like, I think when we do that,
we give those companies a pass, right?
Their purpose is clear
in why they've designed them in those ways,
and we shouldn't let them off the hook with that.
But to go back to what I was saying,
that Fortress notion, again,
is not only having cameras pointed out,
but having cameras pointed in. And I think you make a really important point, again,
I think that we need to consistently talk about, is the way in which these devices not only often
are not secure or less secure than they could be, but they're often leveraged as tools of harassment and abuse, not only by
hackers, but by domestic partners. And I think that's really understated and not maybe as well
understood as it could be. So it should not be a surprise, right? And it's not to many people,
that a surveillance device, which is essentially a device for capture and control, is leveraged in that manner by all
different kinds of parties, right? So not only Amazon, or not only law enforcement, but again,
people who are like violent partners, parents who want to control their children, on and on and on
and on. And I think that is a thing
that needs to be stated early and often when we talk about these things.
Absolutely. You know, I think you make a really good point there. And one thing that also stood
out to me in the piece that you wrote for Wired is, you know, you talked about how a lot of these
ideas about home surveillance actually come from the 1960s. And this Black woman named Marie Van
Brittan Brown, who, you know, wanted to keep her home safe from people who might be out around,
you know, with a home security system. And this was an idea to kind of protect her as a Black
person, her family. But now what we see is actually that so many of these surveillance devices and
these home security systems are often used against
Black people in particular. And it just creates this kind of society that is not only more hostile
to Black people, but as you say, it might start there. It might start with marginalized groups,
but eventually these things that seem convenient now are going to come for the rest of us in time.
I was really grateful to be able to write that piece
because it's a story I've been wanting to tell
for a long time.
You know, I mean, people knew
about Marie Van Brittan Brown.
Simone Browne has done, as she always does,
some amazing work
talking about that story.
If you haven't read Dark Matters,
read it as an aside.
Not you, Paris, but like just anyone listening.
Um, but I was really happy to tell that story, but I do think it's an important thing to think
about. And again, let me say this at the top. I do understand how some people think that they need
security cameras or some people may actually need them. I understand that, right? Like, I grew up in Detroit.
I think, though, that... Well, so I don't think this.
I'm sure of this.
The way in which these systems exist, right,
as systems that upload this to the cloud,
that make everything that crosses the threshold
part of, like, an Amazon ecosystem,
that makes it far easier for people to report these things to police
that uh again allows them to broadcast their racialized and racist anxieties to a large
swath of people in ways that weren't previously possible. This is very different from having,
like, a camera that's on your property and a server or a hard drive in a cabinet in your house. The scale and scope of these things is very, very different.

And I think it's important to note the accomplishments and
achievements of Black women and Black inventors and Black innovators. But also, there's a way in
which people think that these mechanisms are somehow going to save us, you know, like no pun
intended. So when I wrote that article, someone responded to me, well, what about sousveillance, right? What about this idea that people who aren't powerful can use technology in order to surveil the powerful and the abuses that are perpetuated by them? And certainly there has been cell phone video of abuses by law enforcement that has made tremendous impacts in the world,
right? So I'm not here to say that that doesn't exist. But again, what I tried to get across in
the WIRED piece is that ultimately these cases are the exception rather than the rule. So that
a solution to the systemic issues we have is not to give everybody cameras with facial recognition, so that, again, the entire world, you know, everyone's home, every transit system, everything is covered in them. Another example I alluded to in the essay is about body cams.
And when a lot of communities and individuals were pushing for body cams, the idea was that they would increase accountability.
The research that I've done and the work that I've seen does not support that that has been the case because so much of the difficulty has been about who controls the footage,
right? Did police have their body cams on when they were doing these things?
Who controls the footage? When is it going to be released? What context is it going to be released
in? How much of it are we going to see? Like all these things, right? And I mean, we know from as
far back as Rodney King that even with footage of a particular event, that people bring
their interpretation to that. And again, these systems exist to serve power. And so the end
game of just everyone having a camera and always pointing it at everything isn't going to do what
I think some people think it's going to do. And again, some of the most egregious abuses
in terms of what we call crime in society
are things like wage theft, right?
So I see these companies,
and I feel like this is somewhat tangential, right?
But I want to point this out as often as I can.
Some of these companies are talking about
eliminating crime, right?
But what they mean is eliminating things like shoplifting. I don't care about shoplifting at all, right?
In terms of societal harms, it's far less than wage theft, but you don't see any of these companies
talking about how they're going to eliminate that, right? And so increasing the amount of surveillance isn't going to solve some of the large-scale problems that we have.
And in my estimation, it's just going to increase some of them.
Yeah, I completely agree, right?
Like an individualist solution to a structural problem that needs to be addressed.
And I would just say on the point of shoplifting, you know, there was that video circulating a few days ago of the
guys in San Francisco robbing the Louis Vuitton store. And then a few days later or something,
the mayor of San Francisco announced that the area in front of the Louis Vuitton store
would become a car-free zone. So like, I think they actually helped the city. Good job.
Yeah. And I mean, I think these narratives about crime, about rampant crime, they serve a particular
power structure, right?
And I think, yeah, one of the things they do is increase the belief that we should have
more policing and more security and more cameras.
But again, they don't get at the root of some of these issues.
And again, I don't think they make
people safer. And there's very little evidence that they do. I mean, so in one sense, I'm just
saying, I don't think, I don't think, but there's not a lot of evidence that supports it in terms of
peer-reviewed research. Yeah, just vibes. That's all.
Yeah. I mean, you know, we have, I think it's understated the extent to which popular culture has promoted the idea that cameras are going to solve everything, right? Like we have at least 30 years of this in movies and TV, where it's just like, you know, like, oh, we have video footage of it, so now we can do this thing. Right. And very often that's not how it
works. Thank you, police procedural dramas. But, you know, so I think we've discussed,
you know, all these kind of home surveillance gadgets. I think we would say, you know,
if they are on your Black Friday list, kick them off of there. You don't need them. They shouldn't
be in your home. But before, you know, we end the conversation, I did also want to get into another category.
And that is, you know, the health trackers and the fitness trackers that I think are becoming
more and more common. You know, people have their Apple watches on all the time now. And I think
early on Apple didn't really know how to promote this thing, how to get people to put it on their
wrists if it just told the time. And one of the things that really became like the selling feature was the fitness features
and the ability to like track your pulse and all this kind of stuff. Right. And so I think that is
really interesting as well. And I will fully admit, you know, I turned 30 recently and I was
like, I should exercise a little bit more. And one of the first things that came to my head was like,
maybe I should get an Apple watch now because that'll like help me exercise. And then all of a sudden I thought
about it and I was like, wait, why do I think that like getting an Apple Watch is like how you
exercise? But I think it's something that we've been sold, right? Yeah, it's very alluring. You
know, I mean, I go for a walk every day and I have to consciously tell myself, like, don't look at
the steps, don't look at the steps, you know, right? I mean, these companies are billion and trillion dollar companies for a reason. I mean, they've
been very effective at leveraging these ideas that these devices are health devices, that
keeping track of some of these things is going to make us healthier and feel better and things like
that. You know, I mean, there's an Amazon device, I think it's called the Halo, that tracks your voice,
the tone of your voice throughout your day.
I mean, there's all these devices, many of them that fit under what I tend to call luxury
surveillance, that make claims that are suspect, I would say, about how much they'll increase health and wellness.
You know, and I think, again, it is pretty dangerous in a lot of ways as these devices
become understood as health devices because, for one of the reasons, is because of the
ways they extract a lot of data that, again, is going to be leveraged against
people in ways that maybe perhaps they don't anticipate, whether, again, that's your insurance
company or employer, law enforcement, like a violent partner, on and on and on and on.
But also, I think in some cases, the benefits in terms of your health and wellness are overstated.
Yeah, absolutely. And we saw just, you know, we saw just recently,
I think it was last week,
there was this report that, you know,
these fitness trackers, you know,
they're promoted, of course,
that they're for everybody,
that they help everybody like improve their fitness
and feel better about themselves and blah, blah, blah.
But there was this report that actually, you know,
a study has been done
and the people who benefit from these devices
are rich people, are the same people who are promoting them and saying that they
benefit everybody, but actually, you know, once again, they only benefit themselves. And so,
you know, you mentioned their luxury surveillance, I was wondering if you could kind of
unpack that a little bit. Like, what does it actually mean luxury surveillance? How did
these products, you know, kind of show a divide between how surveillance works
for some people and how it works for others?
The way I think about it is that I used to, and I still do, but I used to make this observation
that a Fitbit and an Apple Watch and an ankle monitor did many of the same things.
That much of the distinction was about who chooses to wear it
and who is forced to wear it and how they look, right, the aesthetics of it. So that in many
cases, an ankle monitor is bulky and meant to stand out in order to stigmatize the wearer.
An Apple Watch, on the other hand... now I'm going to need to qualify this. But typically, luxury surveillance, in terms of its aesthetics,
is meant to blend in, right? Or to be sleek and cool looking. Now, the most recent Apple Watch
is the biggest Apple Watch. And so there's an interesting way, and David Golumbia and I,
we don't have the piece out yet, but we hope to have a piece out soon that talks about this a little bit. But the divide between kind of the bulkiness of an ankle monitor and the
sleekness of a luxury surveillance device, that divide is breaking down a little bit.
Some luxury surveillance is becoming more ostentatious. I think that's interesting,
and I hope to tease that out a little bit. But typically, the luxury devices like the Amazon Halo or the Oura Ring are meant to blend in. But I do think that there's a strong divide with a lot of devices that I would label surveillance devices.
There's a strong divide in when people purchase them, which end of the camera, so to speak, they think they're on.
So when people purchase a Ring, they know that the Ring sees them, but they don't think the Ring is looking at them. When people
purchase an Apple Watch, they know that it sees them, but they don't think that it's looking at
them. And what I mean by that distinction is they think that the ways in which the device sees them
will work to their benefit, not to their detriment. So when a Ring sees the people it's supposed to see, it recognizes them as
criminals or things like that. But again, as we talked about earlier, I think that a lot of folks
think that they'll always be on the right end of the camera. I don't think that's the case. Again,
I think there are a variety of ways that companies and institutions will leverage data that they have on people, do and will continue to do, you know, governments, law enforcement, in ways that people don't necessarily anticipate.
Again, whether that's your employer, your insurance company.
And so, I mean, part of my push is to get people to understand some of these luxury items as also surveillance, but in ways, again,
that aren't necessarily only going to benefit them because that is, I think, a prevailing attitude
when people purchase some of these things, that the surveillance mechanisms involved are going to
only work for them, not against them. And I don't believe that.
No, but I think that gives us a really good way to think about, you know, the products that we use
and that we buy and whether, you know, we should be wanting smart everything, whether we should be
wanting everything to be able to track us and to be able to record us. And just the assumption that
because we are maybe on a certain level of society or a particular kind of people,
that having all of these smart gadgets around isn't a threat to us. They're just a convenience
device that are helping us and can never be turned against us in the way that you're describing,
right? And so, you know, I did kind of want to wrap it up because our conversation is
getting a bit long. And so I wanted to leave you kind of with that question, right? Because in your Wired piece, you mentioned how the Surveillance Camera Commissioner
for England and Wales worried that, you know, having all of these cameras in particular was
building a surveillance society. But I think as we've discussed in this conversation, that there's
so many other ways that these kind of devices to record us and to kind of turn us into the quantified self,
the kind of data producing individual are proliferating and are being normalized through
media, through popular culture, through, you know, the marketing of these tech giants.
And so I wonder what you make of that and how we should be responding to it as we try to think about you
know what a better world looks like and whether we should actually be surrounding ourselves with
all these gadgets and devices and things like that?

One of the difficulties is that sometimes these devices can offer a lot of benefits. And so I think we need to have a particular caution in that. You could have some of these devices that offer the beneficial parts but don't have the extractive parts.
That's actually possible in many cases, whether that's a voice assistant or a watch that offers some tracking.
But because companies are data companies, they want to extract as much data as possible. So to the extent that consumers get some benefit from these things, I'd love to see a stronger
push to have them with as little extraction as possible.
But the other thing is, yeah, in a lot of cases, I don't think they work.
I don't think they do the thing that they're marketed as doing, or they don't do them in
any more effective way
than a not connected thing would do that. So what I would say is, I mean, a third thing, though,
to add quickly is like, many of these devices and processes and what's made available to companies
and how they extract data from people should not be legal, and the devices shouldn't exist. So, I mean, what I would
say is, um, firstly, right, in kind of inverse order, is to, like, take a hammer to some of these things,
bluntly. I can support that. But also to think about, um, to what extent they are doing the thing they
need to do in terms of, like, benefits, and how much extraction's involved.
Don't give the gift of surveillance as a Christmas present. That's what I would end on.
Absolutely. You know, I couldn't agree more, you know, obviously part of the point of
this conversation, but I think that, you know, in having this conversation, hopefully we've given
listeners some, you know, more perspective on the gadgets
and the technologies that they're using in their everyday lives, and whether these are the types
of things that they should be buying, and in particular gifting, you know, at this time of
year on an individual level. But I would also note, you know, going back to your point about
how a lot of these things shouldn't be legal in the first place, you know, think back to what
Chris was saying about, you know, how Amazon is
going into all these states and ensuring that privacy regulations are not being passed by
lobbying against them and what that would potentially mean for their business models
if they were able to. So certainly there is an individual level to this, but really addressing
it has to happen on, you know, the larger kind of government regulatory level as well. And so,
yeah, you know, Chris, I really appreciate you coming on the show once again and giving us
these great perspectives on all of these technologies and the great work that you've been doing.
Yeah, definitely a pleasure.
And I really appreciate the opportunity.
Chris Gilliard is a Visiting Research Fellow at the Harvard Kennedy School Shorenstein
Center, and you can follow him on Twitter at @hypervisible. You can follow me at @parismarx, and you can follow the show at
Tech Won't Save Us. Tech Won't Save Us is part of the Harbinger Media Network, and you can find out
more about that at harbingermedianetwork.com. And if you want to support the work that goes into
making the show every week, you can go to patreon.com slash tech won't save us and become a
supporter. Thanks for listening.